Apr 16 19:15:15.105817 ip-10-0-128-123 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 19:15:15.105936 ip-10-0-128-123 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 19:15:15.105951 ip-10-0-128-123 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 19:15:15.106242 ip-10-0-128-123 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 19:15:25.150091 ip-10-0-128-123 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 19:15:25.150117 ip-10-0-128-123 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot f0c21bcad9a14d639a345e9c6409ab24 --
Apr 16 19:17:52.384475 ip-10-0-128-123 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 19:17:52.832672 ip-10-0-128-123 kubenswrapper[2582]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:17:52.832672 ip-10-0-128-123 kubenswrapper[2582]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 19:17:52.832672 ip-10-0-128-123 kubenswrapper[2582]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:17:52.832672 ip-10-0-128-123 kubenswrapper[2582]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 19:17:52.832672 ip-10-0-128-123 kubenswrapper[2582]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:17:52.836159 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.835935 2582 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 19:17:52.839100 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839085 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:17:52.839100 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839098 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:17:52.839100 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839102 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839106 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839109 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839113 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839115 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839118 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839121 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839124 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839127 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839130 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839132 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839135 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839139 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839142 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839144 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839159 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839163 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839166 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839169 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839171 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:17:52.839206 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839174 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:17:52.839676 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839177 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:17:52.839676 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839179 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:17:52.839676 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839182 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:17:52.839676 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839185 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:17:52.839676 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839187 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:17:52.839676 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839190 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:17:52.839676 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839193 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:17:52.839676 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839195 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:17:52.839676 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839198 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:17:52.839676 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839201 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:17:52.839676 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839203 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:17:52.839676 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839206 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:17:52.839676 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839211 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:17:52.839676 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839216 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:17:52.839676 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839220 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:17:52.839676 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839222 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:17:52.839676 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839225 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:17:52.839676 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839228 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:17:52.839676 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839231 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:17:52.840215 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839233 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:17:52.840215 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839236 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:17:52.840215 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839238 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:17:52.840215 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839241 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:17:52.840215 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839244 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:17:52.840215 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839246 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:17:52.840215 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839249 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:17:52.840215 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839251 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:17:52.840215 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839254 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:17:52.840215 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839257 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:17:52.840215 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839259 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:17:52.840215 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839261 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:17:52.840215 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839264 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:17:52.840215 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839267 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:17:52.840215 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839270 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:17:52.840215 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839273 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:17:52.840215 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839277 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:17:52.840215 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839282 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:17:52.840215 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839284 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839288 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839290 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839293 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839296 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839298 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839300 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839303 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839306 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839308 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839311 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839314 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839316 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839318 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839321 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839324 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839327 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839330 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839332 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839335 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:17:52.840686 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839337 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839340 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839342 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839345 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839348 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839350 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839740 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839746 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839750 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839753 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839756 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839759 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839762 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839765 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839767 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839770 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839772 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839775 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839778 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839780 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:17:52.841173 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839783 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:17:52.841652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839786 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:17:52.841652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839789 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:17:52.841652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839791 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:17:52.841652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839794 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:17:52.841652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839796 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:17:52.841652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839798 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:17:52.841652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839801 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:17:52.841652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839805 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:17:52.841652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839807 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:17:52.841652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839810 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:17:52.841652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839812 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:17:52.841652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839815 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:17:52.841652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839817 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:17:52.841652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839820 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:17:52.841652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839823 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:17:52.841652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839825 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:17:52.841652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839828 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:17:52.841652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839830 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:17:52.841652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839833 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839836 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839839 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839842 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839844 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839847 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839849 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839852 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839854 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839856 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839859 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839862 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839865 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839867 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839870 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839873 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839875 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839878 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839880 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839883 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:17:52.842123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839885 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839888 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839891 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839893 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839896 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839898 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839901 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839903 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839906 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839909 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839914 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839918 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839921 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839924 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839927 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839930 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839932 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839935 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839937 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839940 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:17:52.842668 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839942 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839944 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839947 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839950 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839952 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839955 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839957 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839959 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839962 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839965 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839968 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839970 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.839973 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840050 2582 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840057 2582 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840064 2582 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840070 2582 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840076 2582 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840080 2582 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840088 2582 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 19:17:52.843257 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840095 2582 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840100 2582 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840103 2582 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840106 2582 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840109 2582 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840112 2582 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840115 2582 flags.go:64] FLAG: --cgroup-root=""
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840118 2582 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840121 2582 flags.go:64] FLAG: --client-ca-file=""
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840124 2582 flags.go:64] FLAG: --cloud-config=""
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840127 2582 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840130 2582 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840134 2582 flags.go:64] FLAG: --cluster-domain=""
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840137 2582 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840140 2582 flags.go:64] FLAG: --config-dir=""
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840143 2582 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840146 2582 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840164 2582 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840167 2582 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840170 2582 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840174 2582 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840177 2582 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840183 2582 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840186 2582 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840190 2582 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 19:17:52.843754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840193 2582 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840197 2582 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840200 2582 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840203 2582 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840206 2582 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840209 2582 flags.go:64] FLAG: --enable-server="true"
Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840212 2582 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840217 2582 flags.go:64] FLAG: --event-burst="100"
Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840221 2582 flags.go:64] FLAG: --event-qps="50"
Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840224 2582 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840227 2582 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840231 2582 flags.go:64] FLAG: --eviction-hard=""
Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840234 2582 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840238 2582 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840241 2582 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840244 2582 flags.go:64] FLAG: --eviction-soft=""
Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840247 2582 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840250 2582 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840253 2582 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840256 2582 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840259 2582 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840262 2582 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840265 2582 flags.go:64] FLAG: --feature-gates="" Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840269 2582 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840272 2582 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 19:17:52.844365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840275 2582 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840278 2582 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840281 2582 flags.go:64] FLAG: --healthz-port="10248" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840284 2582 flags.go:64] FLAG: --help="false" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840290 2582 flags.go:64] FLAG: --hostname-override="ip-10-0-128-123.ec2.internal" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840294 2582 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840297 2582 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840299 2582 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840303 2582 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840306 2582 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840309 2582 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840312 2582 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840315 2582 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840318 2582 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840321 2582 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840324 2582 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840327 2582 flags.go:64] FLAG: 
--kube-reserved="" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840330 2582 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840333 2582 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840336 2582 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840339 2582 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840342 2582 flags.go:64] FLAG: --lock-file="" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840345 2582 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840348 2582 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 19:17:52.844971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840351 2582 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840357 2582 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840360 2582 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840363 2582 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840366 2582 flags.go:64] FLAG: --logging-format="text" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840368 2582 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840371 2582 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840374 2582 flags.go:64] FLAG: --manifest-url="" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840377 2582 flags.go:64] FLAG: --manifest-url-header="" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840382 2582 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840385 2582 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840390 2582 flags.go:64] FLAG: --max-pods="110" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840394 2582 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840397 2582 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840400 2582 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840403 2582 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840407 2582 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840410 2582 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840412 2582 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840420 2582 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840423 2582 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840426 2582 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840429 2582 flags.go:64] FLAG: --pod-cidr="" Apr 16 19:17:52.845566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840432 2582 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840438 2582 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840441 2582 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840444 2582 flags.go:64] FLAG: --pods-per-core="0" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840447 2582 flags.go:64] FLAG: --port="10250" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840450 2582 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840453 2582 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0d895eb77980ac877" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840457 2582 flags.go:64] FLAG: --qos-reserved="" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840460 2582 flags.go:64] FLAG: --read-only-port="10255" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840463 2582 flags.go:64] FLAG: --register-node="true" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840466 2582 flags.go:64] FLAG: --register-schedulable="true" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840469 2582 flags.go:64] FLAG: --register-with-taints="" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840477 2582 flags.go:64] FLAG: --registry-burst="10" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840480 2582 flags.go:64] FLAG: --registry-qps="5" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840483 2582 flags.go:64] FLAG: --reserved-cpus="" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840486 2582 flags.go:64] FLAG: --reserved-memory="" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840490 2582 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840493 2582 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840496 2582 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840499 2582 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840502 2582 flags.go:64] FLAG: --runonce="false" Apr 16 19:17:52.846122 ip-10-0-128-123 
kubenswrapper[2582]: I0416 19:17:52.840506 2582 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840509 2582 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840513 2582 flags.go:64] FLAG: --seccomp-default="false" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840515 2582 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840519 2582 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 19:17:52.846122 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840522 2582 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840525 2582 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840528 2582 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840531 2582 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840536 2582 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840539 2582 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840542 2582 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840545 2582 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840548 2582 flags.go:64] FLAG: --system-cgroups="" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840551 2582 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840556 2582 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840559 2582 flags.go:64] FLAG: --tls-cert-file="" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840561 2582 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840565 2582 flags.go:64] FLAG: --tls-min-version="" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840568 2582 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840571 2582 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840574 2582 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840577 2582 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840580 2582 flags.go:64] FLAG: --v="2" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840584 2582 flags.go:64] FLAG: --version="false" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840588 2582 flags.go:64] FLAG: --vmodule="" Apr 16 19:17:52.846755 ip-10-0-128-123 
kubenswrapper[2582]: I0416 19:17:52.840592 2582 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.840595 2582 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840701 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840706 2582 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:17:52.846755 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840709 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840712 2582 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840717 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840719 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840722 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840725 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840728 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840730 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840733 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840735 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840738 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840742 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840744 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840747 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840749 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840752 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840755 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840757 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840759 2582 feature_gate.go:328] unrecognized feature gate: 
RouteAdvertisements Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840762 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 19:17:52.847420 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840764 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:17:52.848302 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840767 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:17:52.848302 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840770 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:17:52.848302 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840772 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:17:52.848302 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840775 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:17:52.848302 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840777 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:17:52.848302 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840780 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:17:52.848302 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840782 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 19:17:52.848302 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840785 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:17:52.848302 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840787 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 19:17:52.848302 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840790 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:17:52.848302 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840793 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:17:52.848302 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840795 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:17:52.848302 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840798 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:17:52.848302 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840802 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 19:17:52.848302 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840805 2582 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 19:17:52.848302 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840807 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:17:52.848302 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840810 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:17:52.848302 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840813 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:17:52.848302 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840815 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840818 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 
Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840821 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840823 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840827 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840831 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840834 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840837 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840839 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840842 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840845 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840847 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840850 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840852 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840855 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840857 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840860 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840862 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840865 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840868 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:17:52.848842 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840870 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:17:52.849366 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840873 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:17:52.849366 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840875 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:17:52.849366 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840878 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:17:52.849366 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840881 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:17:52.849366 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840883 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:17:52.849366 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840885 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:17:52.849366 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840891 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:17:52.849366 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840894 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:17:52.849366 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840897 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:17:52.849366 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840900 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:17:52.849366 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840902 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:17:52.849366 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840906 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:17:52.849366 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840908 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:17:52.849366 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840911 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:17:52.849366 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840914 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:17:52.849366 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840918 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:17:52.849366 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840920 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:17:52.849366 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840923 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:17:52.849366 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840925 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:17:52.849833 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840928 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:17:52.849833 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840930 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:17:52.849833 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840933 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:17:52.849833 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840935 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:17:52.849833 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.840938 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:17:52.849833 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.841726 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:17:52.849992 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.849947 2582 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 19:17:52.849992 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.849963 2582 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 19:17:52.850043 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850017 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:17:52.850043 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850022 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:17:52.850043 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850026 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:17:52.850043 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850029 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:17:52.850043 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850032 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:17:52.850043 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850035 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:17:52.850043 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850038 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:17:52.850043 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850041 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:17:52.850043 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850045 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:17:52.850043 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850048 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850051 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850054 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850056 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850060 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850062 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850065 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850068 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850070 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850073 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850076 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850078 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850081 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850084 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850086 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850089 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850091 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850098 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850101 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850103 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:17:52.850307 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850106 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:17:52.850794 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850109 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:17:52.850794 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850111 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:17:52.850794 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850114 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:17:52.850794 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850117 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:17:52.850794 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850119 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:17:52.850794 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850122 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:17:52.850794 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850124 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:17:52.850794 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850126 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:17:52.850794 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850129 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:17:52.850794 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850133 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:17:52.850794 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850136 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:17:52.850794 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850139 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:17:52.850794 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850142 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:17:52.850794 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850145 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:17:52.850794 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850164 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:17:52.850794 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850167 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:17:52.850794 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850170 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:17:52.850794 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850172 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850176 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850181 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850184 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850187 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850189 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850192 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850195 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850198 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850200 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850203 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850205 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850208 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850210 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850213 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850216 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850219 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850221 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850224 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850226 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:17:52.851274 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850229 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:17:52.851762 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850232 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:17:52.851762 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850234 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:17:52.851762 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850237 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:17:52.851762 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850240 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:17:52.851762 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850242 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:17:52.851762 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850245 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:17:52.851762 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850247 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:17:52.851762 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850250 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:17:52.851762 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850253 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:17:52.851762 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850256 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:17:52.851762 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850259 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:17:52.851762 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850262 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:17:52.851762 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850264 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:17:52.851762 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850267 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:17:52.851762 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850270 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:17:52.851762 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850272 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:17:52.851762 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850275 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:17:52.851762 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850277 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:17:52.852246 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.850282 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:17:52.852246 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850388 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:17:52.852246 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850393 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:17:52.852246 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850396 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:17:52.852246 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850399 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:17:52.852246 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850401 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:17:52.852246 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850404 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:17:52.852246 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850407 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:17:52.852246 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850409 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:17:52.852246 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850412 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:17:52.852246 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850414 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:17:52.852246 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850417 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:17:52.852246 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850420 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:17:52.852246 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850422 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:17:52.852246 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850425 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:17:52.852246 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850428 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850430 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850433 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850435 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850438 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850441 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850443 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850446 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850449 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850452 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850455 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850458 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850460 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850463 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850465 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850468 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850470 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850473 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850475 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850478 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:17:52.852644 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850480 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:17:52.853132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850483 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:17:52.853132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850485 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:17:52.853132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850488 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:17:52.853132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850491 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:17:52.853132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850493 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:17:52.853132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850496 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:17:52.853132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850499 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:17:52.853132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850501 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:17:52.853132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850504 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:17:52.853132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850506 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:17:52.853132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850509 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:17:52.853132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850512 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:17:52.853132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850516 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:17:52.853132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850519 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:17:52.853132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850522 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:17:52.853132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850525 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:17:52.853132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850528 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:17:52.853132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850531 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:17:52.853132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850534 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:17:52.853612 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850537 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:17:52.853612 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850541 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:17:52.853612 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850543 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:17:52.853612 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850546 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:17:52.853612 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850548 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:17:52.853612 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850551 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:17:52.853612 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850553 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:17:52.853612 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850556 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:17:52.853612 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850558 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:17:52.853612 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850561 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:17:52.853612 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850563 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:17:52.853612 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850566 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:17:52.853612 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850569 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:17:52.853612 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850571 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:17:52.853612 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850575 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:17:52.853612 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850578 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:17:52.853612 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850581 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:17:52.853612 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850584 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:17:52.853612 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850587 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:17:52.854078 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850589 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:17:52.854078 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850592 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:17:52.854078 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850594 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:17:52.854078 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850596 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:17:52.854078 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850599 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:17:52.854078 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850601 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:17:52.854078 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850604 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:17:52.854078 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850606 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:17:52.854078 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850608 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:17:52.854078 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850611 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:17:52.854078 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850613 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:17:52.854078 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850616 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:17:52.854078 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:52.850619 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:17:52.854078 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.850624 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:17:52.854078 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.851309 2582 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 19:17:52.855987 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.855973 2582 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 19:17:52.857534 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.857522 2582 server.go:1019] "Starting client certificate rotation"
Apr 16 19:17:52.857631 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.857615 2582 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:17:52.857662 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.857648 2582 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:17:52.885099 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.885080 2582 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:17:52.887426 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.887400 2582 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:17:52.901355 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.901325 2582 log.go:25] "Validated CRI v1 runtime API"
Apr 16 19:17:52.907550 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.907531 2582 log.go:25] "Validated CRI v1 image API"
Apr 16 19:17:52.909622 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.909587 2582 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 19:17:52.913370 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.913347 2582 fs.go:135] Filesystem UUIDs: map[3a351ad2-a482-4a9f-9b2c-57f5486333da:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 95d692b4-bb3b-4950-92c5-2e4f96e99029:/dev/nvme0n1p4]
Apr 16 19:17:52.913455 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.913368 2582 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 19:17:52.921390 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.921281 2582 manager.go:217] Machine: {Timestamp:2026-04-16 19:17:52.919456553 +0000 UTC m=+0.414047162 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3095854 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29d32975aefeed9f7071d1212f6a67 SystemUUID:ec29d329-75ae-feed-9f70-71d1212f6a67 BootID:f0c21bca-d9a1-4d63-9a34-5e9c6409ab24 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:be:19:37:c3:0d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:be:19:37:c3:0d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2a:32:f7:e3:2e:97 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 19:17:52.921390 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.921381 2582 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 19:17:52.921517 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.921508 2582 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 19:17:52.921826 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.921806 2582 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 19:17:52.921957 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.921828 2582 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-123.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 19:17:52.922003 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.921966 2582 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 19:17:52.922003 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.921975 2582 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 19:17:52.922003 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.921988 2582 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 19:17:52.922082 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.922003 2582 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 19:17:52.923561 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.923551 2582 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 19:17:52.923666 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.923657 2582 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 19:17:52.926091 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.926081 2582 kubelet.go:491] "Attempting to sync node with API server"
path" path="/etc/kubernetes/manifests" Apr 16 19:17:52.926123 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.926107 2582 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 19:17:52.926123 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.926117 2582 kubelet.go:397] "Adding apiserver pod source" Apr 16 19:17:52.926258 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.926128 2582 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 19:17:52.927235 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.927224 2582 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:17:52.927282 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.927242 2582 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:17:52.931463 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.931448 2582 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 19:17:52.932844 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.932817 2582 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 19:17:52.935025 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.935010 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 19:17:52.935120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.935031 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 19:17:52.935120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.935041 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 19:17:52.935120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.935059 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 19:17:52.935120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.935068 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 19:17:52.935120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.935077 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 19:17:52.935120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.935086 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 19:17:52.935120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.935094 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 19:17:52.935120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.935103 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 19:17:52.935120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.935112 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 19:17:52.935409 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.935127 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 19:17:52.935409 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.935140 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 19:17:52.936799 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.936784 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 19:17:52.936799 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.936800 2582 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 16 19:17:52.938075 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:52.938048 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 19:17:52.938174 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:52.938090 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-123.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 19:17:52.940416 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.940404 2582 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 19:17:52.940455 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.940441 2582 server.go:1295] "Started kubelet" Apr 16 19:17:52.940533 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.940500 2582 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 19:17:52.940635 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.940595 2582 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 19:17:52.940697 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.940664 2582 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 19:17:52.941409 ip-10-0-128-123 systemd[1]: Started Kubernetes Kubelet. Apr 16 19:17:52.942044 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.941431 2582 server.go:317] "Adding debug handlers to kubelet server" Apr 16 19:17:52.942465 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.942438 2582 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 19:17:52.947123 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.947107 2582 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 19:17:52.947436 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.947423 2582 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-123.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 19:17:52.948387 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.948373 2582 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 19:17:52.948977 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:52.947488 2582 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-123.ec2.internal.18a6ec70f5505d2d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-123.ec2.internal,UID:ip-10-0-128-123.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-123.ec2.internal,},FirstTimestamp:2026-04-16 19:17:52.940416301 +0000 UTC m=+0.435006909,LastTimestamp:2026-04-16 19:17:52.940416301 +0000 UTC m=+0.435006909,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
Apr 16 19:17:52.948977 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:52.947488 2582 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-123.ec2.internal.18a6ec70f5505d2d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-123.ec2.internal,UID:ip-10-0-128-123.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-123.ec2.internal,},FirstTimestamp:2026-04-16 19:17:52.940416301 +0000 UTC m=+0.435006909,LastTimestamp:2026-04-16 19:17:52.940416301 +0000 UTC m=+0.435006909,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-123.ec2.internal,}"
Apr 16 19:17:52.949132 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.949118 2582 factory.go:55] Registering systemd factory
Apr 16 19:17:52.949193 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.949137 2582 factory.go:223] Registration of the systemd container factory successfully
Apr 16 19:17:52.949523 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.949364 2582 factory.go:153] Registering CRI-O factory
Apr 16 19:17:52.949523 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.949379 2582 factory.go:223] Registration of the crio container factory successfully
Apr 16 19:17:52.949523 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:52.949389 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-123.ec2.internal\" not found"
Apr 16 19:17:52.949523 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.949435 2582 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 19:17:52.949523 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.949463 2582 factory.go:103] Registering Raw factory
Apr 16 19:17:52.949523 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.949479 2582 manager.go:1196] Started watching for new ooms in manager
Apr 16 19:17:52.949523 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.949483 2582 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 19:17:52.949523 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.949489 2582 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 19:17:52.949523 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.949504 2582 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 19:17:52.949966 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.949628 2582 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 19:17:52.949966 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.949637 2582 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 19:17:52.949966 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.949866 2582 manager.go:319] Starting recovery of all containers
Apr 16 19:17:52.950847 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:52.950797 2582 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 19:17:52.956479 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:52.956450 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 19:17:52.956580 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:52.956503 2582 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-123.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 19:17:52.960607 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.960546 2582 manager.go:324] Recovery completed
Apr 16 19:17:52.964861 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.964849 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 19:17:52.967379 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.967365 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-123.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 19:17:52.967434 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.967392 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-123.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 19:17:52.967434 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.967402 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-123.ec2.internal" event="NodeHasSufficientPID"
Apr 16 19:17:52.967831 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.967817 2582 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 19:17:52.967892 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.967832 2582 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 19:17:52.967892 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.967850 2582 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 19:17:52.969449 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:52.969385 2582 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-123.ec2.internal.18a6ec70f6ebcd0e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-123.ec2.internal,UID:ip-10-0-128-123.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-123.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-123.ec2.internal,},FirstTimestamp:2026-04-16 19:17:52.967380238 +0000 UTC m=+0.461970848,LastTimestamp:2026-04-16 19:17:52.967380238 +0000 UTC m=+0.461970848,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-123.ec2.internal,}"
Apr 16 19:17:52.970226 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.970209 2582 policy_none.go:49] "None policy: Start"
19:17:52.970229 2582 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 19:17:52.970290 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.970243 2582 state_mem.go:35] "Initializing new in-memory state store" Apr 16 19:17:52.979270 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.979251 2582 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6ktbs" Apr 16 19:17:52.981081 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:52.981021 2582 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-123.ec2.internal.18a6ec70f6ec0f13 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-123.ec2.internal,UID:ip-10-0-128-123.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-128-123.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-128-123.ec2.internal,},FirstTimestamp:2026-04-16 19:17:52.967397139 +0000 UTC m=+0.461987746,LastTimestamp:2026-04-16 19:17:52.967397139 +0000 UTC m=+0.461987746,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-123.ec2.internal,}" Apr 16 19:17:52.985496 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:52.985479 2582 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-6ktbs" Apr 16 19:17:53.011731 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.011715 2582 manager.go:341] "Starting Device Plugin manager" Apr 16 19:17:53.023073 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:53.011746 2582 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 19:17:53.023073 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.011756 2582 server.go:85] "Starting device plugin registration server" Apr 16 19:17:53.023073 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.012052 2582 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 19:17:53.023073 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.012068 2582 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 19:17:53.023073 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.012132 2582 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 19:17:53.023073 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.012228 2582 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 19:17:53.023073 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.012238 2582 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 19:17:53.023073 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:53.013015 2582 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 19:17:53.023073 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:53.013067 2582 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-123.ec2.internal\" not found" Apr 16 19:17:53.108469 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.108379 2582 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 19:17:53.109638 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.109617 2582 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 19:17:53.109753 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.109645 2582 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 19:17:53.109753 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.109665 2582 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 19:17:53.109753 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.109671 2582 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 19:17:53.109886 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:53.109774 2582 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 19:17:53.112228 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.112212 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:17:53.112683 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.112653 2582 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:17:53.112955 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.112938 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-123.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:17:53.113026 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.112967 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-123.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:17:53.113026 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.112981 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-123.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:17:53.113026 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.113007 2582 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-123.ec2.internal" Apr 16 19:17:53.122357 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.122342 2582 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-123.ec2.internal" Apr 16 19:17:53.122426 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:53.122362 2582 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-123.ec2.internal\": node \"ip-10-0-128-123.ec2.internal\" not found" Apr 16 19:17:53.157331 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:53.157300 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-123.ec2.internal\" not found" Apr 16 19:17:53.210524 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.210494 2582 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-128-123.ec2.internal"] Apr 16 19:17:53.210627 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.210588 2582 
kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:17:53.212369 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.212355 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-123.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:17:53.212444 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.212385 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-123.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:17:53.212444 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.212396 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-123.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:17:53.214105 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.214092 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:17:53.214276 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.214262 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal" Apr 16 19:17:53.214322 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.214293 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:17:53.214806 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.214789 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-123.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:17:53.214881 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.214816 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-123.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:17:53.214881 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.214829 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-123.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:17:53.214881 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.214862 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-123.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:17:53.214881 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.214881 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-123.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:17:53.215069 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.214894 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-123.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:17:53.216198 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.216180 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-123.ec2.internal" Apr 16 19:17:53.216304 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.216209 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:17:53.216836 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.216821 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-123.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:17:53.216916 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.216845 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-123.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:17:53.216916 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.216857 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-123.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:17:53.247844 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:53.247826 2582 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-123.ec2.internal\" not found" node="ip-10-0-128-123.ec2.internal" Apr 16 19:17:53.250714 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.250692 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/af059cf061db47d5f95bfb0c454f454b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal\" (UID: \"af059cf061db47d5f95bfb0c454f454b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal" Apr 16 19:17:53.250789 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.250725 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af059cf061db47d5f95bfb0c454f454b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal\" (UID: \"af059cf061db47d5f95bfb0c454f454b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal" Apr 16 19:17:53.250789 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.250778 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4602511561889ed4b3b1e98e97d43dc5-config\") pod \"kube-apiserver-proxy-ip-10-0-128-123.ec2.internal\" (UID: \"4602511561889ed4b3b1e98e97d43dc5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-123.ec2.internal" Apr 16 19:17:53.252340 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:53.252326 2582 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-123.ec2.internal\" not found" node="ip-10-0-128-123.ec2.internal" Apr 16 19:17:53.258011 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:53.257996 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-123.ec2.internal\" not found" Apr 16 19:17:53.351622 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.351591 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/af059cf061db47d5f95bfb0c454f454b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal\" (UID: \"af059cf061db47d5f95bfb0c454f454b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal" Apr 
16 19:17:53.351754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.351628 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af059cf061db47d5f95bfb0c454f454b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal\" (UID: \"af059cf061db47d5f95bfb0c454f454b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal" Apr 16 19:17:53.351754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.351650 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4602511561889ed4b3b1e98e97d43dc5-config\") pod \"kube-apiserver-proxy-ip-10-0-128-123.ec2.internal\" (UID: \"4602511561889ed4b3b1e98e97d43dc5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-123.ec2.internal" Apr 16 19:17:53.351754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.351675 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/af059cf061db47d5f95bfb0c454f454b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal\" (UID: \"af059cf061db47d5f95bfb0c454f454b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal" Apr 16 19:17:53.351754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.351674 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4602511561889ed4b3b1e98e97d43dc5-config\") pod \"kube-apiserver-proxy-ip-10-0-128-123.ec2.internal\" (UID: \"4602511561889ed4b3b1e98e97d43dc5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-123.ec2.internal" Apr 16 19:17:53.351754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.351692 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/af059cf061db47d5f95bfb0c454f454b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal\" (UID: \"af059cf061db47d5f95bfb0c454f454b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal" Apr 16 19:17:53.358723 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:53.358673 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-123.ec2.internal\" not found" Apr 16 19:17:53.459086 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:53.459052 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-123.ec2.internal\" not found" Apr 16 19:17:53.549398 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.549372 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal" Apr 16 19:17:53.555213 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.555193 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-123.ec2.internal" Apr 16 19:17:53.560071 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:53.560033 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-123.ec2.internal\" not found" Apr 16 19:17:53.660640 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:53.660553 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-123.ec2.internal\" not found" Apr 16 19:17:53.761196 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:53.761171 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-123.ec2.internal\" not found" Apr 16 19:17:53.857777 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.857743 2582 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 19:17:53.858444 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.857899 2582 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 19:17:53.861904 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:53.861877 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-123.ec2.internal\" not found" Apr 16 19:17:53.947571 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.947482 2582 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 19:17:53.962286 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:53.962259 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-123.ec2.internal\" not found" Apr 16 19:17:53.966349 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.966330 2582 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 19:17:53.987579 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.987548 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 19:12:52 +0000 UTC" deadline="2027-10-31 01:42:32.799056933 +0000 UTC" Apr 16 19:17:53.987579 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.987571 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13494h24m38.811488008s" Apr 16 19:17:53.991579 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.991560 2582 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-2n2h8" Apr 16 19:17:53.997927 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:53.997913 2582 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-2n2h8" Apr 16 19:17:54.062489 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:54.062463 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-123.ec2.internal\" not found" Apr 16 19:17:54.163441 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:54.163266 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-123.ec2.internal\" not found" 
Apr 16 19:17:54.191617 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:54.191581 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4602511561889ed4b3b1e98e97d43dc5.slice/crio-717bea3ad2ef9a1e65ae8909e2460a159c7909e50cccfb0fbc99efc227bc6e7a WatchSource:0}: Error finding container 717bea3ad2ef9a1e65ae8909e2460a159c7909e50cccfb0fbc99efc227bc6e7a: Status 404 returned error can't find the container with id 717bea3ad2ef9a1e65ae8909e2460a159c7909e50cccfb0fbc99efc227bc6e7a
Apr 16 19:17:54.192040 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:54.192015 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf059cf061db47d5f95bfb0c454f454b.slice/crio-58930e25f3cdb3e34321eee9b1e34d484523fb4d1505710f0faa21088b404a22 WatchSource:0}: Error finding container 58930e25f3cdb3e34321eee9b1e34d484523fb4d1505710f0faa21088b404a22: Status 404 returned error can't find the container with id 58930e25f3cdb3e34321eee9b1e34d484523fb4d1505710f0faa21088b404a22
Apr 16 19:17:54.195726 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.195711 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:17:54.264125 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:54.264084 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-123.ec2.internal\" not found"
Apr 16 19:17:54.364695 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:54.364643 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-123.ec2.internal\" not found"
Apr 16 19:17:54.373554 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.373529 2582 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:17:54.440421 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.440395 2582 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:17:54.449403 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.449347 2582 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal"
Apr 16 19:17:54.464489 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.464469 2582 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:17:54.465264 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.465244 2582 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-123.ec2.internal"
Apr 16 19:17:54.474901 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.474885 2582 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 19:17:54.555507 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.555483 2582 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:17:54.927610 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.927533 2582 apiserver.go:52] "Watching apiserver"
Apr 16 19:17:54.932297 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.932276 2582 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 19:17:54.934160 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.934120 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-s29hv","openshift-multus/network-metrics-daemon-lvp6d","kube-system/konnectivity-agent-lfd4r","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn","openshift-dns/node-resolver-bmngp","openshift-multus/multus-additional-cni-plugins-j5vpx","openshift-network-diagnostics/network-check-target-kk8zc","openshift-network-operator/iptables-alerter-7ddkt","openshift-ovn-kubernetes/ovnkube-node-4xbfq","kube-system/kube-apiserver-proxy-ip-10-0-128-123.ec2.internal","openshift-cluster-node-tuning-operator/tuned-t278p","openshift-image-registry/node-ca-mgz4l","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal"]
Apr 16 19:17:54.937725 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.937700 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc"
Apr 16 19:17:54.937832 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:54.937779 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d"
Apr 16 19:17:54.938880 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.938861 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d"
Apr 16 19:17:54.938969 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:54.938926 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvp6d" podUID="0fa55098-1c0e-4cf5-963c-602d47a411cc"
Apr 16 19:17:54.938969 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.938952 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lfd4r"
Apr 16 19:17:54.940263 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.940237 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn"
Apr 16 19:17:54.941023 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.941002 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 19:17:54.941266 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.941238 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 19:17:54.941371 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.941269 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-gcn7q\""
Apr 16 19:17:54.941550 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.941532 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bmngp"
Apr 16 19:17:54.942545 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.942323 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 19:17:54.942545 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.942428 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 19:17:54.942545 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.942470 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qxzft\""
Apr 16 19:17:54.942733 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.942608 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 19:17:54.943606 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.943174 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 19:17:54.943606 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.943354 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j5vpx"
Apr 16 19:17:54.943730 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.943608 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 19:17:54.943958 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.943937 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-jrghd\""
Apr 16 19:17:54.944780 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.944762 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s29hv"
Apr 16 19:17:54.945581 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.945411 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 19:17:54.945581 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.945412 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 19:17:54.945796 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.945720 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 19:17:54.945796 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.945763 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 19:17:54.945899 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.945725 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 19:17:54.945899 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.945862 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-bc9sd\""
Apr 16 19:17:54.946064 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.946043 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7ddkt"
Apr 16 19:17:54.946893 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.946809 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 19:17:54.946893 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.946826 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-fdrl6\""
Apr 16 19:17:54.948234 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.948082 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.948888 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.948867 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 19:17:54.948987 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.948914 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:17:54.948987 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.948952 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 19:17:54.948987 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.948955 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dmmlb\""
Apr 16 19:17:54.949531 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.949510 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-t278p"
Apr 16 19:17:54.950318 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.950297 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 19:17:54.951663 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.951097 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mgz4l"
Apr 16 19:17:54.951663 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.951662 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 19:17:54.951970 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.951945 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 19:17:54.952703 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.952681 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 19:17:54.952871 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.952857 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sh9hm\""
Apr 16 19:17:54.953052 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.953038 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 19:17:54.953219 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.953201 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 19:17:54.953328 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.953217 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:17:54.953328 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.953263 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 19:17:54.953650 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.953628 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 19:17:54.953755 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.953738 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 19:17:54.953904 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.953885 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dt27d\""
Apr 16 19:17:54.954591 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.954562 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fsnql\""
Apr 16 19:17:54.955344 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.955326 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 19:17:54.960055 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960034 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-host-var-lib-kubelet\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv"
Apr 16 19:17:54.960137 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960070 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-var-lib-openvswitch\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.960137 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960099 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-run-ovn\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.960137 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960123 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-cni-bin\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.960286 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960146 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-sysctl-conf\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p"
Apr 16 19:17:54.960334 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960290 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e449e07d-bc0e-4d5f-878f-d0f6299e1791-device-dir\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn"
Apr 16 19:17:54.960334 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960325 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c40d6168-44bb-4a03-9beb-7bf1152625f5-iptables-alerter-script\") pod \"iptables-alerter-7ddkt\" (UID: \"c40d6168-44bb-4a03-9beb-7bf1152625f5\") " pod="openshift-network-operator/iptables-alerter-7ddkt"
Apr 16 19:17:54.960434 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960374 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-system-cni-dir\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv"
Apr 16 19:17:54.960485 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960430 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-systemd-units\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.960485 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960463 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ltxw\" (UniqueName: \"kubernetes.io/projected/fb965cc4-1192-4694-81d4-b4802f0b6e56-kube-api-access-7ltxw\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx"
Apr 16 19:17:54.960580 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960491 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-os-release\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv"
Apr 16 19:17:54.960580 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960517 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-run-netns\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.960580 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960540 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-log-socket\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.960713 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960572 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/03e41e43-a8fe-424e-85ea-c86ea5b657e4-ovnkube-script-lib\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.960713 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960610 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-sysconfig\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p"
Apr 16 19:17:54.960713 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960630 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e449e07d-bc0e-4d5f-878f-d0f6299e1791-socket-dir\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn"
Apr 16 19:17:54.960713 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960661 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fb965cc4-1192-4694-81d4-b4802f0b6e56-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx"
Apr 16 19:17:54.960713 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960686 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff091581-6d2a-4584-b8a2-9f02cd7c342d-multus-daemon-config\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv"
Apr 16 19:17:54.960713 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960701 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-run-openvswitch\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.960975 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960720 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-systemd\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p"
Apr 16 19:17:54.960975 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960758 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fb965cc4-1192-4694-81d4-b4802f0b6e56-cni-binary-copy\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx"
Apr 16 19:17:54.960975 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960785 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fb965cc4-1192-4694-81d4-b4802f0b6e56-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx"
Apr 16 19:17:54.960975 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960812 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-slash\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.960975 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960839 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.960975 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960871 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cvtj\" (UniqueName: \"kubernetes.io/projected/03e41e43-a8fe-424e-85ea-c86ea5b657e4-kube-api-access-6cvtj\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.960975 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960902 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e449e07d-bc0e-4d5f-878f-d0f6299e1791-sys-fs\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn"
Apr 16 19:17:54.960975 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960917 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvvkc\" (UniqueName: \"kubernetes.io/projected/c40d6168-44bb-4a03-9beb-7bf1152625f5-kube-api-access-wvvkc\") pod \"iptables-alerter-7ddkt\" (UID: \"c40d6168-44bb-4a03-9beb-7bf1152625f5\") " pod="openshift-network-operator/iptables-alerter-7ddkt"
Apr 16 19:17:54.960975 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960940 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d0b93718-99a8-48ec-8713-62b20201de35-agent-certs\") pod \"konnectivity-agent-lfd4r\" (UID: \"d0b93718-99a8-48ec-8713-62b20201de35\") " pod="kube-system/konnectivity-agent-lfd4r"
Apr 16 19:17:54.960975 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960963 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-etc-openvswitch\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.961418 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.960988 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-cni-netd\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.961418 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961011 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-modprobe-d\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p"
Apr 16 19:17:54.961418 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961040 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e449e07d-bc0e-4d5f-878f-d0f6299e1791-etc-selinux\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn"
Apr 16 19:17:54.961418 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961065 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j6v2\" (UniqueName: \"kubernetes.io/projected/e449e07d-bc0e-4d5f-878f-d0f6299e1791-kube-api-access-6j6v2\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn"
Apr 16 19:17:54.961418 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961088 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-etc-kubernetes\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv"
Apr 16 19:17:54.961418 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961130 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/97d2de57-ec6a-4f59-985c-24aea83be3fd-hosts-file\") pod \"node-resolver-bmngp\" (UID: \"97d2de57-ec6a-4f59-985c-24aea83be3fd\") " pod="openshift-dns/node-resolver-bmngp"
Apr 16 19:17:54.961418 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961175 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-run-systemd\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.961418 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961211 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c40d6168-44bb-4a03-9beb-7bf1152625f5-host-slash\") pod \"iptables-alerter-7ddkt\" (UID: \"c40d6168-44bb-4a03-9beb-7bf1152625f5\") " pod="openshift-network-operator/iptables-alerter-7ddkt"
Apr 16 19:17:54.961418 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961233 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff091581-6d2a-4584-b8a2-9f02cd7c342d-cni-binary-copy\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv"
Apr 16 19:17:54.961418 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961267 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-hostroot\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv"
Apr 16 19:17:54.961418 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961281 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-kubelet\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.961418 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961298 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/03e41e43-a8fe-424e-85ea-c86ea5b657e4-ovn-node-metrics-cert\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.961418 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961316 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-sysctl-d\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p"
Apr 16 19:17:54.961418 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961338 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e449e07d-bc0e-4d5f-878f-d0f6299e1791-registration-dir\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn"
Apr 16 19:17:54.961418 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961355 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/03e41e43-a8fe-424e-85ea-c86ea5b657e4-env-overrides\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.961418 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961372 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-sys\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p"
Apr 16 19:17:54.962347 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961395 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-var-lib-kubelet\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p"
Apr 16 19:17:54.962347 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961409 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-tmp\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p"
Apr 16 19:17:54.962347 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961429 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-446g2\" (UniqueName: \"kubernetes.io/projected/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-kube-api-access-446g2\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p"
Apr 16 19:17:54.962347 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961450 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fb965cc4-1192-4694-81d4-b4802f0b6e56-os-release\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx"
Apr 16 19:17:54.962347 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961474 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fb965cc4-1192-4694-81d4-b4802f0b6e56-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx"
Apr 16 19:17:54.962347 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961500 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs\") pod \"network-metrics-daemon-lvp6d\" (UID: \"0fa55098-1c0e-4cf5-963c-602d47a411cc\") " pod="openshift-multus/network-metrics-daemon-lvp6d"
Apr 16 19:17:54.962347 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961520 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d0b93718-99a8-48ec-8713-62b20201de35-konnectivity-ca\") pod \"konnectivity-agent-lfd4r\" (UID: \"d0b93718-99a8-48ec-8713-62b20201de35\") " pod="kube-system/konnectivity-agent-lfd4r"
Apr 16 19:17:54.962347 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961534 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-host-run-k8s-cni-cncf-io\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv"
Apr 16 19:17:54.962347 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961549 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-run-ovn-kubernetes\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.962347 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961563 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-run\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p"
Apr 16 19:17:54.962347 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961582 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7caf9f3a-4884-4e15-b154-262d7a60b314-host\") pod \"node-ca-mgz4l\" (UID: \"7caf9f3a-4884-4e15-b154-262d7a60b314\") " pod="openshift-image-registry/node-ca-mgz4l"
Apr 16 19:17:54.962347 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961600 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkmgw\" (UniqueName: \"kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw\") pod \"network-check-target-kk8zc\" (UID: \"0ef22a96-6828-4636-8255-3aa3eaae036d\") " pod="openshift-network-diagnostics/network-check-target-kk8zc"
Apr 16 19:17:54.962347 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961627 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-host-var-lib-cni-multus\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv"
Apr 16 19:17:54.962347 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961676 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdmzf\" (UniqueName: \"kubernetes.io/projected/ff091581-6d2a-4584-b8a2-9f02cd7c342d-kube-api-access-zdmzf\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv"
Apr 16 19:17:54.962347 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961707 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e449e07d-bc0e-4d5f-878f-d0f6299e1791-kubelet-dir\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn"
Apr 16 19:17:54.962347 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961732 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/97d2de57-ec6a-4f59-985c-24aea83be3fd-tmp-dir\") pod \"node-resolver-bmngp\" (UID: \"97d2de57-ec6a-4f59-985c-24aea83be3fd\") " pod="openshift-dns/node-resolver-bmngp"
Apr 16 19:17:54.962930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961769 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-host-var-lib-cni-bin\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv"
Apr 16 19:17:54.962930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961802 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-host-run-multus-certs\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv"
Apr 16 19:17:54.962930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961847 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-node-log\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:17:54.962930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961902 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-tuned\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p"
Apr 16 19:17:54.962930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961933 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-multus-conf-dir\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv"
Apr 16 19:17:54.962930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961956 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-lib-modules\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p"
Apr 16 19:17:54.962930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.961998 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f68nn\" (UniqueName: \"kubernetes.io/projected/7caf9f3a-4884-4e15-b154-262d7a60b314-kube-api-access-f68nn\") pod \"node-ca-mgz4l\" (UID: \"7caf9f3a-4884-4e15-b154-262d7a60b314\") " pod="openshift-image-registry/node-ca-mgz4l"
Apr 16
19:17:54.962930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.962022 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxdcq\" (UniqueName: \"kubernetes.io/projected/97d2de57-ec6a-4f59-985c-24aea83be3fd-kube-api-access-lxdcq\") pod \"node-resolver-bmngp\" (UID: \"97d2de57-ec6a-4f59-985c-24aea83be3fd\") " pod="openshift-dns/node-resolver-bmngp" Apr 16 19:17:54.962930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.962062 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-cnibin\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:54.962930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.962089 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-kubernetes\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:54.962930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.962104 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7caf9f3a-4884-4e15-b154-262d7a60b314-serviceca\") pod \"node-ca-mgz4l\" (UID: \"7caf9f3a-4884-4e15-b154-262d7a60b314\") " pod="openshift-image-registry/node-ca-mgz4l" Apr 16 19:17:54.962930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.962137 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fb965cc4-1192-4694-81d4-b4802f0b6e56-cnibin\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx" Apr 16 19:17:54.962930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.962195 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkx8t\" (UniqueName: \"kubernetes.io/projected/0fa55098-1c0e-4cf5-963c-602d47a411cc-kube-api-access-bkx8t\") pod \"network-metrics-daemon-lvp6d\" (UID: \"0fa55098-1c0e-4cf5-963c-602d47a411cc\") " pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:17:54.962930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.962227 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-multus-cni-dir\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:54.962930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.962250 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-multus-socket-dir-parent\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:54.962930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.962274 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-host-run-netns\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:54.962930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.962302 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/03e41e43-a8fe-424e-85ea-c86ea5b657e4-ovnkube-config\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:54.963492 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.962345 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fb965cc4-1192-4694-81d4-b4802f0b6e56-system-cni-dir\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx" Apr 16 19:17:54.963492 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:54.962394 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-host\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.000267 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.000235 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:12:53 +0000 UTC" deadline="2027-09-26 18:51:22.664924178 +0000 UTC" Apr 16 19:17:55.000267 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.000265 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12671h33m27.664662419s" Apr 16 19:17:55.050343 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.050314 2582 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 19:17:55.063273 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.063244 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e449e07d-bc0e-4d5f-878f-d0f6299e1791-registration-dir\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" Apr 16 19:17:55.063416 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.063287 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/03e41e43-a8fe-424e-85ea-c86ea5b657e4-env-overrides\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.063416 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.063312 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-sys\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.063416 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.063338 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-var-lib-kubelet\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.063416 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.063362 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-tmp\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.063416 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.063386 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-446g2\" (UniqueName: \"kubernetes.io/projected/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-kube-api-access-446g2\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.063416 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.063414 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fb965cc4-1192-4694-81d4-b4802f0b6e56-os-release\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx" Apr 16 19:17:55.063680 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.063439 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fb965cc4-1192-4694-81d4-b4802f0b6e56-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx" Apr 16 19:17:55.063680 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.063467 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs\") pod \"network-metrics-daemon-lvp6d\" (UID: \"0fa55098-1c0e-4cf5-963c-602d47a411cc\") " pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:17:55.063680 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.063494 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d0b93718-99a8-48ec-8713-62b20201de35-konnectivity-ca\") pod \"konnectivity-agent-lfd4r\" (UID: \"d0b93718-99a8-48ec-8713-62b20201de35\") " pod="kube-system/konnectivity-agent-lfd4r" Apr 16 19:17:55.063680 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.063518 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-host-run-k8s-cni-cncf-io\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.063680 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.063582 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-host-run-k8s-cni-cncf-io\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.063900 ip-10-0-128-123 
kubenswrapper[2582]: I0416 19:17:55.063865 2582 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 19:17:55.064307 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064087 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fb965cc4-1192-4694-81d4-b4802f0b6e56-os-release\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx" Apr 16 19:17:55.064307 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064177 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-sys\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.064307 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064231 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e449e07d-bc0e-4d5f-878f-d0f6299e1791-registration-dir\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" Apr 16 19:17:55.064307 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064284 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-var-lib-kubelet\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.064307 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064300 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-run-ovn-kubernetes\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.064645 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064331 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-run\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.064645 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064366 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7caf9f3a-4884-4e15-b154-262d7a60b314-host\") pod \"node-ca-mgz4l\" (UID: \"7caf9f3a-4884-4e15-b154-262d7a60b314\") " pod="openshift-image-registry/node-ca-mgz4l" Apr 16 19:17:55.064645 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064391 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkmgw\" (UniqueName: \"kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw\") pod \"network-check-target-kk8zc\" (UID: \"0ef22a96-6828-4636-8255-3aa3eaae036d\") " pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:17:55.064645 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064402 
2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fb965cc4-1192-4694-81d4-b4802f0b6e56-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx" Apr 16 19:17:55.064645 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064410 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d0b93718-99a8-48ec-8713-62b20201de35-konnectivity-ca\") pod \"konnectivity-agent-lfd4r\" (UID: \"d0b93718-99a8-48ec-8713-62b20201de35\") " pod="kube-system/konnectivity-agent-lfd4r" Apr 16 19:17:55.064645 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064418 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-host-var-lib-cni-multus\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.064645 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064467 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdmzf\" (UniqueName: \"kubernetes.io/projected/ff091581-6d2a-4584-b8a2-9f02cd7c342d-kube-api-access-zdmzf\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.064645 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064489 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/03e41e43-a8fe-424e-85ea-c86ea5b657e4-env-overrides\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.064645 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064513 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e449e07d-bc0e-4d5f-878f-d0f6299e1791-kubelet-dir\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" Apr 16 19:17:55.064645 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:55.064532 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:17:55.064645 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064540 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/97d2de57-ec6a-4f59-985c-24aea83be3fd-tmp-dir\") pod \"node-resolver-bmngp\" (UID: \"97d2de57-ec6a-4f59-985c-24aea83be3fd\") " pod="openshift-dns/node-resolver-bmngp" Apr 16 19:17:55.064645 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064565 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-host-var-lib-cni-bin\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.064645 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064568 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/7caf9f3a-4884-4e15-b154-262d7a60b314-host\") pod \"node-ca-mgz4l\" (UID: \"7caf9f3a-4884-4e15-b154-262d7a60b314\") " pod="openshift-image-registry/node-ca-mgz4l" Apr 16 19:17:55.064645 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:55.064601 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs podName:0fa55098-1c0e-4cf5-963c-602d47a411cc nodeName:}" failed. No retries permitted until 2026-04-16 19:17:55.564571986 +0000 UTC m=+3.059162584 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs") pod "network-metrics-daemon-lvp6d" (UID: "0fa55098-1c0e-4cf5-963c-602d47a411cc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:17:55.064645 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064599 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-run-ovn-kubernetes\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.064645 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064613 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-host-var-lib-cni-bin\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.064645 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064624 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-run\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.065429 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064647 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-host-run-multus-certs\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.065429 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064672 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-node-log\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.065429 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064698 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-tuned\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.065429 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064722 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-multus-conf-dir\") pod \"multus-s29hv\" (UID: 
\"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.065429 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064746 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-lib-modules\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.065429 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064771 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f68nn\" (UniqueName: \"kubernetes.io/projected/7caf9f3a-4884-4e15-b154-262d7a60b314-kube-api-access-f68nn\") pod \"node-ca-mgz4l\" (UID: \"7caf9f3a-4884-4e15-b154-262d7a60b314\") " pod="openshift-image-registry/node-ca-mgz4l" Apr 16 19:17:55.065429 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064793 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-host-run-multus-certs\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.065429 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064797 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxdcq\" (UniqueName: \"kubernetes.io/projected/97d2de57-ec6a-4f59-985c-24aea83be3fd-kube-api-access-lxdcq\") pod \"node-resolver-bmngp\" (UID: \"97d2de57-ec6a-4f59-985c-24aea83be3fd\") " pod="openshift-dns/node-resolver-bmngp" Apr 16 19:17:55.065429 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064848 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-cnibin\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.065429 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064849 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e449e07d-bc0e-4d5f-878f-d0f6299e1791-kubelet-dir\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" Apr 16 19:17:55.065429 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064881 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-host-var-lib-cni-multus\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.065429 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064909 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-cnibin\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.065429 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064991 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-lib-modules\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " 
pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.065429 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.064998 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-node-log\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.065429 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065021 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-multus-conf-dir\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.065429 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065050 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-kubernetes\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.065429 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065066 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7caf9f3a-4884-4e15-b154-262d7a60b314-serviceca\") pod \"node-ca-mgz4l\" (UID: \"7caf9f3a-4884-4e15-b154-262d7a60b314\") " pod="openshift-image-registry/node-ca-mgz4l" Apr 16 19:17:55.065429 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065081 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fb965cc4-1192-4694-81d4-b4802f0b6e56-cnibin\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx" Apr 16 19:17:55.066292 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065098 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkx8t\" (UniqueName: \"kubernetes.io/projected/0fa55098-1c0e-4cf5-963c-602d47a411cc-kube-api-access-bkx8t\") pod \"network-metrics-daemon-lvp6d\" (UID: \"0fa55098-1c0e-4cf5-963c-602d47a411cc\") " pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:17:55.066292 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065113 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-multus-cni-dir\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.066292 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065127 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-multus-socket-dir-parent\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.066292 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065142 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-host-run-netns\") pod \"multus-s29hv\" (UID: 
\"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.066292 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065141 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/97d2de57-ec6a-4f59-985c-24aea83be3fd-tmp-dir\") pod \"node-resolver-bmngp\" (UID: \"97d2de57-ec6a-4f59-985c-24aea83be3fd\") " pod="openshift-dns/node-resolver-bmngp" Apr 16 19:17:55.066292 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065179 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/03e41e43-a8fe-424e-85ea-c86ea5b657e4-ovnkube-config\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.066292 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065209 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fb965cc4-1192-4694-81d4-b4802f0b6e56-system-cni-dir\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx" Apr 16 19:17:55.066292 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065232 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-host\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.066292 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065273 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-host-var-lib-kubelet\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.066292 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065298 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-var-lib-openvswitch\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.066292 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065323 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-run-ovn\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.066292 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065347 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-cni-bin\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.066292 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065373 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-sysctl-conf\") pod 
\"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.066292 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065394 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fb965cc4-1192-4694-81d4-b4802f0b6e56-cnibin\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx" Apr 16 19:17:55.066292 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065398 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e449e07d-bc0e-4d5f-878f-d0f6299e1791-device-dir\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" Apr 16 19:17:55.066292 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065434 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c40d6168-44bb-4a03-9beb-7bf1152625f5-iptables-alerter-script\") pod \"iptables-alerter-7ddkt\" (UID: \"c40d6168-44bb-4a03-9beb-7bf1152625f5\") " pod="openshift-network-operator/iptables-alerter-7ddkt" Apr 16 19:17:55.066292 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065445 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e449e07d-bc0e-4d5f-878f-d0f6299e1791-device-dir\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" Apr 16 19:17:55.067120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065451 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-system-cni-dir\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.067120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065467 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-systemd-units\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.067120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065485 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ltxw\" (UniqueName: \"kubernetes.io/projected/fb965cc4-1192-4694-81d4-b4802f0b6e56-kube-api-access-7ltxw\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx" Apr 16 19:17:55.067120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065501 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-os-release\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.067120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065503 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-multus-cni-dir\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.067120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065517 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-run-netns\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.067120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065544 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-log-socket\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.067120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065557 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-multus-socket-dir-parent\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.067120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065570 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/03e41e43-a8fe-424e-85ea-c86ea5b657e4-ovnkube-script-lib\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.067120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065582 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7caf9f3a-4884-4e15-b154-262d7a60b314-serviceca\") pod \"node-ca-mgz4l\" (UID: \"7caf9f3a-4884-4e15-b154-262d7a60b314\") " pod="openshift-image-registry/node-ca-mgz4l" Apr 16 19:17:55.067120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065595 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-sysconfig\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.067120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065599 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-host-run-netns\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.067120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065622 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e449e07d-bc0e-4d5f-878f-d0f6299e1791-socket-dir\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" Apr 16 19:17:55.067120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065648 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fb965cc4-1192-4694-81d4-b4802f0b6e56-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx" Apr 16 19:17:55.067120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065676 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff091581-6d2a-4584-b8a2-9f02cd7c342d-multus-daemon-config\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.067120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065692 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-var-lib-openvswitch\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.067120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065700 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-run-openvswitch\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.068057 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065739 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fb965cc4-1192-4694-81d4-b4802f0b6e56-system-cni-dir\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx" Apr 16 19:17:55.068057 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065747 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-run-openvswitch\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.068057 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065767 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-systemd\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.068057 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065797 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fb965cc4-1192-4694-81d4-b4802f0b6e56-cni-binary-copy\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx" Apr 16 19:17:55.068057 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065821 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fb965cc4-1192-4694-81d4-b4802f0b6e56-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " 
pod="openshift-multus/multus-additional-cni-plugins-j5vpx" Apr 16 19:17:55.068057 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065867 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-slash\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.068057 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065895 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.068057 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065925 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cvtj\" (UniqueName: \"kubernetes.io/projected/03e41e43-a8fe-424e-85ea-c86ea5b657e4-kube-api-access-6cvtj\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.068057 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065954 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e449e07d-bc0e-4d5f-878f-d0f6299e1791-sys-fs\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" Apr 16 19:17:55.068057 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065980 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wvvkc\" (UniqueName: \"kubernetes.io/projected/c40d6168-44bb-4a03-9beb-7bf1152625f5-kube-api-access-wvvkc\") pod \"iptables-alerter-7ddkt\" (UID: \"c40d6168-44bb-4a03-9beb-7bf1152625f5\") " pod="openshift-network-operator/iptables-alerter-7ddkt" Apr 16 19:17:55.068057 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066005 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d0b93718-99a8-48ec-8713-62b20201de35-agent-certs\") pod \"konnectivity-agent-lfd4r\" (UID: \"d0b93718-99a8-48ec-8713-62b20201de35\") " pod="kube-system/konnectivity-agent-lfd4r" Apr 16 19:17:55.068057 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066030 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-etc-openvswitch\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.068057 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066057 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/03e41e43-a8fe-424e-85ea-c86ea5b657e4-ovnkube-config\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.068057 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066108 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-host\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.068057 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066112 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-cni-netd\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.068057 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066166 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-host-var-lib-kubelet\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.068057 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066176 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-run-ovn\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.068754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066212 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-systemd\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.068754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066216 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-cni-bin\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.068754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066061 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-cni-netd\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.068754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066247 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c40d6168-44bb-4a03-9beb-7bf1152625f5-iptables-alerter-script\") pod \"iptables-alerter-7ddkt\" (UID: \"c40d6168-44bb-4a03-9beb-7bf1152625f5\") " pod="openshift-network-operator/iptables-alerter-7ddkt" Apr 16 19:17:55.068754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066262 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-modprobe-d\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.068754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066290 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" 
(UniqueName: \"kubernetes.io/host-path/e449e07d-bc0e-4d5f-878f-d0f6299e1791-etc-selinux\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" Apr 16 19:17:55.068754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066317 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-system-cni-dir\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.068754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066318 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j6v2\" (UniqueName: \"kubernetes.io/projected/e449e07d-bc0e-4d5f-878f-d0f6299e1791-kube-api-access-6j6v2\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" Apr 16 19:17:55.068754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066359 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-etc-kubernetes\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.068754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066383 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/97d2de57-ec6a-4f59-985c-24aea83be3fd-hosts-file\") pod \"node-resolver-bmngp\" (UID: \"97d2de57-ec6a-4f59-985c-24aea83be3fd\") " pod="openshift-dns/node-resolver-bmngp" Apr 16 19:17:55.068754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066430 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-run-systemd\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.068754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066455 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c40d6168-44bb-4a03-9beb-7bf1152625f5-host-slash\") pod \"iptables-alerter-7ddkt\" (UID: \"c40d6168-44bb-4a03-9beb-7bf1152625f5\") " pod="openshift-network-operator/iptables-alerter-7ddkt" Apr 16 19:17:55.068754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066482 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff091581-6d2a-4584-b8a2-9f02cd7c342d-cni-binary-copy\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.068754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066507 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-hostroot\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.068754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066533 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-kubelet\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.068754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066558 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/03e41e43-a8fe-424e-85ea-c86ea5b657e4-ovn-node-metrics-cert\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.068754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066583 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-sysctl-d\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.069457 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066626 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e449e07d-bc0e-4d5f-878f-d0f6299e1791-etc-selinux\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" Apr 16 19:17:55.069457 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066584 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-modprobe-d\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.069457 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066669 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-systemd-units\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.069457 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066799 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fb965cc4-1192-4694-81d4-b4802f0b6e56-cni-binary-copy\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx" Apr 16 19:17:55.069457 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066897 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e449e07d-bc0e-4d5f-878f-d0f6299e1791-sys-fs\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" Apr 16 19:17:55.069457 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066915 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fb965cc4-1192-4694-81d4-b4802f0b6e56-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx" Apr 16 19:17:55.069457 ip-10-0-128-123 
kubenswrapper[2582]: I0416 19:17:55.066920 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-os-release\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.069457 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066974 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-run-netns\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.069457 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066985 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.069457 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.067024 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-slash\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.069457 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.067040 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-etc-kubernetes\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.069457 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.067069 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/97d2de57-ec6a-4f59-985c-24aea83be3fd-hosts-file\") pod \"node-resolver-bmngp\" (UID: \"97d2de57-ec6a-4f59-985c-24aea83be3fd\") " pod="openshift-dns/node-resolver-bmngp" Apr 16 19:17:55.069457 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.067082 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-log-socket\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.069457 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.067104 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-run-systemd\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.069457 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.067145 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c40d6168-44bb-4a03-9beb-7bf1152625f5-host-slash\") pod \"iptables-alerter-7ddkt\" (UID: \"c40d6168-44bb-4a03-9beb-7bf1152625f5\") " pod="openshift-network-operator/iptables-alerter-7ddkt" Apr 16 19:17:55.069457 ip-10-0-128-123 kubenswrapper[2582]: I0416 
19:17:55.067585 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/03e41e43-a8fe-424e-85ea-c86ea5b657e4-ovnkube-script-lib\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.069457 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.066327 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-sysctl-conf\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.069457 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.067648 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ff091581-6d2a-4584-b8a2-9f02cd7c342d-hostroot\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.070253 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.067762 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-sysctl-d\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.070253 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.067797 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-etc-openvswitch\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.070253 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.067840 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/03e41e43-a8fe-424e-85ea-c86ea5b657e4-host-kubelet\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.070253 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.067891 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-sysconfig\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.070253 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.068029 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-tuned\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.070253 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.065647 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-etc-kubernetes\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.070253 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.068170 2582 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e449e07d-bc0e-4d5f-878f-d0f6299e1791-socket-dir\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" Apr 16 19:17:55.070253 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.068281 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ff091581-6d2a-4584-b8a2-9f02cd7c342d-cni-binary-copy\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.070253 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.068287 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/fb965cc4-1192-4694-81d4-b4802f0b6e56-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx" Apr 16 19:17:55.070253 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.068312 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-tmp\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.070253 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.068867 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ff091581-6d2a-4584-b8a2-9f02cd7c342d-multus-daemon-config\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.071060 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.071038 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/03e41e43-a8fe-424e-85ea-c86ea5b657e4-ovn-node-metrics-cert\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.071269 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:55.071098 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:17:55.071390 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:55.071376 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:17:55.071489 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:55.071479 2582 projected.go:194] Error preparing data for projected volume kube-api-access-qkmgw for pod openshift-network-diagnostics/network-check-target-kk8zc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:17:55.071622 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:55.071611 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw podName:0ef22a96-6828-4636-8255-3aa3eaae036d nodeName:}" failed. 
No retries permitted until 2026-04-16 19:17:55.571593353 +0000 UTC m=+3.066183963 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-qkmgw" (UniqueName: "kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw") pod "network-check-target-kk8zc" (UID: "0ef22a96-6828-4636-8255-3aa3eaae036d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:17:55.076894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.075208 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d0b93718-99a8-48ec-8713-62b20201de35-agent-certs\") pod \"konnectivity-agent-lfd4r\" (UID: \"d0b93718-99a8-48ec-8713-62b20201de35\") " pod="kube-system/konnectivity-agent-lfd4r" Apr 16 19:17:55.076894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.076788 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-446g2\" (UniqueName: \"kubernetes.io/projected/d0fe97f4-07e6-4acf-bc9c-a809afa706ad-kube-api-access-446g2\") pod \"tuned-t278p\" (UID: \"d0fe97f4-07e6-4acf-bc9c-a809afa706ad\") " pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.076894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.076849 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkx8t\" (UniqueName: \"kubernetes.io/projected/0fa55098-1c0e-4cf5-963c-602d47a411cc-kube-api-access-bkx8t\") pod \"network-metrics-daemon-lvp6d\" (UID: \"0fa55098-1c0e-4cf5-963c-602d47a411cc\") " pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:17:55.078286 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.077887 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f68nn\" (UniqueName: \"kubernetes.io/projected/7caf9f3a-4884-4e15-b154-262d7a60b314-kube-api-access-f68nn\") pod \"node-ca-mgz4l\" (UID: \"7caf9f3a-4884-4e15-b154-262d7a60b314\") " pod="openshift-image-registry/node-ca-mgz4l" Apr 16 19:17:55.082960 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.082530 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdmzf\" (UniqueName: \"kubernetes.io/projected/ff091581-6d2a-4584-b8a2-9f02cd7c342d-kube-api-access-zdmzf\") pod \"multus-s29hv\" (UID: \"ff091581-6d2a-4584-b8a2-9f02cd7c342d\") " pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.082960 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.082667 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j6v2\" (UniqueName: \"kubernetes.io/projected/e449e07d-bc0e-4d5f-878f-d0f6299e1791-kube-api-access-6j6v2\") pod \"aws-ebs-csi-driver-node-swbsn\" (UID: \"e449e07d-bc0e-4d5f-878f-d0f6299e1791\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" Apr 16 19:17:55.082960 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.082668 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cvtj\" (UniqueName: \"kubernetes.io/projected/03e41e43-a8fe-424e-85ea-c86ea5b657e4-kube-api-access-6cvtj\") pod \"ovnkube-node-4xbfq\" (UID: \"03e41e43-a8fe-424e-85ea-c86ea5b657e4\") " pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.083199 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.083079 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ltxw\" 
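The kube-api-access-* mounts above are projected service-account volumes: the kubelet assembles a bound token, the namespace's kube-root-ca.crt ConfigMap (plus openshift-service-ca.crt on OpenShift), and the namespace name into a single directory. The projected.go errors for network-check-target-kk8zc are that assembly step failing: until the kubelet's object managers have registered those ConfigMaps for the pod, the volume cannot be built and the operation is queued for retry. A minimal sketch of the volume's shape using k8s.io/api/core/v1 types; the volume and ConfigMap names are taken from the log, while the 3607s expiry and file paths are typical defaults assumed here, not read from this cluster:

```go
// Sketch: the shape of a kube-api-access-* projected volume.
package main

import (
	"encoding/json"
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	expiry := int64(3607) // kubelet's usual token lifetime (assumed)
	vol := corev1.Volume{
		Name: "kube-api-access-qkmgw",
		VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{
				Sources: []corev1.VolumeProjection{
					// Bound service-account token.
					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{
						Path:              "token",
						ExpirationSeconds: &expiry,
					}},
					// The ConfigMap that projected.go:289 could not find
					// ("kube-root-ca.crt" not yet registered for the namespace).
					{ConfigMap: &corev1.ConfigMapProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
						Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
					}},
					// Namespace name via the downward API.
					{DownwardAPI: &corev1.DownwardAPIProjection{
						Items: []corev1.DownwardAPIVolumeFile{{
							Path:     "namespace",
							FieldRef: &corev1.ObjectFieldSelector{FieldPath: "metadata.namespace"},
						}},
					}},
				},
			},
		},
	}
	out, _ := json.MarshalIndent(vol, "", "  ")
	fmt.Println(string(out))
}
```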
(UniqueName: \"kubernetes.io/projected/fb965cc4-1192-4694-81d4-b4802f0b6e56-kube-api-access-7ltxw\") pod \"multus-additional-cni-plugins-j5vpx\" (UID: \"fb965cc4-1192-4694-81d4-b4802f0b6e56\") " pod="openshift-multus/multus-additional-cni-plugins-j5vpx" Apr 16 19:17:55.083199 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.083112 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxdcq\" (UniqueName: \"kubernetes.io/projected/97d2de57-ec6a-4f59-985c-24aea83be3fd-kube-api-access-lxdcq\") pod \"node-resolver-bmngp\" (UID: \"97d2de57-ec6a-4f59-985c-24aea83be3fd\") " pod="openshift-dns/node-resolver-bmngp" Apr 16 19:17:55.084498 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.084430 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvvkc\" (UniqueName: \"kubernetes.io/projected/c40d6168-44bb-4a03-9beb-7bf1152625f5-kube-api-access-wvvkc\") pod \"iptables-alerter-7ddkt\" (UID: \"c40d6168-44bb-4a03-9beb-7bf1152625f5\") " pod="openshift-network-operator/iptables-alerter-7ddkt" Apr 16 19:17:55.115207 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.115139 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-123.ec2.internal" event={"ID":"4602511561889ed4b3b1e98e97d43dc5","Type":"ContainerStarted","Data":"717bea3ad2ef9a1e65ae8909e2460a159c7909e50cccfb0fbc99efc227bc6e7a"} Apr 16 19:17:55.116611 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.116590 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal" event={"ID":"af059cf061db47d5f95bfb0c454f454b","Type":"ContainerStarted","Data":"58930e25f3cdb3e34321eee9b1e34d484523fb4d1505710f0faa21088b404a22"} Apr 16 19:17:55.171581 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.171542 2582 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:17:55.250590 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.250555 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-lfd4r" Apr 16 19:17:55.260793 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.260533 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" Apr 16 19:17:55.270164 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.269908 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bmngp" Apr 16 19:17:55.271693 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:55.271668 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode449e07d_bc0e_4d5f_878f_d0f6299e1791.slice/crio-8f908d5b042dcd43b8a6ac6f2429c6c00b0b3355038ad4c19652e30f88a50e12 WatchSource:0}: Error finding container 8f908d5b042dcd43b8a6ac6f2429c6c00b0b3355038ad4c19652e30f88a50e12: Status 404 returned error can't find the container with id 8f908d5b042dcd43b8a6ac6f2429c6c00b0b3355038ad4c19652e30f88a50e12 Apr 16 19:17:55.276624 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.276596 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j5vpx" Apr 16 19:17:55.277834 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:55.277775 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97d2de57_ec6a_4f59_985c_24aea83be3fd.slice/crio-efedaef0fd302e86be15d579ce5c0423399b6f19c0bd4aed12b04020dcbf3e7e WatchSource:0}: Error finding container efedaef0fd302e86be15d579ce5c0423399b6f19c0bd4aed12b04020dcbf3e7e: Status 404 returned error can't find the container with id efedaef0fd302e86be15d579ce5c0423399b6f19c0bd4aed12b04020dcbf3e7e Apr 16 19:17:55.284005 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.283980 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-s29hv" Apr 16 19:17:55.286383 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:55.286350 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb965cc4_1192_4694_81d4_b4802f0b6e56.slice/crio-4edead6cc6dce2be0b0cf9263426884895aa7a96923d3c03c14e6468e3e860e8 WatchSource:0}: Error finding container 4edead6cc6dce2be0b0cf9263426884895aa7a96923d3c03c14e6468e3e860e8: Status 404 returned error can't find the container with id 4edead6cc6dce2be0b0cf9263426884895aa7a96923d3c03c14e6468e3e860e8 Apr 16 19:17:55.292093 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:55.292069 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff091581_6d2a_4584_b8a2_9f02cd7c342d.slice/crio-033774e42301b561b8f5c8b31b3459441db34c08394bc4b165a1e18993b41bbb WatchSource:0}: Error finding container 033774e42301b561b8f5c8b31b3459441db34c08394bc4b165a1e18993b41bbb: Status 404 returned error can't find the container with id 033774e42301b561b8f5c8b31b3459441db34c08394bc4b165a1e18993b41bbb Apr 16 19:17:55.294067 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.293827 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7ddkt" Apr 16 19:17:55.299840 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.299816 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:17:55.301578 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:55.301556 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc40d6168_44bb_4a03_9beb_7bf1152625f5.slice/crio-8d8fa16c3604e90a4728f7902ac80a15ef42258c3bc3548705b6328bbd25f1d6 WatchSource:0}: Error finding container 8d8fa16c3604e90a4728f7902ac80a15ef42258c3bc3548705b6328bbd25f1d6: Status 404 returned error can't find the container with id 8d8fa16c3604e90a4728f7902ac80a15ef42258c3bc3548705b6328bbd25f1d6 Apr 16 19:17:55.308739 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.308503 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-t278p" Apr 16 19:17:55.309308 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:55.309287 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03e41e43_a8fe_424e_85ea_c86ea5b657e4.slice/crio-714052d1d64455cef20351f550ced30833d19a575dd42e33bb49a2c311ee35c1 WatchSource:0}: Error finding container 714052d1d64455cef20351f550ced30833d19a575dd42e33bb49a2c311ee35c1: Status 404 returned error can't find the container with id 714052d1d64455cef20351f550ced30833d19a575dd42e33bb49a2c311ee35c1 Apr 16 19:17:55.313894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.313873 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mgz4l" Apr 16 19:17:55.320005 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:55.319951 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0fe97f4_07e6_4acf_bc9c_a809afa706ad.slice/crio-d8715009f4fb3766b5a381ca9498e3f0ac1e24af82c28677e2941d6e1b06aa91 WatchSource:0}: Error finding container d8715009f4fb3766b5a381ca9498e3f0ac1e24af82c28677e2941d6e1b06aa91: Status 404 returned error can't find the container with id d8715009f4fb3766b5a381ca9498e3f0ac1e24af82c28677e2941d6e1b06aa91 Apr 16 19:17:55.323857 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:17:55.323829 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7caf9f3a_4884_4e15_b154_262d7a60b314.slice/crio-17a1c92abb1fb749e42d41c83ccc535b7912f6bc6a6b905b37d1a40e1a9bc46a WatchSource:0}: Error finding container 17a1c92abb1fb749e42d41c83ccc535b7912f6bc6a6b905b37d1a40e1a9bc46a: Status 404 returned error can't find the container with id 17a1c92abb1fb749e42d41c83ccc535b7912f6bc6a6b905b37d1a40e1a9bc46a Apr 16 19:17:55.570303 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.570202 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs\") pod \"network-metrics-daemon-lvp6d\" (UID: \"0fa55098-1c0e-4cf5-963c-602d47a411cc\") " pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:17:55.570440 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:55.570346 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:17:55.570440 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:55.570417 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs podName:0fa55098-1c0e-4cf5-963c-602d47a411cc nodeName:}" failed. No retries permitted until 2026-04-16 19:17:56.570401508 +0000 UTC m=+4.064992106 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs") pod "network-metrics-daemon-lvp6d" (UID: "0fa55098-1c0e-4cf5-963c-602d47a411cc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:17:55.671052 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:55.671015 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkmgw\" (UniqueName: \"kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw\") pod \"network-check-target-kk8zc\" (UID: \"0ef22a96-6828-4636-8255-3aa3eaae036d\") " pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:17:55.671216 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:55.671181 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:17:55.671216 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:55.671202 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:17:55.671216 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:55.671212 2582 projected.go:194] Error preparing data for projected volume kube-api-access-qkmgw for pod openshift-network-diagnostics/network-check-target-kk8zc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:17:55.671329 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:55.671263 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw podName:0ef22a96-6828-4636-8255-3aa3eaae036d nodeName:}" failed. No retries permitted until 2026-04-16 19:17:56.671249297 +0000 UTC m=+4.165839891 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qkmgw" (UniqueName: "kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw") pod "network-check-target-kk8zc" (UID: "0ef22a96-6828-4636-8255-3aa3eaae036d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:17:56.001352 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:56.001310 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:12:53 +0000 UTC" deadline="2028-01-28 01:00:18.412953092 +0000 UTC" Apr 16 19:17:56.001352 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:56.001345 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15629h42m22.411610392s" Apr 16 19:17:56.119644 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:56.119608 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mgz4l" event={"ID":"7caf9f3a-4884-4e15-b154-262d7a60b314","Type":"ContainerStarted","Data":"17a1c92abb1fb749e42d41c83ccc535b7912f6bc6a6b905b37d1a40e1a9bc46a"} Apr 16 19:17:56.121343 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:56.120953 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bmngp" event={"ID":"97d2de57-ec6a-4f59-985c-24aea83be3fd","Type":"ContainerStarted","Data":"efedaef0fd302e86be15d579ce5c0423399b6f19c0bd4aed12b04020dcbf3e7e"} Apr 16 19:17:56.123269 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:56.123240 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" event={"ID":"e449e07d-bc0e-4d5f-878f-d0f6299e1791","Type":"ContainerStarted","Data":"8f908d5b042dcd43b8a6ac6f2429c6c00b0b3355038ad4c19652e30f88a50e12"} Apr 16 19:17:56.124356 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:56.124335 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-t278p" event={"ID":"d0fe97f4-07e6-4acf-bc9c-a809afa706ad","Type":"ContainerStarted","Data":"d8715009f4fb3766b5a381ca9498e3f0ac1e24af82c28677e2941d6e1b06aa91"} Apr 16 19:17:56.125328 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:56.125292 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" event={"ID":"03e41e43-a8fe-424e-85ea-c86ea5b657e4","Type":"ContainerStarted","Data":"714052d1d64455cef20351f550ced30833d19a575dd42e33bb49a2c311ee35c1"} Apr 16 19:17:56.127289 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:56.127245 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7ddkt" event={"ID":"c40d6168-44bb-4a03-9beb-7bf1152625f5","Type":"ContainerStarted","Data":"8d8fa16c3604e90a4728f7902ac80a15ef42258c3bc3548705b6328bbd25f1d6"} Apr 16 19:17:56.129015 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:56.128706 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s29hv" event={"ID":"ff091581-6d2a-4584-b8a2-9f02cd7c342d","Type":"ContainerStarted","Data":"033774e42301b561b8f5c8b31b3459441db34c08394bc4b165a1e18993b41bbb"} Apr 16 19:17:56.130351 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:56.130298 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j5vpx" 
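The certificate_manager lines above compute their own schedule: client-go rotates the kubelet-serving certificate at a jittered deadline roughly 70-90% of the way through the certificate's lifetime, then sleeps until then. The log is consistent with that rule: a certificate expiring 2028-04-15 (about a two-year lifetime) gets a deadline of 2028-01-28, around 89% in, and the sleep of 15629h42m is simply deadline minus now. A sketch of the deadline computation; the 70-90% jitter mirrors the documented behavior of k8s.io/client-go/util/certificate, and the assumed issue time is back-computed from the logged expiration:

```go
// Sketch of the rule behind "Certificate rotation deadline determined":
// pick a point 70-90% of the way through the cert's validity, sleep until it.
package main

import (
	"fmt"
	"math/rand"
	"time"
)

func nextRotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	// 80% of the lifetime +/- 10% jitter => somewhere in [70%, 90%].
	fraction := 0.7 + 0.2*rand.Float64()
	return notBefore.Add(time.Duration(float64(total) * fraction))
}

func main() {
	// Assumed issue time: logged expiration minus ~2 years.
	notBefore := time.Date(2026, 4, 16, 19, 12, 53, 0, time.UTC)
	notAfter := time.Date(2028, 4, 15, 19, 12, 53, 0, time.UTC) // from the log

	deadline := nextRotationDeadline(notBefore, notAfter)
	now := time.Date(2026, 4, 16, 19, 17, 56, 0, time.UTC) // log timestamp
	fmt.Println("deadline:", deadline, "sleep:", deadline.Sub(now))
}
```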
event={"ID":"fb965cc4-1192-4694-81d4-b4802f0b6e56","Type":"ContainerStarted","Data":"4edead6cc6dce2be0b0cf9263426884895aa7a96923d3c03c14e6468e3e860e8"} Apr 16 19:17:56.131720 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:56.131699 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lfd4r" event={"ID":"d0b93718-99a8-48ec-8713-62b20201de35","Type":"ContainerStarted","Data":"d5162d9b12c0ee78615df545f2606aa430810301df0c1977bcffc4a202de9b32"} Apr 16 19:17:56.576108 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:56.576064 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs\") pod \"network-metrics-daemon-lvp6d\" (UID: \"0fa55098-1c0e-4cf5-963c-602d47a411cc\") " pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:17:56.576284 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:56.576267 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:17:56.576355 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:56.576334 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs podName:0fa55098-1c0e-4cf5-963c-602d47a411cc nodeName:}" failed. No retries permitted until 2026-04-16 19:17:58.576315214 +0000 UTC m=+6.070905809 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs") pod "network-metrics-daemon-lvp6d" (UID: "0fa55098-1c0e-4cf5-963c-602d47a411cc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:17:56.676623 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:56.676507 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkmgw\" (UniqueName: \"kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw\") pod \"network-check-target-kk8zc\" (UID: \"0ef22a96-6828-4636-8255-3aa3eaae036d\") " pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:17:56.676783 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:56.676693 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:17:56.676783 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:56.676725 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:17:56.676783 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:56.676740 2582 projected.go:194] Error preparing data for projected volume kube-api-access-qkmgw for pod openshift-network-diagnostics/network-check-target-kk8zc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:17:56.676930 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:56.676818 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw podName:0ef22a96-6828-4636-8255-3aa3eaae036d nodeName:}" failed. No retries permitted until 2026-04-16 19:17:58.676787387 +0000 UTC m=+6.171377996 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qkmgw" (UniqueName: "kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw") pod "network-check-target-kk8zc" (UID: "0ef22a96-6828-4636-8255-3aa3eaae036d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:17:57.110820 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:57.110784 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:17:57.111306 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:57.110918 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d" Apr 16 19:17:57.111368 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:57.111340 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:17:57.111475 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:57.111450 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvp6d" podUID="0fa55098-1c0e-4cf5-963c-602d47a411cc" Apr 16 19:17:57.140020 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:57.139359 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-123.ec2.internal" event={"ID":"4602511561889ed4b3b1e98e97d43dc5","Type":"ContainerStarted","Data":"35a9dbca1bab80eb2a37cae7a3bef2531033afe6b3d2f2ce1fa301e752444dbd"} Apr 16 19:17:58.145289 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:58.145205 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal" event={"ID":"af059cf061db47d5f95bfb0c454f454b","Type":"ContainerStarted","Data":"0ecc9c40a259f38f6b190b8eb5ebed11271d412d17f478967e419fd8152f1312"} Apr 16 19:17:58.161161 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:58.161087 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-123.ec2.internal" podStartSLOduration=4.161065658 podStartE2EDuration="4.161065658s" podCreationTimestamp="2026-04-16 19:17:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:17:57.15747205 +0000 UTC m=+4.652062668" watchObservedRunningTime="2026-04-16 19:17:58.161065658 +0000 UTC m=+5.655656277" Apr 16 19:17:58.594472 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:58.594431 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs\") pod \"network-metrics-daemon-lvp6d\" (UID: \"0fa55098-1c0e-4cf5-963c-602d47a411cc\") " pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 
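The durationBeforeRetry values trace the kubelet's per-volume exponential backoff: 500ms on the first failure, then 1s and 2s above, with 4s and 8s following below. Each failed MountVolume.SetUp doubles the wait, which is why the two stuck volumes (metrics-certs and kube-api-access-qkmgw) get retried progressively less often while everything else on the node starts up. A sketch of the doubling schedule; the 500ms start and factor of 2 are visible in the log itself, while the 2m2s ceiling matches the kubelet's exponentialbackoff package as I recall it and should be treated as an assumption:

```go
// Sketch of the doubling retry delay behind "durationBeforeRetry".
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond       // first retry, per the log
	maxDelay := 2*time.Minute + 2*time.Second // assumed cap

	for attempt := 1; attempt <= 10; attempt++ {
		fmt.Printf("attempt %2d: durationBeforeRetry %v\n", attempt, delay)
		delay *= 2 // doubled after every failure
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```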
Apr 16 19:17:58.594661 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:58.594593 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:17:58.594723 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:58.594677 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs podName:0fa55098-1c0e-4cf5-963c-602d47a411cc nodeName:}" failed. No retries permitted until 2026-04-16 19:18:02.594656441 +0000 UTC m=+10.089247058 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs") pod "network-metrics-daemon-lvp6d" (UID: "0fa55098-1c0e-4cf5-963c-602d47a411cc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:17:58.694939 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:58.694901 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkmgw\" (UniqueName: \"kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw\") pod \"network-check-target-kk8zc\" (UID: \"0ef22a96-6828-4636-8255-3aa3eaae036d\") " pod="openshift-network-diagnostics/network-check-target-kk8zc"
Apr 16 19:17:58.695133 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:58.695101 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:17:58.695133 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:58.695121 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:17:58.695133 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:58.695133 2582 projected.go:194] Error preparing data for projected volume kube-api-access-qkmgw for pod openshift-network-diagnostics/network-check-target-kk8zc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:17:58.695321 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:58.695226 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw podName:0ef22a96-6828-4636-8255-3aa3eaae036d nodeName:}" failed. No retries permitted until 2026-04-16 19:18:02.695207944 +0000 UTC m=+10.189798542 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-qkmgw" (UniqueName: "kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw") pod "network-check-target-kk8zc" (UID: "0ef22a96-6828-4636-8255-3aa3eaae036d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:17:59.110748 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:59.110715 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc"
Apr 16 19:17:59.110932 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:17:59.110758 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d"
Apr 16 19:17:59.110932 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:59.110844 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d"
Apr 16 19:17:59.111029 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:17:59.110935 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvp6d" podUID="0fa55098-1c0e-4cf5-963c-602d47a411cc"
Apr 16 19:18:00.149279 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:00.149241 2582 generic.go:358] "Generic (PLEG): container finished" podID="af059cf061db47d5f95bfb0c454f454b" containerID="0ecc9c40a259f38f6b190b8eb5ebed11271d412d17f478967e419fd8152f1312" exitCode=0
Apr 16 19:18:00.149722 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:00.149291 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal" event={"ID":"af059cf061db47d5f95bfb0c454f454b","Type":"ContainerDied","Data":"0ecc9c40a259f38f6b190b8eb5ebed11271d412d17f478967e419fd8152f1312"}
Apr 16 19:18:01.110126 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:01.110091 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc"
Apr 16 19:18:01.110304 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:01.110245 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d"
Apr 16 19:18:01.110379 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:01.110295 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d"
Apr 16 19:18:01.110430 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:01.110391 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvp6d" podUID="0fa55098-1c0e-4cf5-963c-602d47a411cc"
Apr 16 19:18:02.620852 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:02.620810 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs\") pod \"network-metrics-daemon-lvp6d\" (UID: \"0fa55098-1c0e-4cf5-963c-602d47a411cc\") " pod="openshift-multus/network-metrics-daemon-lvp6d"
Apr 16 19:18:02.621376 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:02.621004 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:18:02.621376 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:02.621073 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs podName:0fa55098-1c0e-4cf5-963c-602d47a411cc nodeName:}" failed. No retries permitted until 2026-04-16 19:18:10.621052925 +0000 UTC m=+18.115643541 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs") pod "network-metrics-daemon-lvp6d" (UID: "0fa55098-1c0e-4cf5-963c-602d47a411cc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:18:02.721970 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:02.721905 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkmgw\" (UniqueName: \"kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw\") pod \"network-check-target-kk8zc\" (UID: \"0ef22a96-6828-4636-8255-3aa3eaae036d\") " pod="openshift-network-diagnostics/network-check-target-kk8zc"
Apr 16 19:18:02.722142 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:02.722054 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:18:02.722142 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:02.722072 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:18:02.722142 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:02.722084 2582 projected.go:194] Error preparing data for projected volume kube-api-access-qkmgw for pod openshift-network-diagnostics/network-check-target-kk8zc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:18:02.722142 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:02.722162 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw podName:0ef22a96-6828-4636-8255-3aa3eaae036d nodeName:}" failed. No retries permitted until 2026-04-16 19:18:10.72212948 +0000 UTC m=+18.216720087 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-qkmgw" (UniqueName: "kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw") pod "network-check-target-kk8zc" (UID: "0ef22a96-6828-4636-8255-3aa3eaae036d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:18:03.111695 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:03.111606 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc"
Apr 16 19:18:03.111870 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:03.111719 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d"
Apr 16 19:18:03.111870 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:03.111804 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d"
Apr 16 19:18:03.112042 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:03.111926 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvp6d" podUID="0fa55098-1c0e-4cf5-963c-602d47a411cc"
Apr 16 19:18:05.110434 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:05.110190 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d"
Apr 16 19:18:05.111305 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:05.110545 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvp6d" podUID="0fa55098-1c0e-4cf5-963c-602d47a411cc"
Apr 16 19:18:05.111305 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:05.110210 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc"
Apr 16 19:18:05.111305 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:05.110749 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d"
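Note which pods are stuck: everything started so far (multus, ovnkube-node, tuned, the CSI driver) runs on the host network, while network-check-target and network-metrics-daemon need a pod-network sandbox. The kubelet will not create those sandboxes while the runtime reports NetworkReady=false, and the runtime reports that until ovnkube-node writes a CNI config into /etc/kubernetes/cni/net.d/. A minimal sketch of that readiness condition; the directory is quoted from the error above, while the file extensions are the ones CNI's libcni loads and are an assumption here:

```go
// Sketch of the check implied by "no CNI configuration file in
// /etc/kubernetes/cni/net.d/": network readiness flips once any
// CNI config file appears in the runtime's configured directory.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func cniConfigPresent(dir string) bool {
	for _, pattern := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, err := filepath.Glob(filepath.Join(dir, pattern))
		if err == nil && len(matches) > 0 {
			return true
		}
	}
	return false
}

func main() {
	dir := "/etc/kubernetes/cni/net.d"
	if cniConfigPresent(dir) {
		fmt.Println("NetworkReady=true: CNI configuration found in", dir)
	} else {
		fmt.Println("NetworkReady=false: no CNI configuration file in", dir)
		os.Exit(1)
	}
}
```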
pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d" Apr 16 19:18:05.160076 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:05.160019 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-t278p" event={"ID":"d0fe97f4-07e6-4acf-bc9c-a809afa706ad","Type":"ContainerStarted","Data":"642eb4b1d4cd43d1864f36ba87e56dfe573e9d0ba8ab84d23fc5021c95a6c1de"} Apr 16 19:18:05.162436 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:05.162402 2582 generic.go:358] "Generic (PLEG): container finished" podID="fb965cc4-1192-4694-81d4-b4802f0b6e56" containerID="79869bb2eaa66294cb785adfddcf23bdb626be95c9c781aa6ca74d02f365bcdd" exitCode=0 Apr 16 19:18:05.162707 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:05.162667 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j5vpx" event={"ID":"fb965cc4-1192-4694-81d4-b4802f0b6e56","Type":"ContainerDied","Data":"79869bb2eaa66294cb785adfddcf23bdb626be95c9c781aa6ca74d02f365bcdd"} Apr 16 19:18:05.164754 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:05.164732 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-lfd4r" event={"ID":"d0b93718-99a8-48ec-8713-62b20201de35","Type":"ContainerStarted","Data":"f871c71579df28e797214edb7f26ee183041b7dd5b7d1983403879e4806c39b4"} Apr 16 19:18:05.168981 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:05.168947 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal" event={"ID":"af059cf061db47d5f95bfb0c454f454b","Type":"ContainerStarted","Data":"9b2f6ea31fdc299c3c5236fb14880638775c389652164e0f6254a998e2fd207b"} Apr 16 19:18:05.171311 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:05.171288 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mgz4l" event={"ID":"7caf9f3a-4884-4e15-b154-262d7a60b314","Type":"ContainerStarted","Data":"c973d058b61dadc804fe4eb9769b9f69995d7f5b4c3818e57558a33936c67fcc"} Apr 16 19:18:05.173214 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:05.173180 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bmngp" event={"ID":"97d2de57-ec6a-4f59-985c-24aea83be3fd","Type":"ContainerStarted","Data":"133e38abba03c6b0af8a2295c3666c756912446034fa7c31d820cac84a85ddf5"} Apr 16 19:18:05.174775 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:05.174751 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" event={"ID":"e449e07d-bc0e-4d5f-878f-d0f6299e1791","Type":"ContainerStarted","Data":"68be00a7df8bef5f426658a99d17be5c8fb50cf4025bc3279aea0355ce0cc91c"} Apr 16 19:18:05.176512 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:05.176449 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-t278p" podStartSLOduration=3.304388184 podStartE2EDuration="12.176422941s" podCreationTimestamp="2026-04-16 19:17:53 +0000 UTC" firstStartedPulling="2026-04-16 19:17:55.321711438 +0000 UTC m=+2.816302037" lastFinishedPulling="2026-04-16 19:18:04.193746182 +0000 UTC m=+11.688336794" observedRunningTime="2026-04-16 19:18:05.175518464 +0000 UTC m=+12.670109082" watchObservedRunningTime="2026-04-16 19:18:05.176422941 +0000 UTC m=+12.671013557" Apr 16 19:18:05.191335 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:05.190662 2582 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mgz4l" podStartSLOduration=3.335220286 podStartE2EDuration="12.19064706s" podCreationTimestamp="2026-04-16 19:17:53 +0000 UTC" firstStartedPulling="2026-04-16 19:17:55.325886674 +0000 UTC m=+2.820477282" lastFinishedPulling="2026-04-16 19:18:04.181313446 +0000 UTC m=+11.675904056" observedRunningTime="2026-04-16 19:18:05.190069689 +0000 UTC m=+12.684660311" watchObservedRunningTime="2026-04-16 19:18:05.19064706 +0000 UTC m=+12.685237669" Apr 16 19:18:05.231614 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:05.231560 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bmngp" podStartSLOduration=3.368232169 podStartE2EDuration="12.231543469s" podCreationTimestamp="2026-04-16 19:17:53 +0000 UTC" firstStartedPulling="2026-04-16 19:17:55.281555839 +0000 UTC m=+2.776146435" lastFinishedPulling="2026-04-16 19:18:04.144867134 +0000 UTC m=+11.639457735" observedRunningTime="2026-04-16 19:18:05.210105656 +0000 UTC m=+12.704696275" watchObservedRunningTime="2026-04-16 19:18:05.231543469 +0000 UTC m=+12.726134090" Apr 16 19:18:05.250398 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:05.250350 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-123.ec2.internal" podStartSLOduration=11.250334963 podStartE2EDuration="11.250334963s" podCreationTimestamp="2026-04-16 19:17:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:18:05.249552707 +0000 UTC m=+12.744143325" watchObservedRunningTime="2026-04-16 19:18:05.250334963 +0000 UTC m=+12.744925581" Apr 16 19:18:05.771612 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:05.771467 2582 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 19:18:06.019324 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:06.019216 2582 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T19:18:05.771608253Z","UUID":"d6411887-5516-4a0a-abfe-7dc48fad4690","Handler":null,"Name":"","Endpoint":""} Apr 16 19:18:06.022683 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:06.022616 2582 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 19:18:06.022683 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:06.022643 2582 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 19:18:06.179625 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:06.179576 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" event={"ID":"e449e07d-bc0e-4d5f-878f-d0f6299e1791","Type":"ContainerStarted","Data":"f5dc3f2de897c49d26b147be43c090bba420b65488582f49badeacd419792fca"} Apr 16 19:18:06.182112 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:06.181559 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7ddkt" 
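The two durations in the startup-latency lines are related by image-pull time: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally subtracts the pull window (lastFinishedPulling minus firstStartedPulling). For tuned-t278p above, 12.176s end-to-end minus an 8.872s pull leaves the reported 3.304s. A worked check with the log's own timestamps; this sketch parses wall-clock values, so the last few nanoseconds differ from the log, which subtracts the monotonic m=+ readings:

```go
// Worked example of podStartE2EDuration vs podStartSLOduration
// using the tuned-t278p entry from the log.
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2026-04-16 19:17:53 +0000 UTC")
	firstPull := mustParse("2026-04-16 19:17:55.321711438 +0000 UTC")
	lastPull := mustParse("2026-04-16 19:18:04.193746182 +0000 UTC")
	running := mustParse("2026-04-16 19:18:05.176422941 +0000 UTC") // watchObservedRunningTime

	e2e := running.Sub(created)          // 12.176422941s
	pull := lastPull.Sub(firstPull)      // ~8.872s spent pulling images
	slo := e2e - pull                    // ~3.304s, matching podStartSLOduration

	fmt.Println("podStartE2EDuration:", e2e)
	fmt.Println("podStartSLOduration:", slo)
}
```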
event={"ID":"c40d6168-44bb-4a03-9beb-7bf1152625f5","Type":"ContainerStarted","Data":"fc6411a6d85e60e5b134cd71bd79db38b22c30b89ab0b369554778c696077ba0"} Apr 16 19:18:06.194879 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:06.194801 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-lfd4r" podStartSLOduration=4.308133404 podStartE2EDuration="13.194783707s" podCreationTimestamp="2026-04-16 19:17:53 +0000 UTC" firstStartedPulling="2026-04-16 19:17:55.264695159 +0000 UTC m=+2.759285760" lastFinishedPulling="2026-04-16 19:18:04.151345458 +0000 UTC m=+11.645936063" observedRunningTime="2026-04-16 19:18:05.262606097 +0000 UTC m=+12.757196719" watchObservedRunningTime="2026-04-16 19:18:06.194783707 +0000 UTC m=+13.689374326" Apr 16 19:18:06.195244 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:06.195197 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7ddkt" podStartSLOduration=4.347315814 podStartE2EDuration="13.195187107s" podCreationTimestamp="2026-04-16 19:17:53 +0000 UTC" firstStartedPulling="2026-04-16 19:17:55.304003565 +0000 UTC m=+2.798594166" lastFinishedPulling="2026-04-16 19:18:04.151874857 +0000 UTC m=+11.646465459" observedRunningTime="2026-04-16 19:18:06.194779922 +0000 UTC m=+13.689370541" watchObservedRunningTime="2026-04-16 19:18:06.195187107 +0000 UTC m=+13.689777727" Apr 16 19:18:07.110906 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:07.110865 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:18:07.111224 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:07.110997 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d" Apr 16 19:18:07.111224 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:07.111058 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:18:07.111413 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:07.111232 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvp6d" podUID="0fa55098-1c0e-4cf5-963c-602d47a411cc" Apr 16 19:18:09.092384 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:09.092317 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-fg84m"] Apr 16 19:18:09.095646 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:09.095618 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:09.095763 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:09.095690 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fg84m" podUID="27412d9f-8c9a-4ed3-92cb-4002bafb01fa" Apr 16 19:18:09.111215 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:09.110692 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:18:09.111215 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:09.110717 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:18:09.111215 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:09.110802 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvp6d" podUID="0fa55098-1c0e-4cf5-963c-602d47a411cc" Apr 16 19:18:09.111215 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:09.111169 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d" Apr 16 19:18:09.172021 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:09.171992 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-kubelet-config\") pod \"global-pull-secret-syncer-fg84m\" (UID: \"27412d9f-8c9a-4ed3-92cb-4002bafb01fa\") " pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:09.172021 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:09.172023 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret\") pod \"global-pull-secret-syncer-fg84m\" (UID: \"27412d9f-8c9a-4ed3-92cb-4002bafb01fa\") " pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:09.172256 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:09.172046 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-dbus\") pod \"global-pull-secret-syncer-fg84m\" (UID: \"27412d9f-8c9a-4ed3-92cb-4002bafb01fa\") " pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:09.272603 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:09.272566 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-kubelet-config\") pod \"global-pull-secret-syncer-fg84m\" (UID: \"27412d9f-8c9a-4ed3-92cb-4002bafb01fa\") " pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:09.272603 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:09.272609 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret\") pod \"global-pull-secret-syncer-fg84m\" (UID: \"27412d9f-8c9a-4ed3-92cb-4002bafb01fa\") " pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:09.272825 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:09.272642 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-dbus\") pod \"global-pull-secret-syncer-fg84m\" (UID: \"27412d9f-8c9a-4ed3-92cb-4002bafb01fa\") " pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:09.272825 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:09.272654 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-kubelet-config\") pod \"global-pull-secret-syncer-fg84m\" (UID: \"27412d9f-8c9a-4ed3-92cb-4002bafb01fa\") " pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:09.272825 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:09.272774 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:18:09.272971 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:09.272836 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret podName:27412d9f-8c9a-4ed3-92cb-4002bafb01fa nodeName:}" failed. 
No retries permitted until 2026-04-16 19:18:09.772817062 +0000 UTC m=+17.267407660 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret") pod "global-pull-secret-syncer-fg84m" (UID: "27412d9f-8c9a-4ed3-92cb-4002bafb01fa") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:18:09.272971 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:09.272836 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-dbus\") pod \"global-pull-secret-syncer-fg84m\" (UID: \"27412d9f-8c9a-4ed3-92cb-4002bafb01fa\") " pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:09.776055 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:09.776020 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret\") pod \"global-pull-secret-syncer-fg84m\" (UID: \"27412d9f-8c9a-4ed3-92cb-4002bafb01fa\") " pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:09.776242 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:09.776166 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:18:09.776242 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:09.776236 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret podName:27412d9f-8c9a-4ed3-92cb-4002bafb01fa nodeName:}" failed. No retries permitted until 2026-04-16 19:18:10.776215529 +0000 UTC m=+18.270806127 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret") pod "global-pull-secret-syncer-fg84m" (UID: "27412d9f-8c9a-4ed3-92cb-4002bafb01fa") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:18:10.058714 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:10.058633 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-lfd4r" Apr 16 19:18:10.059302 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:10.059276 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-lfd4r" Apr 16 19:18:10.683805 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:10.683778 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs\") pod \"network-metrics-daemon-lvp6d\" (UID: \"0fa55098-1c0e-4cf5-963c-602d47a411cc\") " pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:18:10.684200 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:10.683910 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:10.684200 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:10.683962 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs podName:0fa55098-1c0e-4cf5-963c-602d47a411cc nodeName:}" failed. 
No retries permitted until 2026-04-16 19:18:26.68394802 +0000 UTC m=+34.178538619 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs") pod "network-metrics-daemon-lvp6d" (UID: "0fa55098-1c0e-4cf5-963c-602d47a411cc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:10.784168 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:10.784123 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkmgw\" (UniqueName: \"kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw\") pod \"network-check-target-kk8zc\" (UID: \"0ef22a96-6828-4636-8255-3aa3eaae036d\") " pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:18:10.784339 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:10.784208 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret\") pod \"global-pull-secret-syncer-fg84m\" (UID: \"27412d9f-8c9a-4ed3-92cb-4002bafb01fa\") " pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:10.784339 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:10.784301 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:18:10.784339 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:10.784324 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:18:10.784339 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:10.784324 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:18:10.784339 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:10.784339 2582 projected.go:194] Error preparing data for projected volume kube-api-access-qkmgw for pod openshift-network-diagnostics/network-check-target-kk8zc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:10.784547 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:10.784382 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret podName:27412d9f-8c9a-4ed3-92cb-4002bafb01fa nodeName:}" failed. No retries permitted until 2026-04-16 19:18:12.784364473 +0000 UTC m=+20.278955069 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret") pod "global-pull-secret-syncer-fg84m" (UID: "27412d9f-8c9a-4ed3-92cb-4002bafb01fa") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:18:10.784547 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:10.784400 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw podName:0ef22a96-6828-4636-8255-3aa3eaae036d nodeName:}" failed. No retries permitted until 2026-04-16 19:18:26.784391078 +0000 UTC m=+34.278981676 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-qkmgw" (UniqueName: "kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw") pod "network-check-target-kk8zc" (UID: "0ef22a96-6828-4636-8255-3aa3eaae036d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:11.110169 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:11.110125 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:18:11.110322 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:11.110125 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:18:11.110322 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:11.110265 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d" Apr 16 19:18:11.110322 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:11.110125 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:11.110459 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:11.110348 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvp6d" podUID="0fa55098-1c0e-4cf5-963c-602d47a411cc" Apr 16 19:18:11.110459 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:11.110394 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fg84m" podUID="27412d9f-8c9a-4ed3-92cb-4002bafb01fa" Apr 16 19:18:11.303209 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:11.303180 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-lfd4r" Apr 16 19:18:11.303376 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:11.303283 2582 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 19:18:11.303951 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:11.303928 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-lfd4r" Apr 16 19:18:12.800486 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:12.800449 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret\") pod \"global-pull-secret-syncer-fg84m\" (UID: \"27412d9f-8c9a-4ed3-92cb-4002bafb01fa\") " pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:12.800852 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:12.800614 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:18:12.800852 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:12.800692 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret podName:27412d9f-8c9a-4ed3-92cb-4002bafb01fa nodeName:}" failed. No retries permitted until 2026-04-16 19:18:16.800675565 +0000 UTC m=+24.295266180 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret") pod "global-pull-secret-syncer-fg84m" (UID: "27412d9f-8c9a-4ed3-92cb-4002bafb01fa") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:18:13.111105 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:13.111014 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:13.111277 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:13.111113 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:18:13.111277 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:13.111248 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:18:13.111277 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:13.111251 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d" Apr 16 19:18:13.111433 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:13.111338 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fg84m" podUID="27412d9f-8c9a-4ed3-92cb-4002bafb01fa" Apr 16 19:18:13.111433 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:13.111422 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvp6d" podUID="0fa55098-1c0e-4cf5-963c-602d47a411cc" Apr 16 19:18:15.110098 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:15.110060 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:15.110629 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:15.110060 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:18:15.110629 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:15.110205 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fg84m" podUID="27412d9f-8c9a-4ed3-92cb-4002bafb01fa" Apr 16 19:18:15.110629 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:15.110060 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:18:15.110629 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:15.110281 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvp6d" podUID="0fa55098-1c0e-4cf5-963c-602d47a411cc" Apr 16 19:18:15.110629 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:15.110417 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d" Apr 16 19:18:16.830257 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:16.829976 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret\") pod \"global-pull-secret-syncer-fg84m\" (UID: \"27412d9f-8c9a-4ed3-92cb-4002bafb01fa\") " pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:16.831037 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:16.830115 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:18:16.831037 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:16.830394 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret podName:27412d9f-8c9a-4ed3-92cb-4002bafb01fa nodeName:}" failed. No retries permitted until 2026-04-16 19:18:24.830377846 +0000 UTC m=+32.324968445 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret") pod "global-pull-secret-syncer-fg84m" (UID: "27412d9f-8c9a-4ed3-92cb-4002bafb01fa") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:18:17.110818 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:17.110734 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:18:17.110959 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:17.110845 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:18:17.110959 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:17.110876 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:17.110959 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:17.110841 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d" Apr 16 19:18:17.110959 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:17.110947 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvp6d" podUID="0fa55098-1c0e-4cf5-963c-602d47a411cc" Apr 16 19:18:17.111114 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:17.111021 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fg84m" podUID="27412d9f-8c9a-4ed3-92cb-4002bafb01fa" Apr 16 19:18:17.204647 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:17.204616 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" event={"ID":"03e41e43-a8fe-424e-85ea-c86ea5b657e4","Type":"ContainerStarted","Data":"bea108f71dc7c842bdf76bd4ca1f6e7b75332c9a9ee6d9132b484f7fc95f50cb"} Apr 16 19:18:17.204647 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:17.204655 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" event={"ID":"03e41e43-a8fe-424e-85ea-c86ea5b657e4","Type":"ContainerStarted","Data":"9b491ca90bc53d05bd92b569ea72b399d73d0ace1f39304c89df20cea5b5f0ce"} Apr 16 19:18:17.204876 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:17.204671 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" event={"ID":"03e41e43-a8fe-424e-85ea-c86ea5b657e4","Type":"ContainerStarted","Data":"981ccfb797b3a8e4382d4099cc5cd77b6dc753cd6c211bd280fb1bd934834c5a"} Apr 16 19:18:17.204876 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:17.204683 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" event={"ID":"03e41e43-a8fe-424e-85ea-c86ea5b657e4","Type":"ContainerStarted","Data":"923724edf59db206003b8977242a0a76d6f063f4be44c51fea9d77753ecf0c85"} Apr 16 19:18:17.204876 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:17.204696 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" event={"ID":"03e41e43-a8fe-424e-85ea-c86ea5b657e4","Type":"ContainerStarted","Data":"b3d96eadbd9786375200d03b955ece2dbd2657ea6d74a56f2abf4a4d3f8a8c36"} Apr 16 19:18:17.204876 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:17.204710 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" event={"ID":"03e41e43-a8fe-424e-85ea-c86ea5b657e4","Type":"ContainerStarted","Data":"2eb2ee832dc696cd6e80d96111458d2f87ab558623afa26c0fd5a8c83c2f8a40"} Apr 16 19:18:17.205955 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:17.205921 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-s29hv" event={"ID":"ff091581-6d2a-4584-b8a2-9f02cd7c342d","Type":"ContainerStarted","Data":"dbd6329f630f1251f221fbeccb94676f0adc60c88c54b2fea1db3f7a51c1646a"} Apr 16 19:18:17.207343 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:17.207321 2582 generic.go:358] "Generic (PLEG): container finished" podID="fb965cc4-1192-4694-81d4-b4802f0b6e56" containerID="bdcfa32684aa2f8b873e77fc63e245bc61cedac9b7b51b5819a011608f0aef29" exitCode=0 Apr 16 19:18:17.207446 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:17.207386 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j5vpx" event={"ID":"fb965cc4-1192-4694-81d4-b4802f0b6e56","Type":"ContainerDied","Data":"bdcfa32684aa2f8b873e77fc63e245bc61cedac9b7b51b5819a011608f0aef29"} Apr 16 19:18:17.209072 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:17.209050 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" event={"ID":"e449e07d-bc0e-4d5f-878f-d0f6299e1791","Type":"ContainerStarted","Data":"6f11a1fc47d7a1aba85a01092804f78b8faa885557e0893990dbc47a3bffdedd"} Apr 16 19:18:17.237082 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:17.237026 2582 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-multus/multus-s29hv" podStartSLOduration=3.033160466 podStartE2EDuration="24.237010926s" podCreationTimestamp="2026-04-16 19:17:53 +0000 UTC" firstStartedPulling="2026-04-16 19:17:55.293955896 +0000 UTC m=+2.788546501" lastFinishedPulling="2026-04-16 19:18:16.497806365 +0000 UTC m=+23.992396961" observedRunningTime="2026-04-16 19:18:17.236447179 +0000 UTC m=+24.731037789" watchObservedRunningTime="2026-04-16 19:18:17.237010926 +0000 UTC m=+24.731601542" Apr 16 19:18:17.296966 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:17.296916 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-swbsn" podStartSLOduration=3.020557367 podStartE2EDuration="24.296900237s" podCreationTimestamp="2026-04-16 19:17:53 +0000 UTC" firstStartedPulling="2026-04-16 19:17:55.273819237 +0000 UTC m=+2.768409849" lastFinishedPulling="2026-04-16 19:18:16.550162108 +0000 UTC m=+24.044752719" observedRunningTime="2026-04-16 19:18:17.265842331 +0000 UTC m=+24.760432949" watchObservedRunningTime="2026-04-16 19:18:17.296900237 +0000 UTC m=+24.791490887" Apr 16 19:18:19.111020 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:19.110826 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:19.111495 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:19.110853 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:18:19.111495 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:19.111130 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fg84m" podUID="27412d9f-8c9a-4ed3-92cb-4002bafb01fa" Apr 16 19:18:19.111495 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:19.110871 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:18:19.111495 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:19.111227 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvp6d" podUID="0fa55098-1c0e-4cf5-963c-602d47a411cc" Apr 16 19:18:19.111495 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:19.111333 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d" Apr 16 19:18:19.217108 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:19.217020 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" event={"ID":"03e41e43-a8fe-424e-85ea-c86ea5b657e4","Type":"ContainerStarted","Data":"9b777475d3d3760ffbf8443cdd4d9cfdb12b66dae40cc61e327a3a56568b63bb"} Apr 16 19:18:19.218692 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:19.218669 2582 generic.go:358] "Generic (PLEG): container finished" podID="fb965cc4-1192-4694-81d4-b4802f0b6e56" containerID="ce15fc41ebb10208aace510f1ca6ee4ded538376b27c4b70b96f4facfb92082b" exitCode=0 Apr 16 19:18:19.218807 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:19.218705 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j5vpx" event={"ID":"fb965cc4-1192-4694-81d4-b4802f0b6e56","Type":"ContainerDied","Data":"ce15fc41ebb10208aace510f1ca6ee4ded538376b27c4b70b96f4facfb92082b"} Apr 16 19:18:20.223336 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:20.223309 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j5vpx" event={"ID":"fb965cc4-1192-4694-81d4-b4802f0b6e56","Type":"ContainerStarted","Data":"e361e9014470b3e9dad82614486f9aae85aa35432b48ed11466f106d3249ba34"} Apr 16 19:18:21.110931 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:21.110867 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:18:21.111213 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:21.110867 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:21.111213 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:21.110993 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvp6d" podUID="0fa55098-1c0e-4cf5-963c-602d47a411cc" Apr 16 19:18:21.111213 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:21.111062 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fg84m" podUID="27412d9f-8c9a-4ed3-92cb-4002bafb01fa" Apr 16 19:18:21.111213 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:21.110867 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:18:21.111213 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:21.111176 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d" Apr 16 19:18:21.228196 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:21.228146 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" event={"ID":"03e41e43-a8fe-424e-85ea-c86ea5b657e4","Type":"ContainerStarted","Data":"402a7812baa88ad78d2194248f734db81abce5dcca0c775938437e7c5842f646"} Apr 16 19:18:21.228641 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:21.228493 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:18:21.228641 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:21.228520 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:18:21.230180 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:21.230137 2582 generic.go:358] "Generic (PLEG): container finished" podID="fb965cc4-1192-4694-81d4-b4802f0b6e56" containerID="e361e9014470b3e9dad82614486f9aae85aa35432b48ed11466f106d3249ba34" exitCode=0 Apr 16 19:18:21.230270 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:21.230178 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j5vpx" event={"ID":"fb965cc4-1192-4694-81d4-b4802f0b6e56","Type":"ContainerDied","Data":"e361e9014470b3e9dad82614486f9aae85aa35432b48ed11466f106d3249ba34"} Apr 16 19:18:21.243595 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:21.243577 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:18:21.260024 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:21.259984 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" podStartSLOduration=7.269520563 podStartE2EDuration="28.259973714s" podCreationTimestamp="2026-04-16 19:17:53 +0000 UTC" firstStartedPulling="2026-04-16 19:17:55.311435808 +0000 UTC m=+2.806026409" lastFinishedPulling="2026-04-16 19:18:16.301888951 +0000 UTC m=+23.796479560" observedRunningTime="2026-04-16 19:18:21.258890052 +0000 UTC m=+28.753480668" watchObservedRunningTime="2026-04-16 19:18:21.259973714 +0000 UTC m=+28.754564330" Apr 16 19:18:22.233338 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:22.233309 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:18:22.250199 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:22.250170 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq" Apr 16 19:18:23.112217 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:23.112183 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:18:23.112217 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:23.112218 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:23.112464 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:23.112298 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:18:23.112464 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:23.112318 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d" Apr 16 19:18:23.112464 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:23.112386 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvp6d" podUID="0fa55098-1c0e-4cf5-963c-602d47a411cc" Apr 16 19:18:23.112464 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:23.112449 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fg84m" podUID="27412d9f-8c9a-4ed3-92cb-4002bafb01fa" Apr 16 19:18:23.273386 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:23.273350 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fg84m"] Apr 16 19:18:23.273827 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:23.273451 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:23.273827 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:23.273544 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fg84m" podUID="27412d9f-8c9a-4ed3-92cb-4002bafb01fa" Apr 16 19:18:23.276653 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:23.276617 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kk8zc"] Apr 16 19:18:23.276773 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:23.276701 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:18:23.276842 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:23.276799 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d" Apr 16 19:18:23.277235 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:23.277208 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lvp6d"] Apr 16 19:18:23.277324 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:23.277299 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:18:23.277436 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:23.277419 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvp6d" podUID="0fa55098-1c0e-4cf5-963c-602d47a411cc" Apr 16 19:18:24.896549 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:24.896507 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret\") pod \"global-pull-secret-syncer-fg84m\" (UID: \"27412d9f-8c9a-4ed3-92cb-4002bafb01fa\") " pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:24.896985 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:24.896666 2582 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:18:24.896985 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:24.896740 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret podName:27412d9f-8c9a-4ed3-92cb-4002bafb01fa nodeName:}" failed. No retries permitted until 2026-04-16 19:18:40.896718125 +0000 UTC m=+48.391308720 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret") pod "global-pull-secret-syncer-fg84m" (UID: "27412d9f-8c9a-4ed3-92cb-4002bafb01fa") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:18:25.110334 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:25.110307 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:25.110334 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:25.110332 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:18:25.110479 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:25.110307 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:18:25.110479 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:25.110412 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fg84m" podUID="27412d9f-8c9a-4ed3-92cb-4002bafb01fa" Apr 16 19:18:25.110479 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:25.110462 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d" Apr 16 19:18:25.110586 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:25.110557 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvp6d" podUID="0fa55098-1c0e-4cf5-963c-602d47a411cc" Apr 16 19:18:26.710933 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:26.710745 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs\") pod \"network-metrics-daemon-lvp6d\" (UID: \"0fa55098-1c0e-4cf5-963c-602d47a411cc\") " pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:18:26.711357 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:26.710911 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:26.711357 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:26.711034 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs podName:0fa55098-1c0e-4cf5-963c-602d47a411cc nodeName:}" failed. No retries permitted until 2026-04-16 19:18:58.711020792 +0000 UTC m=+66.205611390 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs") pod "network-metrics-daemon-lvp6d" (UID: "0fa55098-1c0e-4cf5-963c-602d47a411cc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:18:26.811888 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:26.811857 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkmgw\" (UniqueName: \"kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw\") pod \"network-check-target-kk8zc\" (UID: \"0ef22a96-6828-4636-8255-3aa3eaae036d\") " pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:18:26.812050 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:26.811974 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:18:26.812050 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:26.811987 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:18:26.812050 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:26.811995 2582 projected.go:194] Error preparing data for projected volume kube-api-access-qkmgw for pod openshift-network-diagnostics/network-check-target-kk8zc: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:26.812050 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:26.812043 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw podName:0ef22a96-6828-4636-8255-3aa3eaae036d nodeName:}" failed. No retries permitted until 2026-04-16 19:18:58.812030397 +0000 UTC m=+66.306620992 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-qkmgw" (UniqueName: "kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw") pod "network-check-target-kk8zc" (UID: "0ef22a96-6828-4636-8255-3aa3eaae036d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:18:27.110322 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:27.110287 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:27.110561 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:27.110287 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:18:27.110561 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:27.110413 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fg84m" podUID="27412d9f-8c9a-4ed3-92cb-4002bafb01fa" Apr 16 19:18:27.110561 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:27.110287 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:18:27.110561 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:27.110496 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d" Apr 16 19:18:27.110773 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:27.110580 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvp6d" podUID="0fa55098-1c0e-4cf5-963c-602d47a411cc" Apr 16 19:18:29.109987 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.109937 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:18:29.109987 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.109971 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:29.110657 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.110055 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:18:29.110657 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:29.110068 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-kk8zc" podUID="0ef22a96-6828-4636-8255-3aa3eaae036d" Apr 16 19:18:29.110657 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:29.110190 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvp6d" podUID="0fa55098-1c0e-4cf5-963c-602d47a411cc" Apr 16 19:18:29.110657 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:29.110276 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fg84m" podUID="27412d9f-8c9a-4ed3-92cb-4002bafb01fa" Apr 16 19:18:29.367936 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.367911 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-123.ec2.internal" event="NodeReady" Apr 16 19:18:29.368127 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.368059 2582 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 19:18:29.400416 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.400383 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-86db68fc76-grwtr"] Apr 16 19:18:29.434954 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.434928 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86db68fc76-grwtr"] Apr 16 19:18:29.434954 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.434957 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8xgc6"] Apr 16 19:18:29.435206 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.435111 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.437422 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.437380 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 19:18:29.437678 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.437655 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 19:18:29.437678 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.437675 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 19:18:29.437810 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.437660 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-bjmd7\"" Apr 16 19:18:29.443277 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.443254 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 19:18:29.466817 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.466791 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dp4cw"] Apr 16 19:18:29.466976 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.466958 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-8xgc6" Apr 16 19:18:29.469346 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.469286 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 19:18:29.469346 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.469300 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-n84tw\"" Apr 16 19:18:29.469346 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.469290 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 19:18:29.487612 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.487589 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dp4cw"] Apr 16 19:18:29.487734 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.487617 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8xgc6"] Apr 16 19:18:29.487782 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.487736 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dp4cw" Apr 16 19:18:29.489906 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.489887 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 19:18:29.490003 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.489948 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9bx8r\"" Apr 16 19:18:29.490003 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.489962 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 19:18:29.491176 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.490406 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 19:18:29.530894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.530864 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gks9j\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-kube-api-access-gks9j\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.531082 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.530916 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fff8270c-4771-4655-8abd-7341281f3173-image-registry-private-configuration\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.531082 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.530965 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fff8270c-4771-4655-8abd-7341281f3173-registry-certificates\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.531082 
ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.530982 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fff8270c-4771-4655-8abd-7341281f3173-trusted-ca\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.531082 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.531005 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fff8270c-4771-4655-8abd-7341281f3173-installation-pull-secrets\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.531082 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.531021 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-bound-sa-token\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.531082 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.531042 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.531082 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.531066 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fff8270c-4771-4655-8abd-7341281f3173-ca-trust-extracted\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.632498 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.632399 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-288jr\" (UniqueName: \"kubernetes.io/projected/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-kube-api-access-288jr\") pod \"ingress-canary-dp4cw\" (UID: \"d82ed6e1-d7aa-4d47-bcb6-f4539431d578\") " pod="openshift-ingress-canary/ingress-canary-dp4cw" Apr 16 19:18:29.632498 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.632463 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gks9j\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-kube-api-access-gks9j\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.632498 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.632500 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert\") pod \"ingress-canary-dp4cw\" (UID: \"d82ed6e1-d7aa-4d47-bcb6-f4539431d578\") " pod="openshift-ingress-canary/ingress-canary-dp4cw" Apr 16 19:18:29.632759 
ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.632520 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/513c0bb7-f253-4f0d-bc14-1d473d560c39-tmp-dir\") pod \"dns-default-8xgc6\" (UID: \"513c0bb7-f253-4f0d-bc14-1d473d560c39\") " pod="openshift-dns/dns-default-8xgc6" Apr 16 19:18:29.632759 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.632544 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fff8270c-4771-4655-8abd-7341281f3173-image-registry-private-configuration\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.632759 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.632611 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fff8270c-4771-4655-8abd-7341281f3173-registry-certificates\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.632759 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.632686 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls\") pod \"dns-default-8xgc6\" (UID: \"513c0bb7-f253-4f0d-bc14-1d473d560c39\") " pod="openshift-dns/dns-default-8xgc6" Apr 16 19:18:29.632759 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.632720 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk8jt\" (UniqueName: \"kubernetes.io/projected/513c0bb7-f253-4f0d-bc14-1d473d560c39-kube-api-access-mk8jt\") pod \"dns-default-8xgc6\" (UID: \"513c0bb7-f253-4f0d-bc14-1d473d560c39\") " pod="openshift-dns/dns-default-8xgc6" Apr 16 19:18:29.632759 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.632760 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fff8270c-4771-4655-8abd-7341281f3173-trusted-ca\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.633024 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.632787 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fff8270c-4771-4655-8abd-7341281f3173-installation-pull-secrets\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.633024 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.632813 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-bound-sa-token\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.633024 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.632855 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/513c0bb7-f253-4f0d-bc14-1d473d560c39-config-volume\") pod \"dns-default-8xgc6\" (UID: \"513c0bb7-f253-4f0d-bc14-1d473d560c39\") " pod="openshift-dns/dns-default-8xgc6" Apr 16 19:18:29.633024 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.632895 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.633024 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.632961 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fff8270c-4771-4655-8abd-7341281f3173-ca-trust-extracted\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.633258 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:29.633198 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:18:29.633258 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:29.633213 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86db68fc76-grwtr: secret "image-registry-tls" not found Apr 16 19:18:29.633359 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:29.633272 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls podName:fff8270c-4771-4655-8abd-7341281f3173 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:30.133256834 +0000 UTC m=+37.627847434 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls") pod "image-registry-86db68fc76-grwtr" (UID: "fff8270c-4771-4655-8abd-7341281f3173") : secret "image-registry-tls" not found Apr 16 19:18:29.633418 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.633382 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fff8270c-4771-4655-8abd-7341281f3173-ca-trust-extracted\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.633418 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.633403 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fff8270c-4771-4655-8abd-7341281f3173-registry-certificates\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.634048 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.634022 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fff8270c-4771-4655-8abd-7341281f3173-trusted-ca\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.636768 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.636743 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fff8270c-4771-4655-8abd-7341281f3173-image-registry-private-configuration\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.636897 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.636765 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fff8270c-4771-4655-8abd-7341281f3173-installation-pull-secrets\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.644031 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.644001 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-bound-sa-token\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.644136 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.644035 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gks9j\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-kube-api-access-gks9j\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:29.734257 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.734222 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/513c0bb7-f253-4f0d-bc14-1d473d560c39-config-volume\") pod \"dns-default-8xgc6\" (UID: \"513c0bb7-f253-4f0d-bc14-1d473d560c39\") " pod="openshift-dns/dns-default-8xgc6" Apr 16 19:18:29.734445 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.734310 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-288jr\" (UniqueName: \"kubernetes.io/projected/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-kube-api-access-288jr\") pod \"ingress-canary-dp4cw\" (UID: \"d82ed6e1-d7aa-4d47-bcb6-f4539431d578\") " pod="openshift-ingress-canary/ingress-canary-dp4cw" Apr 16 19:18:29.734445 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.734341 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert\") pod \"ingress-canary-dp4cw\" (UID: \"d82ed6e1-d7aa-4d47-bcb6-f4539431d578\") " pod="openshift-ingress-canary/ingress-canary-dp4cw" Apr 16 19:18:29.734445 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.734368 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/513c0bb7-f253-4f0d-bc14-1d473d560c39-tmp-dir\") pod \"dns-default-8xgc6\" (UID: \"513c0bb7-f253-4f0d-bc14-1d473d560c39\") " pod="openshift-dns/dns-default-8xgc6" Apr 16 19:18:29.734445 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.734427 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls\") pod \"dns-default-8xgc6\" (UID: \"513c0bb7-f253-4f0d-bc14-1d473d560c39\") " pod="openshift-dns/dns-default-8xgc6" Apr 16 19:18:29.734662 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.734450 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mk8jt\" (UniqueName: \"kubernetes.io/projected/513c0bb7-f253-4f0d-bc14-1d473d560c39-kube-api-access-mk8jt\") pod \"dns-default-8xgc6\" (UID: \"513c0bb7-f253-4f0d-bc14-1d473d560c39\") " pod="openshift-dns/dns-default-8xgc6" Apr 16 19:18:29.734858 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.734835 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/513c0bb7-f253-4f0d-bc14-1d473d560c39-config-volume\") pod \"dns-default-8xgc6\" (UID: \"513c0bb7-f253-4f0d-bc14-1d473d560c39\") " pod="openshift-dns/dns-default-8xgc6" Apr 16 19:18:29.735101 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:29.735077 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:18:29.735199 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.735107 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/513c0bb7-f253-4f0d-bc14-1d473d560c39-tmp-dir\") pod \"dns-default-8xgc6\" (UID: \"513c0bb7-f253-4f0d-bc14-1d473d560c39\") " pod="openshift-dns/dns-default-8xgc6" Apr 16 19:18:29.735199 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:29.735178 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls podName:513c0bb7-f253-4f0d-bc14-1d473d560c39 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:30.235144755 +0000 UTC m=+37.729735372 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls") pod "dns-default-8xgc6" (UID: "513c0bb7-f253-4f0d-bc14-1d473d560c39") : secret "dns-default-metrics-tls" not found Apr 16 19:18:29.735199 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:29.735183 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:18:29.735368 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:29.735245 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert podName:d82ed6e1-d7aa-4d47-bcb6-f4539431d578 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:30.235227687 +0000 UTC m=+37.729818296 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert") pod "ingress-canary-dp4cw" (UID: "d82ed6e1-d7aa-4d47-bcb6-f4539431d578") : secret "canary-serving-cert" not found Apr 16 19:18:29.743565 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.743539 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-288jr\" (UniqueName: \"kubernetes.io/projected/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-kube-api-access-288jr\") pod \"ingress-canary-dp4cw\" (UID: \"d82ed6e1-d7aa-4d47-bcb6-f4539431d578\") " pod="openshift-ingress-canary/ingress-canary-dp4cw" Apr 16 19:18:29.757948 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:29.757922 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk8jt\" (UniqueName: \"kubernetes.io/projected/513c0bb7-f253-4f0d-bc14-1d473d560c39-kube-api-access-mk8jt\") pod \"dns-default-8xgc6\" (UID: \"513c0bb7-f253-4f0d-bc14-1d473d560c39\") " pod="openshift-dns/dns-default-8xgc6" Apr 16 19:18:30.137536 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:30.137497 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:30.137985 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:30.137657 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:18:30.137985 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:30.137677 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86db68fc76-grwtr: secret "image-registry-tls" not found Apr 16 19:18:30.137985 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:30.137732 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls podName:fff8270c-4771-4655-8abd-7341281f3173 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:31.137717156 +0000 UTC m=+38.632307751 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls") pod "image-registry-86db68fc76-grwtr" (UID: "fff8270c-4771-4655-8abd-7341281f3173") : secret "image-registry-tls" not found Apr 16 19:18:30.238535 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:30.238502 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls\") pod \"dns-default-8xgc6\" (UID: \"513c0bb7-f253-4f0d-bc14-1d473d560c39\") " pod="openshift-dns/dns-default-8xgc6" Apr 16 19:18:30.238724 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:30.238572 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert\") pod \"ingress-canary-dp4cw\" (UID: \"d82ed6e1-d7aa-4d47-bcb6-f4539431d578\") " pod="openshift-ingress-canary/ingress-canary-dp4cw" Apr 16 19:18:30.238724 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:30.238647 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:18:30.238724 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:30.238674 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:18:30.238724 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:30.238707 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls podName:513c0bb7-f253-4f0d-bc14-1d473d560c39 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:31.238693196 +0000 UTC m=+38.733283791 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls") pod "dns-default-8xgc6" (UID: "513c0bb7-f253-4f0d-bc14-1d473d560c39") : secret "dns-default-metrics-tls" not found Apr 16 19:18:30.238724 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:30.238723 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert podName:d82ed6e1-d7aa-4d47-bcb6-f4539431d578 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:31.238711939 +0000 UTC m=+38.733302533 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert") pod "ingress-canary-dp4cw" (UID: "d82ed6e1-d7aa-4d47-bcb6-f4539431d578") : secret "canary-serving-cert" not found Apr 16 19:18:31.110319 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:31.110275 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:18:31.110513 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:31.110405 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:18:31.110513 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:31.110447 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fg84m" Apr 16 19:18:31.113198 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:31.113178 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t5lg8\"" Apr 16 19:18:31.113960 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:31.113927 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-s2294\"" Apr 16 19:18:31.113960 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:31.113951 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 19:18:31.114104 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:31.113975 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 19:18:31.114104 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:31.114001 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 19:18:31.114104 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:31.114072 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 19:18:31.145902 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:31.145880 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:31.146322 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:31.145983 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:18:31.146322 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:31.145994 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86db68fc76-grwtr: secret "image-registry-tls" not found Apr 16 19:18:31.146322 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:31.146046 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls podName:fff8270c-4771-4655-8abd-7341281f3173 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:33.146032981 +0000 UTC m=+40.640623577 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls") pod "image-registry-86db68fc76-grwtr" (UID: "fff8270c-4771-4655-8abd-7341281f3173") : secret "image-registry-tls" not found Apr 16 19:18:31.246945 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:31.246915 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls\") pod \"dns-default-8xgc6\" (UID: \"513c0bb7-f253-4f0d-bc14-1d473d560c39\") " pod="openshift-dns/dns-default-8xgc6" Apr 16 19:18:31.247092 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:31.247026 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert\") pod \"ingress-canary-dp4cw\" (UID: \"d82ed6e1-d7aa-4d47-bcb6-f4539431d578\") " pod="openshift-ingress-canary/ingress-canary-dp4cw" Apr 16 19:18:31.247092 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:31.247063 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:18:31.247192 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:31.247117 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls podName:513c0bb7-f253-4f0d-bc14-1d473d560c39 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:33.247100998 +0000 UTC m=+40.741691607 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls") pod "dns-default-8xgc6" (UID: "513c0bb7-f253-4f0d-bc14-1d473d560c39") : secret "dns-default-metrics-tls" not found Apr 16 19:18:31.247192 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:31.247131 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:18:31.247270 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:31.247208 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert podName:d82ed6e1-d7aa-4d47-bcb6-f4539431d578 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:33.247191998 +0000 UTC m=+40.741782597 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert") pod "ingress-canary-dp4cw" (UID: "d82ed6e1-d7aa-4d47-bcb6-f4539431d578") : secret "canary-serving-cert" not found Apr 16 19:18:31.251323 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:31.251301 2582 generic.go:358] "Generic (PLEG): container finished" podID="fb965cc4-1192-4694-81d4-b4802f0b6e56" containerID="49578713ee6805b785fd82db96d20a21d92927937f7f506b297d193bb3575a37" exitCode=0 Apr 16 19:18:31.251426 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:31.251345 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j5vpx" event={"ID":"fb965cc4-1192-4694-81d4-b4802f0b6e56","Type":"ContainerDied","Data":"49578713ee6805b785fd82db96d20a21d92927937f7f506b297d193bb3575a37"} Apr 16 19:18:32.255817 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:32.255783 2582 generic.go:358] "Generic (PLEG): container finished" podID="fb965cc4-1192-4694-81d4-b4802f0b6e56" containerID="52d31de1092255acc3d7cefc80218287e038834a86a2c24f5412f9667a4c3393" exitCode=0 Apr 16 19:18:32.256274 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:32.255846 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j5vpx" event={"ID":"fb965cc4-1192-4694-81d4-b4802f0b6e56","Type":"ContainerDied","Data":"52d31de1092255acc3d7cefc80218287e038834a86a2c24f5412f9667a4c3393"} Apr 16 19:18:33.159242 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:33.158993 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:18:33.159401 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:33.159144 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:18:33.159401 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:33.159307 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86db68fc76-grwtr: secret "image-registry-tls" not found Apr 16 19:18:33.159401 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:33.159360 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls podName:fff8270c-4771-4655-8abd-7341281f3173 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:37.159344069 +0000 UTC m=+44.653934682 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls") pod "image-registry-86db68fc76-grwtr" (UID: "fff8270c-4771-4655-8abd-7341281f3173") : secret "image-registry-tls" not found Apr 16 19:18:33.259588 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:33.259562 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert\") pod \"ingress-canary-dp4cw\" (UID: \"d82ed6e1-d7aa-4d47-bcb6-f4539431d578\") " pod="openshift-ingress-canary/ingress-canary-dp4cw" Apr 16 19:18:33.259991 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:33.259624 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls\") pod \"dns-default-8xgc6\" (UID: \"513c0bb7-f253-4f0d-bc14-1d473d560c39\") " pod="openshift-dns/dns-default-8xgc6" Apr 16 19:18:33.259991 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:33.259658 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:18:33.259991 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:33.259716 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert podName:d82ed6e1-d7aa-4d47-bcb6-f4539431d578 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:37.25970221 +0000 UTC m=+44.754292809 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert") pod "ingress-canary-dp4cw" (UID: "d82ed6e1-d7aa-4d47-bcb6-f4539431d578") : secret "canary-serving-cert" not found Apr 16 19:18:33.259991 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:33.259715 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:18:33.259991 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:33.259744 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls podName:513c0bb7-f253-4f0d-bc14-1d473d560c39 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:37.25973777 +0000 UTC m=+44.754328365 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls") pod "dns-default-8xgc6" (UID: "513c0bb7-f253-4f0d-bc14-1d473d560c39") : secret "dns-default-metrics-tls" not found Apr 16 19:18:33.260758 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:33.260736 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j5vpx" event={"ID":"fb965cc4-1192-4694-81d4-b4802f0b6e56","Type":"ContainerStarted","Data":"394e557d5b0a1d3d5eab86f32e409e86715cafeaa91502fb41a24bc70515f9ea"} Apr 16 19:18:33.291793 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:33.291755 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-j5vpx" podStartSLOduration=5.14718453 podStartE2EDuration="40.291742905s" podCreationTimestamp="2026-04-16 19:17:53 +0000 UTC" firstStartedPulling="2026-04-16 19:17:55.288428852 +0000 UTC m=+2.783019448" lastFinishedPulling="2026-04-16 19:18:30.432987225 +0000 UTC m=+37.927577823" observedRunningTime="2026-04-16 19:18:33.289086766 +0000 UTC m=+40.783677393" watchObservedRunningTime="2026-04-16 19:18:33.291742905 +0000 UTC m=+40.786333521" Apr 16 19:18:35.276824 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.276790 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f8xvx"] Apr 16 19:18:35.305577 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.305551 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w"] Apr 16 19:18:35.305707 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.305688 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f8xvx" Apr 16 19:18:35.308500 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.308476 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-hcgkz\"" Apr 16 19:18:35.308672 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.308507 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 19:18:35.309294 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.309280 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:18:35.327776 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.327756 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f8xvx"] Apr 16 19:18:35.327776 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.327779 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w"] Apr 16 19:18:35.327913 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.327869 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w" Apr 16 19:18:35.330369 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.330345 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 19:18:35.330369 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.330360 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:18:35.330516 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.330359 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 19:18:35.330516 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.330371 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-x2fps\"" Apr 16 19:18:35.375450 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.375424 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-ksczz"] Apr 16 19:18:35.394630 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.394606 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws"] Apr 16 19:18:35.394868 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.394845 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.397392 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.397371 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 19:18:35.397392 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.397381 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 19:18:35.397392 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.397390 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 19:18:35.397618 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.397371 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-pwb9n\"" Apr 16 19:18:35.397618 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.397420 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 19:18:35.403538 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.403519 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 19:18:35.419868 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.419789 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-kqmw8"] Apr 16 19:18:35.419974 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.419962 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws" Apr 16 19:18:35.422529 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.422507 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 19:18:35.422615 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.422513 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 19:18:35.422615 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.422514 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 19:18:35.422700 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.422539 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-4299x\"" Apr 16 19:18:35.423132 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.423112 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 19:18:35.437879 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.437851 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-ksczz"] Apr 16 19:18:35.437879 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.437879 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws"] Apr 16 19:18:35.437985 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.437894 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-kqmw8"] Apr 16 19:18:35.437985 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.437981 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" Apr 16 19:18:35.440454 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.440435 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:18:35.440454 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.440450 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 19:18:35.440590 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.440498 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 19:18:35.440590 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.440432 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 19:18:35.440590 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.440585 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-9h42h\"" Apr 16 19:18:35.445734 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.445720 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 19:18:35.476286 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.476259 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-pl62h"] Apr 16 19:18:35.477660 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.477639 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8hf4w\" (UID: \"d22f5821-5636-4a50-8b36-4c7eed507c2d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w" Apr 16 19:18:35.477760 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.477696 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w928t\" (UniqueName: \"kubernetes.io/projected/c4c4f849-8b71-4e5a-a7d6-079b83a72af1-kube-api-access-w928t\") pod \"volume-data-source-validator-7c6cbb6c87-f8xvx\" (UID: \"c4c4f849-8b71-4e5a-a7d6-079b83a72af1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f8xvx" Apr 16 19:18:35.477825 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.477794 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br6lb\" (UniqueName: \"kubernetes.io/projected/d22f5821-5636-4a50-8b36-4c7eed507c2d-kube-api-access-br6lb\") pod \"cluster-samples-operator-6dc5bdb6b4-8hf4w\" (UID: \"d22f5821-5636-4a50-8b36-4c7eed507c2d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w" Apr 16 19:18:35.496007 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.495984 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dtbpz"] Apr 16 19:18:35.496184 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.496138 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pl62h" Apr 16 19:18:35.498615 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.498567 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 19:18:35.498615 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.498588 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 19:18:35.498813 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.498791 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-ktlw7\"" Apr 16 19:18:35.507770 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.507752 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-67nzv"] Apr 16 19:18:35.507894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.507879 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dtbpz" Apr 16 19:18:35.509889 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.509868 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 19:18:35.510026 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.509998 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 19:18:35.510121 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.510001 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:18:35.510121 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.510054 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-j82gm\"" Apr 16 19:18:35.510121 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.510004 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 19:18:35.524618 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.524597 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9hfwk"] Apr 16 19:18:35.524766 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.524750 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-67nzv" Apr 16 19:18:35.527239 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.527138 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:18:35.527239 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.527138 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 19:18:35.527239 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.527197 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 19:18:35.527404 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.527276 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-v5km9\"" Apr 16 19:18:35.527453 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.527441 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 19:18:35.536212 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.536195 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5f4947fcd8-gffmg"] Apr 16 19:18:35.536329 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.536315 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9hfwk" Apr 16 19:18:35.538408 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.538391 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-7lqjv\"" Apr 16 19:18:35.551361 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.551345 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-pl62h"] Apr 16 19:18:35.551435 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.551414 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9hfwk"] Apr 16 19:18:35.551435 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.551435 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-67nzv"] Apr 16 19:18:35.551502 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.551441 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:18:35.551585 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.551443 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dtbpz"] Apr 16 19:18:35.551679 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.551659 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5f4947fcd8-gffmg"] Apr 16 19:18:35.553680 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.553661 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 19:18:35.553769 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.553662 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 19:18:35.553838 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.553798 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kfs6b\"" Apr 16 19:18:35.553838 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.553798 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 19:18:35.553838 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.553810 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 19:18:35.553991 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.553844 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 19:18:35.553991 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.553870 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 19:18:35.578239 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.578219 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w928t\" (UniqueName: \"kubernetes.io/projected/c4c4f849-8b71-4e5a-a7d6-079b83a72af1-kube-api-access-w928t\") pod \"volume-data-source-validator-7c6cbb6c87-f8xvx\" (UID: \"c4c4f849-8b71-4e5a-a7d6-079b83a72af1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f8xvx" Apr 16 19:18:35.578332 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.578258 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4zp2\" (UniqueName: \"kubernetes.io/projected/82d1300a-6831-4de2-a99c-90a2b28f9a33-kube-api-access-w4zp2\") pod \"console-operator-9d4b6777b-kqmw8\" (UID: \"82d1300a-6831-4de2-a99c-90a2b28f9a33\") " pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" Apr 16 19:18:35.578332 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.578280 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82d1300a-6831-4de2-a99c-90a2b28f9a33-trusted-ca\") pod \"console-operator-9d4b6777b-kqmw8\" (UID: \"82d1300a-6831-4de2-a99c-90a2b28f9a33\") " pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" Apr 16 19:18:35.578441 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.578421 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-br6lb\" (UniqueName: \"kubernetes.io/projected/d22f5821-5636-4a50-8b36-4c7eed507c2d-kube-api-access-br6lb\") pod \"cluster-samples-operator-6dc5bdb6b4-8hf4w\" (UID: \"d22f5821-5636-4a50-8b36-4c7eed507c2d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w" Apr 16 19:18:35.578496 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.578460 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/907cddc0-db0e-4159-aa65-8778fb6d6a30-tmp\") pod \"insights-operator-585dfdc468-ksczz\" (UID: \"907cddc0-db0e-4159-aa65-8778fb6d6a30\") " pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.578550 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.578495 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9sbws\" (UID: \"448c73ab-a4f5-4a5c-8143-1deb13253eec\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws" Apr 16 19:18:35.578550 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.578531 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82d1300a-6831-4de2-a99c-90a2b28f9a33-serving-cert\") pod \"console-operator-9d4b6777b-kqmw8\" (UID: \"82d1300a-6831-4de2-a99c-90a2b28f9a33\") " pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" Apr 16 19:18:35.578643 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.578558 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl6b6\" (UniqueName: \"kubernetes.io/projected/907cddc0-db0e-4159-aa65-8778fb6d6a30-kube-api-access-nl6b6\") pod \"insights-operator-585dfdc468-ksczz\" (UID: \"907cddc0-db0e-4159-aa65-8778fb6d6a30\") " pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.578643 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.578628 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/907cddc0-db0e-4159-aa65-8778fb6d6a30-service-ca-bundle\") pod \"insights-operator-585dfdc468-ksczz\" (UID: \"907cddc0-db0e-4159-aa65-8778fb6d6a30\") " pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.578738 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.578668 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82d1300a-6831-4de2-a99c-90a2b28f9a33-config\") pod \"console-operator-9d4b6777b-kqmw8\" (UID: \"82d1300a-6831-4de2-a99c-90a2b28f9a33\") " pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" Apr 16 19:18:35.578738 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.578705 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvfnv\" (UniqueName: \"kubernetes.io/projected/448c73ab-a4f5-4a5c-8143-1deb13253eec-kube-api-access-qvfnv\") pod \"cluster-monitoring-operator-75587bd455-9sbws\" (UID: \"448c73ab-a4f5-4a5c-8143-1deb13253eec\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws" Apr 16 19:18:35.578819 
ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.578733 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/907cddc0-db0e-4159-aa65-8778fb6d6a30-snapshots\") pod \"insights-operator-585dfdc468-ksczz\" (UID: \"907cddc0-db0e-4159-aa65-8778fb6d6a30\") " pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.578819 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.578763 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/907cddc0-db0e-4159-aa65-8778fb6d6a30-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-ksczz\" (UID: \"907cddc0-db0e-4159-aa65-8778fb6d6a30\") " pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.578819 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.578796 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8hf4w\" (UID: \"d22f5821-5636-4a50-8b36-4c7eed507c2d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w" Apr 16 19:18:35.578912 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.578861 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/448c73ab-a4f5-4a5c-8143-1deb13253eec-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-9sbws\" (UID: \"448c73ab-a4f5-4a5c-8143-1deb13253eec\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws" Apr 16 19:18:35.578912 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:35.578868 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:18:35.578912 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.578892 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/907cddc0-db0e-4159-aa65-8778fb6d6a30-serving-cert\") pod \"insights-operator-585dfdc468-ksczz\" (UID: \"907cddc0-db0e-4159-aa65-8778fb6d6a30\") " pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.579022 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:35.578919 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls podName:d22f5821-5636-4a50-8b36-4c7eed507c2d nodeName:}" failed. No retries permitted until 2026-04-16 19:18:36.078900918 +0000 UTC m=+43.573491523 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8hf4w" (UID: "d22f5821-5636-4a50-8b36-4c7eed507c2d") : secret "samples-operator-tls" not found Apr 16 19:18:35.588057 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.588029 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-br6lb\" (UniqueName: \"kubernetes.io/projected/d22f5821-5636-4a50-8b36-4c7eed507c2d-kube-api-access-br6lb\") pod \"cluster-samples-operator-6dc5bdb6b4-8hf4w\" (UID: \"d22f5821-5636-4a50-8b36-4c7eed507c2d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w" Apr 16 19:18:35.590028 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.590010 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w928t\" (UniqueName: \"kubernetes.io/projected/c4c4f849-8b71-4e5a-a7d6-079b83a72af1-kube-api-access-w928t\") pod \"volume-data-source-validator-7c6cbb6c87-f8xvx\" (UID: \"c4c4f849-8b71-4e5a-a7d6-079b83a72af1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f8xvx" Apr 16 19:18:35.615826 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.615805 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f8xvx" Apr 16 19:18:35.679327 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.679293 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mcfl\" (UniqueName: \"kubernetes.io/projected/260a217a-9aa3-43e3-9715-9255e451adff-kube-api-access-9mcfl\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:18:35.679468 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.679334 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82d1300a-6831-4de2-a99c-90a2b28f9a33-config\") pod \"console-operator-9d4b6777b-kqmw8\" (UID: \"82d1300a-6831-4de2-a99c-90a2b28f9a33\") " pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" Apr 16 19:18:35.679468 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.679391 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvfnv\" (UniqueName: \"kubernetes.io/projected/448c73ab-a4f5-4a5c-8143-1deb13253eec-kube-api-access-qvfnv\") pod \"cluster-monitoring-operator-75587bd455-9sbws\" (UID: \"448c73ab-a4f5-4a5c-8143-1deb13253eec\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws" Apr 16 19:18:35.679532 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.679471 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/907cddc0-db0e-4159-aa65-8778fb6d6a30-snapshots\") pod \"insights-operator-585dfdc468-ksczz\" (UID: \"907cddc0-db0e-4159-aa65-8778fb6d6a30\") " pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.679532 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.679502 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/907cddc0-db0e-4159-aa65-8778fb6d6a30-trusted-ca-bundle\") pod 
\"insights-operator-585dfdc468-ksczz\" (UID: \"907cddc0-db0e-4159-aa65-8778fb6d6a30\") " pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.679596 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.679530 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:18:35.679596 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.679579 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2559\" (UniqueName: \"kubernetes.io/projected/fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae-kube-api-access-w2559\") pod \"service-ca-operator-d6fc45fc5-67nzv\" (UID: \"fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-67nzv" Apr 16 19:18:35.679668 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.679619 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:18:35.679668 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.679651 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/448c73ab-a4f5-4a5c-8143-1deb13253eec-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-9sbws\" (UID: \"448c73ab-a4f5-4a5c-8143-1deb13253eec\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws" Apr 16 19:18:35.679797 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.679780 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/907cddc0-db0e-4159-aa65-8778fb6d6a30-serving-cert\") pod \"insights-operator-585dfdc468-ksczz\" (UID: \"907cddc0-db0e-4159-aa65-8778fb6d6a30\") " pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.679856 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.679807 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae-serving-cert\") pod \"service-ca-operator-d6fc45fc5-67nzv\" (UID: \"fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-67nzv" Apr 16 19:18:35.679856 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.679832 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-default-certificate\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:18:35.679957 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.679862 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4zp2\" (UniqueName: 
\"kubernetes.io/projected/82d1300a-6831-4de2-a99c-90a2b28f9a33-kube-api-access-w4zp2\") pod \"console-operator-9d4b6777b-kqmw8\" (UID: \"82d1300a-6831-4de2-a99c-90a2b28f9a33\") " pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" Apr 16 19:18:35.679957 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.679929 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82d1300a-6831-4de2-a99c-90a2b28f9a33-trusted-ca\") pod \"console-operator-9d4b6777b-kqmw8\" (UID: \"82d1300a-6831-4de2-a99c-90a2b28f9a33\") " pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" Apr 16 19:18:35.680060 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.679984 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/907cddc0-db0e-4159-aa65-8778fb6d6a30-snapshots\") pod \"insights-operator-585dfdc468-ksczz\" (UID: \"907cddc0-db0e-4159-aa65-8778fb6d6a30\") " pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.680060 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.679998 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pl62h\" (UID: \"04310661-51ad-4a3b-86cf-b9a2a0d1dda1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pl62h" Apr 16 19:18:35.680060 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.680033 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-stats-auth\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:18:35.680191 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.680066 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c83523cc-27c2-4924-9113-67ff5b311e42-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-dtbpz\" (UID: \"c83523cc-27c2-4924-9113-67ff5b311e42\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dtbpz" Apr 16 19:18:35.680191 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.680097 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/907cddc0-db0e-4159-aa65-8778fb6d6a30-tmp\") pod \"insights-operator-585dfdc468-ksczz\" (UID: \"907cddc0-db0e-4159-aa65-8778fb6d6a30\") " pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.680191 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.680128 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb654\" (UniqueName: \"kubernetes.io/projected/e8903e24-441a-4973-8c06-9e393bd73cd7-kube-api-access-gb654\") pod \"network-check-source-8894fc9bd-9hfwk\" (UID: \"e8903e24-441a-4973-8c06-9e393bd73cd7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9hfwk" Apr 16 19:18:35.680191 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.680183 2582 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9sbws\" (UID: \"448c73ab-a4f5-4a5c-8143-1deb13253eec\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws" Apr 16 19:18:35.680313 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.680209 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae-config\") pod \"service-ca-operator-d6fc45fc5-67nzv\" (UID: \"fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-67nzv" Apr 16 19:18:35.680313 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.680235 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82d1300a-6831-4de2-a99c-90a2b28f9a33-serving-cert\") pod \"console-operator-9d4b6777b-kqmw8\" (UID: \"82d1300a-6831-4de2-a99c-90a2b28f9a33\") " pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" Apr 16 19:18:35.680313 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.680262 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nl6b6\" (UniqueName: \"kubernetes.io/projected/907cddc0-db0e-4159-aa65-8778fb6d6a30-kube-api-access-nl6b6\") pod \"insights-operator-585dfdc468-ksczz\" (UID: \"907cddc0-db0e-4159-aa65-8778fb6d6a30\") " pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.680313 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.680289 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83523cc-27c2-4924-9113-67ff5b311e42-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-dtbpz\" (UID: \"c83523cc-27c2-4924-9113-67ff5b311e42\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dtbpz" Apr 16 19:18:35.680313 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:35.680302 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:18:35.680445 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.680316 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9gwn\" (UniqueName: \"kubernetes.io/projected/c83523cc-27c2-4924-9113-67ff5b311e42-kube-api-access-w9gwn\") pod \"kube-storage-version-migrator-operator-6769c5d45-dtbpz\" (UID: \"c83523cc-27c2-4924-9113-67ff5b311e42\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dtbpz" Apr 16 19:18:35.680445 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.680350 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-pl62h\" (UID: \"04310661-51ad-4a3b-86cf-b9a2a0d1dda1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pl62h" Apr 16 19:18:35.680445 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.680369 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/907cddc0-db0e-4159-aa65-8778fb6d6a30-tmp\") pod \"insights-operator-585dfdc468-ksczz\" (UID: \"907cddc0-db0e-4159-aa65-8778fb6d6a30\") " pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.680445 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.680395 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/907cddc0-db0e-4159-aa65-8778fb6d6a30-service-ca-bundle\") pod \"insights-operator-585dfdc468-ksczz\" (UID: \"907cddc0-db0e-4159-aa65-8778fb6d6a30\") " pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.680586 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.680571 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/907cddc0-db0e-4159-aa65-8778fb6d6a30-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-ksczz\" (UID: \"907cddc0-db0e-4159-aa65-8778fb6d6a30\") " pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.680639 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:35.680629 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls podName:448c73ab-a4f5-4a5c-8143-1deb13253eec nodeName:}" failed. No retries permitted until 2026-04-16 19:18:36.180613233 +0000 UTC m=+43.675203831 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9sbws" (UID: "448c73ab-a4f5-4a5c-8143-1deb13253eec") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:18:35.680834 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.680818 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/907cddc0-db0e-4159-aa65-8778fb6d6a30-service-ca-bundle\") pod \"insights-operator-585dfdc468-ksczz\" (UID: \"907cddc0-db0e-4159-aa65-8778fb6d6a30\") " pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.682276 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.682257 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/907cddc0-db0e-4159-aa65-8778fb6d6a30-serving-cert\") pod \"insights-operator-585dfdc468-ksczz\" (UID: \"907cddc0-db0e-4159-aa65-8778fb6d6a30\") " pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.689411 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.689363 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/448c73ab-a4f5-4a5c-8143-1deb13253eec-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-9sbws\" (UID: \"448c73ab-a4f5-4a5c-8143-1deb13253eec\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws" Apr 16 19:18:35.689530 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.689434 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82d1300a-6831-4de2-a99c-90a2b28f9a33-config\") pod \"console-operator-9d4b6777b-kqmw8\" (UID: \"82d1300a-6831-4de2-a99c-90a2b28f9a33\") " pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" Apr 16 19:18:35.691128 
ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.691102 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82d1300a-6831-4de2-a99c-90a2b28f9a33-serving-cert\") pod \"console-operator-9d4b6777b-kqmw8\" (UID: \"82d1300a-6831-4de2-a99c-90a2b28f9a33\") " pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" Apr 16 19:18:35.691477 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.691429 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvfnv\" (UniqueName: \"kubernetes.io/projected/448c73ab-a4f5-4a5c-8143-1deb13253eec-kube-api-access-qvfnv\") pod \"cluster-monitoring-operator-75587bd455-9sbws\" (UID: \"448c73ab-a4f5-4a5c-8143-1deb13253eec\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws" Apr 16 19:18:35.692237 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.692213 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4zp2\" (UniqueName: \"kubernetes.io/projected/82d1300a-6831-4de2-a99c-90a2b28f9a33-kube-api-access-w4zp2\") pod \"console-operator-9d4b6777b-kqmw8\" (UID: \"82d1300a-6831-4de2-a99c-90a2b28f9a33\") " pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" Apr 16 19:18:35.692322 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.692245 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl6b6\" (UniqueName: \"kubernetes.io/projected/907cddc0-db0e-4159-aa65-8778fb6d6a30-kube-api-access-nl6b6\") pod \"insights-operator-585dfdc468-ksczz\" (UID: \"907cddc0-db0e-4159-aa65-8778fb6d6a30\") " pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.709440 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.699280 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/82d1300a-6831-4de2-a99c-90a2b28f9a33-trusted-ca\") pod \"console-operator-9d4b6777b-kqmw8\" (UID: \"82d1300a-6831-4de2-a99c-90a2b28f9a33\") " pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" Apr 16 19:18:35.709440 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.707505 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-ksczz" Apr 16 19:18:35.746499 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.746412 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" Apr 16 19:18:35.781865 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.781789 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:18:35.781865 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.781850 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae-serving-cert\") pod \"service-ca-operator-d6fc45fc5-67nzv\" (UID: \"fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-67nzv" Apr 16 19:18:35.782063 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.781881 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-default-certificate\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:18:35.782063 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.781947 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pl62h\" (UID: \"04310661-51ad-4a3b-86cf-b9a2a0d1dda1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pl62h" Apr 16 19:18:35.782063 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:35.781967 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle podName:260a217a-9aa3-43e3-9715-9255e451adff nodeName:}" failed. No retries permitted until 2026-04-16 19:18:36.281949174 +0000 UTC m=+43.776539770 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle") pod "router-default-5f4947fcd8-gffmg" (UID: "260a217a-9aa3-43e3-9715-9255e451adff") : configmap references non-existent config key: service-ca.crt Apr 16 19:18:35.782063 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.782008 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-stats-auth\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:18:35.782063 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.782030 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c83523cc-27c2-4924-9113-67ff5b311e42-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-dtbpz\" (UID: \"c83523cc-27c2-4924-9113-67ff5b311e42\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dtbpz" Apr 16 19:18:35.782063 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.782062 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gb654\" (UniqueName: \"kubernetes.io/projected/e8903e24-441a-4973-8c06-9e393bd73cd7-kube-api-access-gb654\") pod \"network-check-source-8894fc9bd-9hfwk\" (UID: \"e8903e24-441a-4973-8c06-9e393bd73cd7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9hfwk" Apr 16 19:18:35.782388 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.782097 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae-config\") pod \"service-ca-operator-d6fc45fc5-67nzv\" (UID: \"fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-67nzv" Apr 16 19:18:35.782388 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.782126 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83523cc-27c2-4924-9113-67ff5b311e42-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-dtbpz\" (UID: \"c83523cc-27c2-4924-9113-67ff5b311e42\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dtbpz" Apr 16 19:18:35.782388 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.782174 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9gwn\" (UniqueName: \"kubernetes.io/projected/c83523cc-27c2-4924-9113-67ff5b311e42-kube-api-access-w9gwn\") pod \"kube-storage-version-migrator-operator-6769c5d45-dtbpz\" (UID: \"c83523cc-27c2-4924-9113-67ff5b311e42\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dtbpz" Apr 16 19:18:35.782388 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.782205 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-pl62h\" (UID: \"04310661-51ad-4a3b-86cf-b9a2a0d1dda1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pl62h" Apr 16 19:18:35.782388 
ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.782257 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mcfl\" (UniqueName: \"kubernetes.io/projected/260a217a-9aa3-43e3-9715-9255e451adff-kube-api-access-9mcfl\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:18:35.782388 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.782317 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:18:35.782388 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.782368 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2559\" (UniqueName: \"kubernetes.io/projected/fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae-kube-api-access-w2559\") pod \"service-ca-operator-d6fc45fc5-67nzv\" (UID: \"fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-67nzv" Apr 16 19:18:35.783343 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.783239 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae-config\") pod \"service-ca-operator-d6fc45fc5-67nzv\" (UID: \"fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-67nzv" Apr 16 19:18:35.783343 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:35.783266 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 19:18:35.783511 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:35.783376 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs podName:260a217a-9aa3-43e3-9715-9255e451adff nodeName:}" failed. No retries permitted until 2026-04-16 19:18:36.283358741 +0000 UTC m=+43.777949338 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs") pod "router-default-5f4947fcd8-gffmg" (UID: "260a217a-9aa3-43e3-9715-9255e451adff") : secret "router-metrics-certs-default" not found Apr 16 19:18:35.783724 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:35.783623 2582 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:18:35.783724 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:35.783681 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert podName:04310661-51ad-4a3b-86cf-b9a2a0d1dda1 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:36.283665608 +0000 UTC m=+43.778256215 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-pl62h" (UID: "04310661-51ad-4a3b-86cf-b9a2a0d1dda1") : secret "networking-console-plugin-cert" not found Apr 16 19:18:35.783864 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.783819 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c83523cc-27c2-4924-9113-67ff5b311e42-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-dtbpz\" (UID: \"c83523cc-27c2-4924-9113-67ff5b311e42\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dtbpz" Apr 16 19:18:35.783919 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.783859 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-pl62h\" (UID: \"04310661-51ad-4a3b-86cf-b9a2a0d1dda1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pl62h" Apr 16 19:18:35.786296 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.786237 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae-serving-cert\") pod \"service-ca-operator-d6fc45fc5-67nzv\" (UID: \"fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-67nzv" Apr 16 19:18:35.786496 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.786447 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-stats-auth\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:18:35.786632 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.786613 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c83523cc-27c2-4924-9113-67ff5b311e42-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-dtbpz\" (UID: \"c83523cc-27c2-4924-9113-67ff5b311e42\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dtbpz" Apr 16 19:18:35.788935 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.788908 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-default-certificate\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:18:35.791666 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.791636 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2559\" (UniqueName: \"kubernetes.io/projected/fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae-kube-api-access-w2559\") pod \"service-ca-operator-d6fc45fc5-67nzv\" (UID: \"fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-67nzv" Apr 16 19:18:35.791919 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.791900 2582 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-w9gwn\" (UniqueName: \"kubernetes.io/projected/c83523cc-27c2-4924-9113-67ff5b311e42-kube-api-access-w9gwn\") pod \"kube-storage-version-migrator-operator-6769c5d45-dtbpz\" (UID: \"c83523cc-27c2-4924-9113-67ff5b311e42\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dtbpz" Apr 16 19:18:35.792294 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.792276 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mcfl\" (UniqueName: \"kubernetes.io/projected/260a217a-9aa3-43e3-9715-9255e451adff-kube-api-access-9mcfl\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:18:35.805325 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.805272 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb654\" (UniqueName: \"kubernetes.io/projected/e8903e24-441a-4973-8c06-9e393bd73cd7-kube-api-access-gb654\") pod \"network-check-source-8894fc9bd-9hfwk\" (UID: \"e8903e24-441a-4973-8c06-9e393bd73cd7\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9hfwk" Apr 16 19:18:35.817013 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.816985 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dtbpz" Apr 16 19:18:35.819003 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.818964 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f8xvx"] Apr 16 19:18:35.824630 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:18:35.824428 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4c4f849_8b71_4e5a_a7d6_079b83a72af1.slice/crio-c4fc2a1e10670b66942b80016aa08770b55d34f50d023393c75876e812e351fc WatchSource:0}: Error finding container c4fc2a1e10670b66942b80016aa08770b55d34f50d023393c75876e812e351fc: Status 404 returned error can't find the container with id c4fc2a1e10670b66942b80016aa08770b55d34f50d023393c75876e812e351fc Apr 16 19:18:35.833472 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.833448 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-67nzv" Apr 16 19:18:35.844655 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.844631 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9hfwk" Apr 16 19:18:35.850497 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.850466 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-ksczz"] Apr 16 19:18:35.864358 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:18:35.864322 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod907cddc0_db0e_4159_aa65_8778fb6d6a30.slice/crio-e8d5c8fd38315e976cbbb64a1fed8019b6fed36ff0ca8a381181c85cd19ed5bc WatchSource:0}: Error finding container e8d5c8fd38315e976cbbb64a1fed8019b6fed36ff0ca8a381181c85cd19ed5bc: Status 404 returned error can't find the container with id e8d5c8fd38315e976cbbb64a1fed8019b6fed36ff0ca8a381181c85cd19ed5bc Apr 16 19:18:35.878717 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.878130 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-kqmw8"] Apr 16 19:18:35.885560 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:18:35.885338 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82d1300a_6831_4de2_a99c_90a2b28f9a33.slice/crio-bf1e5d24e0974e86694285aa573209234e0b69f32d3d43e694383af0b345f673 WatchSource:0}: Error finding container bf1e5d24e0974e86694285aa573209234e0b69f32d3d43e694383af0b345f673: Status 404 returned error can't find the container with id bf1e5d24e0974e86694285aa573209234e0b69f32d3d43e694383af0b345f673 Apr 16 19:18:35.969225 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.969166 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dtbpz"] Apr 16 19:18:35.970888 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:18:35.970861 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc83523cc_27c2_4924_9113_67ff5b311e42.slice/crio-4a126f1553f3e33b26d1c047c7b1e220a2e17edec63910ff2c7d5e8457f254d8 WatchSource:0}: Error finding container 4a126f1553f3e33b26d1c047c7b1e220a2e17edec63910ff2c7d5e8457f254d8: Status 404 returned error can't find the container with id 4a126f1553f3e33b26d1c047c7b1e220a2e17edec63910ff2c7d5e8457f254d8 Apr 16 19:18:35.994022 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.993996 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-67nzv"] Apr 16 19:18:35.996636 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:18:35.996611 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfefe6f16_f5b3_4e5d_bd2e_e4cd50f953ae.slice/crio-6fcc4ca648e00ed80e0260d9ce192a76597f91d2403ea7d47324833092a7df27 WatchSource:0}: Error finding container 6fcc4ca648e00ed80e0260d9ce192a76597f91d2403ea7d47324833092a7df27: Status 404 returned error can't find the container with id 6fcc4ca648e00ed80e0260d9ce192a76597f91d2403ea7d47324833092a7df27 Apr 16 19:18:35.998983 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:35.998951 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-9hfwk"] Apr 16 19:18:36.002575 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:18:36.002554 2582 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8903e24_441a_4973_8c06_9e393bd73cd7.slice/crio-fe72c96a3ba85a4a64180972229248138cdab25cdc5a4eadef7343e945ccf043 WatchSource:0}: Error finding container fe72c96a3ba85a4a64180972229248138cdab25cdc5a4eadef7343e945ccf043: Status 404 returned error can't find the container with id fe72c96a3ba85a4a64180972229248138cdab25cdc5a4eadef7343e945ccf043 Apr 16 19:18:36.086306 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:36.086222 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8hf4w\" (UID: \"d22f5821-5636-4a50-8b36-4c7eed507c2d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w" Apr 16 19:18:36.086450 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:36.086348 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:18:36.086450 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:36.086407 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls podName:d22f5821-5636-4a50-8b36-4c7eed507c2d nodeName:}" failed. No retries permitted until 2026-04-16 19:18:37.08639433 +0000 UTC m=+44.580984938 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8hf4w" (UID: "d22f5821-5636-4a50-8b36-4c7eed507c2d") : secret "samples-operator-tls" not found Apr 16 19:18:36.187169 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:36.187127 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9sbws\" (UID: \"448c73ab-a4f5-4a5c-8143-1deb13253eec\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws" Apr 16 19:18:36.187312 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:36.187279 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 19:18:36.187362 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:36.187351 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls podName:448c73ab-a4f5-4a5c-8143-1deb13253eec nodeName:}" failed. No retries permitted until 2026-04-16 19:18:37.187330271 +0000 UTC m=+44.681920866 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9sbws" (UID: "448c73ab-a4f5-4a5c-8143-1deb13253eec") : secret "cluster-monitoring-operator-tls" not found Apr 16 19:18:36.268057 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:36.268028 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f8xvx" event={"ID":"c4c4f849-8b71-4e5a-a7d6-079b83a72af1","Type":"ContainerStarted","Data":"c4fc2a1e10670b66942b80016aa08770b55d34f50d023393c75876e812e351fc"} Apr 16 19:18:36.268868 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:36.268848 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-67nzv" event={"ID":"fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae","Type":"ContainerStarted","Data":"6fcc4ca648e00ed80e0260d9ce192a76597f91d2403ea7d47324833092a7df27"} Apr 16 19:18:36.269899 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:36.269877 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-ksczz" event={"ID":"907cddc0-db0e-4159-aa65-8778fb6d6a30","Type":"ContainerStarted","Data":"e8d5c8fd38315e976cbbb64a1fed8019b6fed36ff0ca8a381181c85cd19ed5bc"} Apr 16 19:18:36.270867 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:36.270847 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" event={"ID":"82d1300a-6831-4de2-a99c-90a2b28f9a33","Type":"ContainerStarted","Data":"bf1e5d24e0974e86694285aa573209234e0b69f32d3d43e694383af0b345f673"} Apr 16 19:18:36.271805 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:36.271786 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dtbpz" event={"ID":"c83523cc-27c2-4924-9113-67ff5b311e42","Type":"ContainerStarted","Data":"4a126f1553f3e33b26d1c047c7b1e220a2e17edec63910ff2c7d5e8457f254d8"} Apr 16 19:18:36.272604 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:36.272582 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9hfwk" event={"ID":"e8903e24-441a-4973-8c06-9e393bd73cd7","Type":"ContainerStarted","Data":"fe72c96a3ba85a4a64180972229248138cdab25cdc5a4eadef7343e945ccf043"} Apr 16 19:18:36.287998 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:36.287977 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pl62h\" (UID: \"04310661-51ad-4a3b-86cf-b9a2a0d1dda1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pl62h" Apr 16 19:18:36.288338 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:36.288094 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:18:36.288338 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:36.288110 2582 secret.go:189] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 19:18:36.288338 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:36.288176 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert podName:04310661-51ad-4a3b-86cf-b9a2a0d1dda1 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:37.288145722 +0000 UTC m=+44.782736317 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-pl62h" (UID: "04310661-51ad-4a3b-86cf-b9a2a0d1dda1") : secret "networking-console-plugin-cert" not found
Apr 16 19:18:36.288338 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:36.288197 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg"
Apr 16 19:18:36.288338 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:36.288224 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 19:18:36.288338 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:36.288282 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs podName:260a217a-9aa3-43e3-9715-9255e451adff nodeName:}" failed. No retries permitted until 2026-04-16 19:18:37.288267427 +0000 UTC m=+44.782858023 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs") pod "router-default-5f4947fcd8-gffmg" (UID: "260a217a-9aa3-43e3-9715-9255e451adff") : secret "router-metrics-certs-default" not found
Apr 16 19:18:36.288338 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:36.288299 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle podName:260a217a-9aa3-43e3-9715-9255e451adff nodeName:}" failed. No retries permitted until 2026-04-16 19:18:37.288290433 +0000 UTC m=+44.782881034 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle") pod "router-default-5f4947fcd8-gffmg" (UID: "260a217a-9aa3-43e3-9715-9255e451adff") : configmap references non-existent config key: service-ca.crt
Apr 16 19:18:37.096600 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:37.096562 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8hf4w\" (UID: \"d22f5821-5636-4a50-8b36-4c7eed507c2d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w"
Apr 16 19:18:37.096836 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:37.096785 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 19:18:37.096914 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:37.096858 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls podName:d22f5821-5636-4a50-8b36-4c7eed507c2d nodeName:}" failed. No retries permitted until 2026-04-16 19:18:39.096837492 +0000 UTC m=+46.591428088 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8hf4w" (UID: "d22f5821-5636-4a50-8b36-4c7eed507c2d") : secret "samples-operator-tls" not found
Apr 16 19:18:37.197735 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:37.197306 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr"
Apr 16 19:18:37.197735 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:37.197464 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9sbws\" (UID: \"448c73ab-a4f5-4a5c-8143-1deb13253eec\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws"
Apr 16 19:18:37.197735 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:37.197563 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:18:37.197735 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:37.197587 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86db68fc76-grwtr: secret "image-registry-tls" not found
Apr 16 19:18:37.197735 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:37.197589 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 19:18:37.197735 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:37.197648 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls podName:448c73ab-a4f5-4a5c-8143-1deb13253eec nodeName:}" failed. No retries permitted until 2026-04-16 19:18:39.19762926 +0000 UTC m=+46.692219880 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9sbws" (UID: "448c73ab-a4f5-4a5c-8143-1deb13253eec") : secret "cluster-monitoring-operator-tls" not found
Apr 16 19:18:37.197735 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:37.197668 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls podName:fff8270c-4771-4655-8abd-7341281f3173 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:45.197657719 +0000 UTC m=+52.692248318 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls") pod "image-registry-86db68fc76-grwtr" (UID: "fff8270c-4771-4655-8abd-7341281f3173") : secret "image-registry-tls" not found
Apr 16 19:18:37.297922 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:37.297887 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg"
Apr 16 19:18:37.298619 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:37.298048 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 19:18:37.298619 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:37.298121 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs podName:260a217a-9aa3-43e3-9715-9255e451adff nodeName:}" failed. No retries permitted until 2026-04-16 19:18:39.298100254 +0000 UTC m=+46.792690862 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs") pod "router-default-5f4947fcd8-gffmg" (UID: "260a217a-9aa3-43e3-9715-9255e451adff") : secret "router-metrics-certs-default" not found
Apr 16 19:18:37.298619 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:37.298168 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg"
Apr 16 19:18:37.298619 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:37.298252 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pl62h\" (UID: \"04310661-51ad-4a3b-86cf-b9a2a0d1dda1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pl62h"
Apr 16 19:18:37.298619 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:37.298324 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert\") pod \"ingress-canary-dp4cw\" (UID: \"d82ed6e1-d7aa-4d47-bcb6-f4539431d578\") " pod="openshift-ingress-canary/ingress-canary-dp4cw"
Apr 16 19:18:37.298619 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:37.298365 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls\") pod \"dns-default-8xgc6\" (UID: \"513c0bb7-f253-4f0d-bc14-1d473d560c39\") " pod="openshift-dns/dns-default-8xgc6"
Apr 16 19:18:37.298619 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:37.298476 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:18:37.298619 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:37.298523 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls podName:513c0bb7-f253-4f0d-bc14-1d473d560c39 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:45.298508128 +0000 UTC m=+52.793098724 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls") pod "dns-default-8xgc6" (UID: "513c0bb7-f253-4f0d-bc14-1d473d560c39") : secret "dns-default-metrics-tls" not found
Apr 16 19:18:37.298619 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:37.298522 2582 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 19:18:37.298619 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:37.298562 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert podName:04310661-51ad-4a3b-86cf-b9a2a0d1dda1 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:39.298552381 +0000 UTC m=+46.793142982 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-pl62h" (UID: "04310661-51ad-4a3b-86cf-b9a2a0d1dda1") : secret "networking-console-plugin-cert" not found
Apr 16 19:18:37.298619 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:37.298597 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle podName:260a217a-9aa3-43e3-9715-9255e451adff nodeName:}" failed. No retries permitted until 2026-04-16 19:18:39.298581901 +0000 UTC m=+46.793172496 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle") pod "router-default-5f4947fcd8-gffmg" (UID: "260a217a-9aa3-43e3-9715-9255e451adff") : configmap references non-existent config key: service-ca.crt
Apr 16 19:18:37.298619 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:37.298619 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:18:37.299457 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:37.298653 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert podName:d82ed6e1-d7aa-4d47-bcb6-f4539431d578 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:45.298643207 +0000 UTC m=+52.793233805 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert") pod "ingress-canary-dp4cw" (UID: "d82ed6e1-d7aa-4d47-bcb6-f4539431d578") : secret "canary-serving-cert" not found
Apr 16 19:18:39.117692 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:39.117658 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8hf4w\" (UID: \"d22f5821-5636-4a50-8b36-4c7eed507c2d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w"
Apr 16 19:18:39.118059 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:39.117811 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 19:18:39.118059 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:39.117880 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls podName:d22f5821-5636-4a50-8b36-4c7eed507c2d nodeName:}" failed. No retries permitted until 2026-04-16 19:18:43.11786033 +0000 UTC m=+50.612450929 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8hf4w" (UID: "d22f5821-5636-4a50-8b36-4c7eed507c2d") : secret "samples-operator-tls" not found
Apr 16 19:18:39.219006 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:39.218969 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9sbws\" (UID: \"448c73ab-a4f5-4a5c-8143-1deb13253eec\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws"
Apr 16 19:18:39.219215 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:39.219130 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 19:18:39.219274 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:39.219221 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls podName:448c73ab-a4f5-4a5c-8143-1deb13253eec nodeName:}" failed. No retries permitted until 2026-04-16 19:18:43.219202865 +0000 UTC m=+50.713793460 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9sbws" (UID: "448c73ab-a4f5-4a5c-8143-1deb13253eec") : secret "cluster-monitoring-operator-tls" not found
Apr 16 19:18:39.320375 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:39.320327 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg"
Apr 16 19:18:39.320531 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:39.320402 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg"
Apr 16 19:18:39.320531 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:39.320464 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pl62h\" (UID: \"04310661-51ad-4a3b-86cf-b9a2a0d1dda1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pl62h"
Apr 16 19:18:39.320633 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:39.320618 2582 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 19:18:39.320696 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:39.320674 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert podName:04310661-51ad-4a3b-86cf-b9a2a0d1dda1 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:43.320661127 +0000 UTC m=+50.815251721 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-pl62h" (UID: "04310661-51ad-4a3b-86cf-b9a2a0d1dda1") : secret "networking-console-plugin-cert" not found
Apr 16 19:18:39.320814 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:39.320760 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 19:18:39.320814 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:39.320801 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle podName:260a217a-9aa3-43e3-9715-9255e451adff nodeName:}" failed. No retries permitted until 2026-04-16 19:18:43.320786788 +0000 UTC m=+50.815377383 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle") pod "router-default-5f4947fcd8-gffmg" (UID: "260a217a-9aa3-43e3-9715-9255e451adff") : configmap references non-existent config key: service-ca.crt
Apr 16 19:18:39.320901 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:39.320821 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs podName:260a217a-9aa3-43e3-9715-9255e451adff nodeName:}" failed. No retries permitted until 2026-04-16 19:18:43.32081485 +0000 UTC m=+50.815405445 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs") pod "router-default-5f4947fcd8-gffmg" (UID: "260a217a-9aa3-43e3-9715-9255e451adff") : secret "router-metrics-certs-default" not found
Apr 16 19:18:40.935229 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:40.935198 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret\") pod \"global-pull-secret-syncer-fg84m\" (UID: \"27412d9f-8c9a-4ed3-92cb-4002bafb01fa\") " pod="kube-system/global-pull-secret-syncer-fg84m"
Apr 16 19:18:40.937716 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:40.937688 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/27412d9f-8c9a-4ed3-92cb-4002bafb01fa-original-pull-secret\") pod \"global-pull-secret-syncer-fg84m\" (UID: \"27412d9f-8c9a-4ed3-92cb-4002bafb01fa\") " pod="kube-system/global-pull-secret-syncer-fg84m"
Apr 16 19:18:41.025466 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:41.025431 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fg84m"
Apr 16 19:18:42.615279 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:42.615254 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fg84m"]
Apr 16 19:18:42.734866 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:18:42.734829 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27412d9f_8c9a_4ed3_92cb_4002bafb01fa.slice/crio-3478775cd7b86da4ce2efc70b758d84f93c5843b7ccbaf29282c94812da4806c WatchSource:0}: Error finding container 3478775cd7b86da4ce2efc70b758d84f93c5843b7ccbaf29282c94812da4806c: Status 404 returned error can't find the container with id 3478775cd7b86da4ce2efc70b758d84f93c5843b7ccbaf29282c94812da4806c
Apr 16 19:18:43.155519 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.155481 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8hf4w\" (UID: \"d22f5821-5636-4a50-8b36-4c7eed507c2d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w"
Apr 16 19:18:43.155719 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:43.155593 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 19:18:43.155719 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:43.155667 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls podName:d22f5821-5636-4a50-8b36-4c7eed507c2d nodeName:}" failed. No retries permitted until 2026-04-16 19:18:51.155646614 +0000 UTC m=+58.650237216 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8hf4w" (UID: "d22f5821-5636-4a50-8b36-4c7eed507c2d") : secret "samples-operator-tls" not found
Apr 16 19:18:43.257036 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.256997 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9sbws\" (UID: \"448c73ab-a4f5-4a5c-8143-1deb13253eec\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws"
Apr 16 19:18:43.257263 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:43.257179 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 19:18:43.257263 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:43.257239 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls podName:448c73ab-a4f5-4a5c-8143-1deb13253eec nodeName:}" failed. No retries permitted until 2026-04-16 19:18:51.25722035 +0000 UTC m=+58.751810951 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9sbws" (UID: "448c73ab-a4f5-4a5c-8143-1deb13253eec") : secret "cluster-monitoring-operator-tls" not found
Apr 16 19:18:43.293196 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.293125 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-67nzv" event={"ID":"fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae","Type":"ContainerStarted","Data":"815fb8daf0f5bb2930fc2e20db17911bbe7b946e6e99534ba84439e85dafc1de"}
Apr 16 19:18:43.294927 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.294897 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-ksczz" event={"ID":"907cddc0-db0e-4159-aa65-8778fb6d6a30","Type":"ContainerStarted","Data":"d0693c9ed3941455d440d482cd9d21cb9b7f2e934ad48241fe3ace84cbc86702"}
Apr 16 19:18:43.296580 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.296556 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/0.log"
Apr 16 19:18:43.296686 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.296589 2582 generic.go:358] "Generic (PLEG): container finished" podID="82d1300a-6831-4de2-a99c-90a2b28f9a33" containerID="ccb1c7146a2ae31117f65f17c75e654bdcb477deb6b3dd614b4c77825ac043f8" exitCode=255
Apr 16 19:18:43.296849 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.296831 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" event={"ID":"82d1300a-6831-4de2-a99c-90a2b28f9a33","Type":"ContainerDied","Data":"ccb1c7146a2ae31117f65f17c75e654bdcb477deb6b3dd614b4c77825ac043f8"}
Apr 16 19:18:43.297066 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.297033 2582 scope.go:117] "RemoveContainer" containerID="ccb1c7146a2ae31117f65f17c75e654bdcb477deb6b3dd614b4c77825ac043f8"
Apr 16 19:18:43.298560 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.298525 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dtbpz" event={"ID":"c83523cc-27c2-4924-9113-67ff5b311e42","Type":"ContainerStarted","Data":"673e375e7c29260bd1f132adb45ea21ef1b3d385bd090b1901983e293ddab8de"}
Apr 16 19:18:43.300061 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.299993 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9hfwk" event={"ID":"e8903e24-441a-4973-8c06-9e393bd73cd7","Type":"ContainerStarted","Data":"15041386b5d71b1c9aa813686d9845b2076744ceb5aae14975d614b4720e5f96"}
Apr 16 19:18:43.301485 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.301457 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f8xvx" event={"ID":"c4c4f849-8b71-4e5a-a7d6-079b83a72af1","Type":"ContainerStarted","Data":"6654a8c046fe9b0ef827a2f66898939a543afbdef0401163614504adf2bc4609"}
Apr 16 19:18:43.302547 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.302524 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fg84m" event={"ID":"27412d9f-8c9a-4ed3-92cb-4002bafb01fa","Type":"ContainerStarted","Data":"3478775cd7b86da4ce2efc70b758d84f93c5843b7ccbaf29282c94812da4806c"}
Apr 16 19:18:43.311123 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.311072 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-67nzv" podStartSLOduration=1.815442518 podStartE2EDuration="8.311057605s" podCreationTimestamp="2026-04-16 19:18:35 +0000 UTC" firstStartedPulling="2026-04-16 19:18:35.9987269 +0000 UTC m=+43.493317495" lastFinishedPulling="2026-04-16 19:18:42.494341983 +0000 UTC m=+49.988932582" observedRunningTime="2026-04-16 19:18:43.309109198 +0000 UTC m=+50.803699816" watchObservedRunningTime="2026-04-16 19:18:43.311057605 +0000 UTC m=+50.805648200"
Apr 16 19:18:43.330265 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.330214 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-ksczz" podStartSLOduration=1.702736373 podStartE2EDuration="8.33019719s" podCreationTimestamp="2026-04-16 19:18:35 +0000 UTC" firstStartedPulling="2026-04-16 19:18:35.866707078 +0000 UTC m=+43.361297676" lastFinishedPulling="2026-04-16 19:18:42.494167885 +0000 UTC m=+49.988758493" observedRunningTime="2026-04-16 19:18:43.329224322 +0000 UTC m=+50.823814938" watchObservedRunningTime="2026-04-16 19:18:43.33019719 +0000 UTC m=+50.824787808"
Apr 16 19:18:43.358532 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.358505 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg"
Apr 16 19:18:43.359844 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.359823 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg"
Apr 16 19:18:43.360366 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:43.359002 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 19:18:43.360503 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:43.360048 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle podName:260a217a-9aa3-43e3-9715-9255e451adff nodeName:}" failed. No retries permitted until 2026-04-16 19:18:51.360028264 +0000 UTC m=+58.854618864 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle") pod "router-default-5f4947fcd8-gffmg" (UID: "260a217a-9aa3-43e3-9715-9255e451adff") : configmap references non-existent config key: service-ca.crt
Apr 16 19:18:43.360617 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:43.360609 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs podName:260a217a-9aa3-43e3-9715-9255e451adff nodeName:}" failed. No retries permitted until 2026-04-16 19:18:51.360590278 +0000 UTC m=+58.855180887 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs") pod "router-default-5f4947fcd8-gffmg" (UID: "260a217a-9aa3-43e3-9715-9255e451adff") : secret "router-metrics-certs-default" not found
Apr 16 19:18:43.360847 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.360833 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pl62h\" (UID: \"04310661-51ad-4a3b-86cf-b9a2a0d1dda1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pl62h"
Apr 16 19:18:43.361006 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:43.360996 2582 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 19:18:43.361113 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:43.361105 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert podName:04310661-51ad-4a3b-86cf-b9a2a0d1dda1 nodeName:}" failed. No retries permitted until 2026-04-16 19:18:51.361092083 +0000 UTC m=+58.855682681 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-pl62h" (UID: "04310661-51ad-4a3b-86cf-b9a2a0d1dda1") : secret "networking-console-plugin-cert" not found
Apr 16 19:18:43.391099 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.391003 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-f8xvx" podStartSLOduration=1.725629322 podStartE2EDuration="8.390982984s" podCreationTimestamp="2026-04-16 19:18:35 +0000 UTC" firstStartedPulling="2026-04-16 19:18:35.826523568 +0000 UTC m=+43.321114176" lastFinishedPulling="2026-04-16 19:18:42.491877226 +0000 UTC m=+49.986467838" observedRunningTime="2026-04-16 19:18:43.39045552 +0000 UTC m=+50.885046138" watchObservedRunningTime="2026-04-16 19:18:43.390982984 +0000 UTC m=+50.885573605"
Apr 16 19:18:43.391859 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.391818 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-9hfwk" podStartSLOduration=1.6294097619999999 podStartE2EDuration="8.391808304s" podCreationTimestamp="2026-04-16 19:18:35 +0000 UTC" firstStartedPulling="2026-04-16 19:18:36.004707698 +0000 UTC m=+43.499298292" lastFinishedPulling="2026-04-16 19:18:42.767106225 +0000 UTC m=+50.261696834" observedRunningTime="2026-04-16 19:18:43.370890879 +0000 UTC m=+50.865481497" watchObservedRunningTime="2026-04-16 19:18:43.391808304 +0000 UTC m=+50.886398941"
Apr 16 19:18:43.412862 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:43.412650 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dtbpz" podStartSLOduration=1.8879438149999999 podStartE2EDuration="8.412634743s" podCreationTimestamp="2026-04-16 19:18:35 +0000 UTC" firstStartedPulling="2026-04-16 19:18:35.972959435 +0000 UTC m=+43.467550031" lastFinishedPulling="2026-04-16 19:18:42.497650345 +0000 UTC m=+49.992240959" observedRunningTime="2026-04-16 19:18:43.411498086 +0000 UTC m=+50.906088704" watchObservedRunningTime="2026-04-16 19:18:43.412634743 +0000 UTC m=+50.907225376"
Apr 16 19:18:44.306998 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:44.306970 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/1.log"
Apr 16 19:18:44.307575 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:44.307437 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/0.log"
Apr 16 19:18:44.307575 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:44.307481 2582 generic.go:358] "Generic (PLEG): container finished" podID="82d1300a-6831-4de2-a99c-90a2b28f9a33" containerID="a91b00a45f2656ad8bdbf110154948ee903ce1119f78f5056bbae24c3d1c5869" exitCode=255
Apr 16 19:18:44.307688 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:44.307606 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" event={"ID":"82d1300a-6831-4de2-a99c-90a2b28f9a33","Type":"ContainerDied","Data":"a91b00a45f2656ad8bdbf110154948ee903ce1119f78f5056bbae24c3d1c5869"}
Apr 16 19:18:44.307688 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:44.307645 2582 scope.go:117] "RemoveContainer" containerID="ccb1c7146a2ae31117f65f17c75e654bdcb477deb6b3dd614b4c77825ac043f8"
Apr 16 19:18:44.308015 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:44.307995 2582 scope.go:117] "RemoveContainer" containerID="a91b00a45f2656ad8bdbf110154948ee903ce1119f78f5056bbae24c3d1c5869"
Apr 16 19:18:44.308260 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:44.308233 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-kqmw8_openshift-console-operator(82d1300a-6831-4de2-a99c-90a2b28f9a33)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" podUID="82d1300a-6831-4de2-a99c-90a2b28f9a33"
Apr 16 19:18:45.281036 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:45.280994 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr"
Apr 16 19:18:45.281241 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:45.281182 2582 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:18:45.281241 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:45.281204 2582 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-86db68fc76-grwtr: secret "image-registry-tls" not found
Apr 16 19:18:45.281348 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:45.281286 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls podName:fff8270c-4771-4655-8abd-7341281f3173 nodeName:}" failed. No retries permitted until 2026-04-16 19:19:01.281265313 +0000 UTC m=+68.775855911 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls") pod "image-registry-86db68fc76-grwtr" (UID: "fff8270c-4771-4655-8abd-7341281f3173") : secret "image-registry-tls" not found
Apr 16 19:18:45.311603 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:45.311534 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/1.log"
Apr 16 19:18:45.312026 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:45.311965 2582 scope.go:117] "RemoveContainer" containerID="a91b00a45f2656ad8bdbf110154948ee903ce1119f78f5056bbae24c3d1c5869"
Apr 16 19:18:45.312216 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:45.312190 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-kqmw8_openshift-console-operator(82d1300a-6831-4de2-a99c-90a2b28f9a33)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" podUID="82d1300a-6831-4de2-a99c-90a2b28f9a33"
Apr 16 19:18:45.382726 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:45.382685 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert\") pod \"ingress-canary-dp4cw\" (UID: \"d82ed6e1-d7aa-4d47-bcb6-f4539431d578\") " pod="openshift-ingress-canary/ingress-canary-dp4cw"
Apr 16 19:18:45.382944 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:45.382758 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls\") pod \"dns-default-8xgc6\" (UID: \"513c0bb7-f253-4f0d-bc14-1d473d560c39\") " pod="openshift-dns/dns-default-8xgc6"
Apr 16 19:18:45.382944 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:45.382828 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:18:45.382944 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:45.382860 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:18:45.382944 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:45.382898 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert podName:d82ed6e1-d7aa-4d47-bcb6-f4539431d578 nodeName:}" failed. No retries permitted until 2026-04-16 19:19:01.382878935 +0000 UTC m=+68.877469530 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert") pod "ingress-canary-dp4cw" (UID: "d82ed6e1-d7aa-4d47-bcb6-f4539431d578") : secret "canary-serving-cert" not found
Apr 16 19:18:45.382944 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:45.382920 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls podName:513c0bb7-f253-4f0d-bc14-1d473d560c39 nodeName:}" failed. No retries permitted until 2026-04-16 19:19:01.382908861 +0000 UTC m=+68.877499462 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls") pod "dns-default-8xgc6" (UID: "513c0bb7-f253-4f0d-bc14-1d473d560c39") : secret "dns-default-metrics-tls" not found
Apr 16 19:18:45.396334 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:45.396304 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bmngp_97d2de57-ec6a-4f59-985c-24aea83be3fd/dns-node-resolver/0.log"
Apr 16 19:18:45.747212 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:45.747141 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8"
Apr 16 19:18:45.747368 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:45.747221 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8"
Apr 16 19:18:46.196439 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:46.196355 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mgz4l_7caf9f3a-4884-4e15-b154-262d7a60b314/node-ca/0.log"
Apr 16 19:18:46.314896 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:46.314863 2582 scope.go:117] "RemoveContainer" containerID="a91b00a45f2656ad8bdbf110154948ee903ce1119f78f5056bbae24c3d1c5869"
Apr 16 19:18:46.315347 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:46.315104 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-kqmw8_openshift-console-operator(82d1300a-6831-4de2-a99c-90a2b28f9a33)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" podUID="82d1300a-6831-4de2-a99c-90a2b28f9a33"
Apr 16 19:18:47.318380 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:47.318346 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fg84m" event={"ID":"27412d9f-8c9a-4ed3-92cb-4002bafb01fa","Type":"ContainerStarted","Data":"c44152604e485cf47fb2212536c82d8643342a3a64040e66fbdbcee23c5d0751"}
Apr 16 19:18:51.233105 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:51.233073 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8hf4w\" (UID: \"d22f5821-5636-4a50-8b36-4c7eed507c2d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w"
Apr 16 19:18:51.233552 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:51.233260 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 19:18:51.233552 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:51.233321 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls podName:d22f5821-5636-4a50-8b36-4c7eed507c2d nodeName:}" failed. No retries permitted until 2026-04-16 19:19:07.233304937 +0000 UTC m=+74.727895555 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-8hf4w" (UID: "d22f5821-5636-4a50-8b36-4c7eed507c2d") : secret "samples-operator-tls" not found
Apr 16 19:18:51.334164 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:51.334126 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9sbws\" (UID: \"448c73ab-a4f5-4a5c-8143-1deb13253eec\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws"
Apr 16 19:18:51.334329 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:51.334265 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 19:18:51.334329 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:51.334322 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls podName:448c73ab-a4f5-4a5c-8143-1deb13253eec nodeName:}" failed. No retries permitted until 2026-04-16 19:19:07.334307476 +0000 UTC m=+74.828898071 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-9sbws" (UID: "448c73ab-a4f5-4a5c-8143-1deb13253eec") : secret "cluster-monitoring-operator-tls" not found
Apr 16 19:18:51.435376 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:51.435341 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg"
Apr 16 19:18:51.435528 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:51.435391 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg"
Apr 16 19:18:51.435528 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:51.435443 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pl62h\" (UID: \"04310661-51ad-4a3b-86cf-b9a2a0d1dda1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pl62h"
Apr 16 19:18:51.435528 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:51.435494 2582 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 19:18:51.435655 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:51.435559 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs podName:260a217a-9aa3-43e3-9715-9255e451adff nodeName:}" failed. No retries permitted until 2026-04-16 19:19:07.435543377 +0000 UTC m=+74.930133972 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs") pod "router-default-5f4947fcd8-gffmg" (UID: "260a217a-9aa3-43e3-9715-9255e451adff") : secret "router-metrics-certs-default" not found
Apr 16 19:18:51.435655 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:51.435596 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle podName:260a217a-9aa3-43e3-9715-9255e451adff nodeName:}" failed. No retries permitted until 2026-04-16 19:19:07.435573438 +0000 UTC m=+74.930164035 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle") pod "router-default-5f4947fcd8-gffmg" (UID: "260a217a-9aa3-43e3-9715-9255e451adff") : configmap references non-existent config key: service-ca.crt
Apr 16 19:18:51.435655 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:51.435612 2582 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 19:18:51.435815 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:51.435695 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert podName:04310661-51ad-4a3b-86cf-b9a2a0d1dda1 nodeName:}" failed. No retries permitted until 2026-04-16 19:19:07.435684516 +0000 UTC m=+74.930275111 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-pl62h" (UID: "04310661-51ad-4a3b-86cf-b9a2a0d1dda1") : secret "networking-console-plugin-cert" not found
Apr 16 19:18:54.247927 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:54.247900 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4xbfq"
Apr 16 19:18:54.276721 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:54.276671 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-fg84m" podStartSLOduration=40.921751386 podStartE2EDuration="45.276659199s" podCreationTimestamp="2026-04-16 19:18:09 +0000 UTC" firstStartedPulling="2026-04-16 19:18:42.75320651 +0000 UTC m=+50.247797106" lastFinishedPulling="2026-04-16 19:18:47.108114321 +0000 UTC m=+54.602704919" observedRunningTime="2026-04-16 19:18:47.338746464 +0000 UTC m=+54.833337082" watchObservedRunningTime="2026-04-16 19:18:54.276659199 +0000 UTC m=+61.771249815"
Apr 16 19:18:58.799720 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:58.799683 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs\") pod \"network-metrics-daemon-lvp6d\" (UID: \"0fa55098-1c0e-4cf5-963c-602d47a411cc\") " pod="openshift-multus/network-metrics-daemon-lvp6d"
Apr 16 19:18:58.802253 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:58.802233 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 19:18:58.810461 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:58.810444 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 19:18:58.810538 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:18:58.810497 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs podName:0fa55098-1c0e-4cf5-963c-602d47a411cc nodeName:}" failed. No retries permitted until 2026-04-16 19:20:02.810483074 +0000 UTC m=+130.305073669 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs") pod "network-metrics-daemon-lvp6d" (UID: "0fa55098-1c0e-4cf5-963c-602d47a411cc") : secret "metrics-daemon-secret" not found
Apr 16 19:18:58.900207 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:58.900178 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkmgw\" (UniqueName: \"kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw\") pod \"network-check-target-kk8zc\" (UID: \"0ef22a96-6828-4636-8255-3aa3eaae036d\") " pod="openshift-network-diagnostics/network-check-target-kk8zc"
Apr 16 19:18:58.902522 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:58.902505 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkmgw\" (UniqueName: \"kubernetes.io/projected/0ef22a96-6828-4636-8255-3aa3eaae036d-kube-api-access-qkmgw\") pod \"network-check-target-kk8zc\" (UID: \"0ef22a96-6828-4636-8255-3aa3eaae036d\") " pod="openshift-network-diagnostics/network-check-target-kk8zc"
Apr 16 19:18:59.023526 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:59.023497 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-s2294\""
Apr 16 19:18:59.031454 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:59.031432 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-kk8zc"
Apr 16 19:18:59.141426 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:59.141396 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-kk8zc"]
Apr 16 19:18:59.145132 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:18:59.145104 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ef22a96_6828_4636_8255_3aa3eaae036d.slice/crio-f82082085b657beac05c77af001d2105d8f72bbea86824fadab7a1c97293325e WatchSource:0}: Error finding container f82082085b657beac05c77af001d2105d8f72bbea86824fadab7a1c97293325e: Status 404 returned error can't find the container with id f82082085b657beac05c77af001d2105d8f72bbea86824fadab7a1c97293325e
Apr 16 19:18:59.350044 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:59.349948 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kk8zc" event={"ID":"0ef22a96-6828-4636-8255-3aa3eaae036d","Type":"ContainerStarted","Data":"c2807fa4542d2fd92ba0e1e39e842fd4aea57beef852064176a1fdc02bc280e5"}
Apr 16 19:18:59.350044 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:59.349985 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-kk8zc" event={"ID":"0ef22a96-6828-4636-8255-3aa3eaae036d","Type":"ContainerStarted","Data":"f82082085b657beac05c77af001d2105d8f72bbea86824fadab7a1c97293325e"}
Apr 16 19:18:59.350288 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:59.350087 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-kk8zc"
Apr 16 19:18:59.367957 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:18:59.367911 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-kk8zc" podStartSLOduration=66.367899737 podStartE2EDuration="1m6.367899737s" podCreationTimestamp="2026-04-16 19:17:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:18:59.367487115 +0000 UTC m=+66.862077736" watchObservedRunningTime="2026-04-16 19:18:59.367899737 +0000 UTC m=+66.862490391"
Apr 16 19:19:00.110364 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:00.110335 2582 scope.go:117] "RemoveContainer" containerID="a91b00a45f2656ad8bdbf110154948ee903ce1119f78f5056bbae24c3d1c5869"
Apr 16 19:19:00.354124 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:00.354096 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/2.log"
Apr 16 19:19:00.354483 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:00.354466 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/1.log"
Apr 16 19:19:00.354559 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:00.354499 2582 generic.go:358] "Generic (PLEG): container finished" podID="82d1300a-6831-4de2-a99c-90a2b28f9a33" containerID="9e5adaaff50339a33ad8742e2e96fe9cab2b7b6e972eac2639c3f92347aa1ac4" exitCode=255
Apr 16 19:19:00.354612 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:00.354585 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" event={"ID":"82d1300a-6831-4de2-a99c-90a2b28f9a33","Type":"ContainerDied","Data":"9e5adaaff50339a33ad8742e2e96fe9cab2b7b6e972eac2639c3f92347aa1ac4"}
Apr 16 19:19:00.354663 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:00.354635 2582 scope.go:117] "RemoveContainer" containerID="a91b00a45f2656ad8bdbf110154948ee903ce1119f78f5056bbae24c3d1c5869"
Apr 16 19:19:00.354961 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:00.354937 2582 scope.go:117] "RemoveContainer" containerID="9e5adaaff50339a33ad8742e2e96fe9cab2b7b6e972eac2639c3f92347aa1ac4"
Apr 16 19:19:00.355114 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:19:00.355094 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-kqmw8_openshift-console-operator(82d1300a-6831-4de2-a99c-90a2b28f9a33)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" podUID="82d1300a-6831-4de2-a99c-90a2b28f9a33"
Apr 16 19:19:01.320585 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:01.320551 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr"
Apr 16 19:19:01.322812 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:01.322786 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls\") pod \"image-registry-86db68fc76-grwtr\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " pod="openshift-image-registry/image-registry-86db68fc76-grwtr"
Apr 16 19:19:01.358536 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:01.358510 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/2.log"
Apr 16 19:19:01.421752 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:01.421720 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert\") pod \"ingress-canary-dp4cw\" (UID: \"d82ed6e1-d7aa-4d47-bcb6-f4539431d578\") " pod="openshift-ingress-canary/ingress-canary-dp4cw"
Apr 16 19:19:01.421894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:01.421765 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls\") pod \"dns-default-8xgc6\" (UID: \"513c0bb7-f253-4f0d-bc14-1d473d560c39\") " pod="openshift-dns/dns-default-8xgc6"
Apr 16 19:19:01.424139 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:01.424115 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d82ed6e1-d7aa-4d47-bcb6-f4539431d578-cert\") pod \"ingress-canary-dp4cw\" (UID: \"d82ed6e1-d7aa-4d47-bcb6-f4539431d578\") " pod="openshift-ingress-canary/ingress-canary-dp4cw"
Apr 16 19:19:01.424139 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:01.424136 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/513c0bb7-f253-4f0d-bc14-1d473d560c39-metrics-tls\") pod \"dns-default-8xgc6\" (UID: \"513c0bb7-f253-4f0d-bc14-1d473d560c39\") " pod="openshift-dns/dns-default-8xgc6"
Apr 16 19:19:01.549049 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:01.549019 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-bjmd7\""
Apr 16 19:19:01.556434 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:01.556407 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86db68fc76-grwtr"
Apr 16 19:19:01.579762 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:01.579680 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-n84tw\""
Apr 16 19:19:01.587595 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:01.587566 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8xgc6"
Apr 16 19:19:01.602221 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:01.602192 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9bx8r\""
Apr 16 19:19:01.609621 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:01.609587 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dp4cw"
Apr 16 19:19:01.684861 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:01.684829 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86db68fc76-grwtr"]
Apr 16 19:19:01.700900 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:19:01.700589 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfff8270c_4771_4655_8abd_7341281f3173.slice/crio-5296eeb4f51446133b1773e90190ef4bff6012fc844f0903394035f68e372dd5 WatchSource:0}: Error finding container 5296eeb4f51446133b1773e90190ef4bff6012fc844f0903394035f68e372dd5: Status 404 returned error can't find the container with id 5296eeb4f51446133b1773e90190ef4bff6012fc844f0903394035f68e372dd5
Apr 16 19:19:01.750195 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:01.750136 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8xgc6"]
Apr 16 19:19:01.753310 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:19:01.753284 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod513c0bb7_f253_4f0d_bc14_1d473d560c39.slice/crio-fb1b9320ad39d866cc66642aac9014a658dce59fae4b7ad00e24199c9cc70f54 WatchSource:0}: Error finding container fb1b9320ad39d866cc66642aac9014a658dce59fae4b7ad00e24199c9cc70f54: Status 404 returned error can't find the container with id fb1b9320ad39d866cc66642aac9014a658dce59fae4b7ad00e24199c9cc70f54
Apr 16 19:19:01.766651 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:01.766628 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dp4cw"]
Apr 16 19:19:01.769618 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:19:01.769590 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd82ed6e1_d7aa_4d47_bcb6_f4539431d578.slice/crio-daca67e5e341349a8c1c7748dcf1bd0fbea16e8d84b1259b069b399c385f50b5 WatchSource:0}: Error finding container daca67e5e341349a8c1c7748dcf1bd0fbea16e8d84b1259b069b399c385f50b5: Status 404 returned error can't find the container with id daca67e5e341349a8c1c7748dcf1bd0fbea16e8d84b1259b069b399c385f50b5
Apr 16 19:19:02.363188 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:02.363132 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8xgc6" event={"ID":"513c0bb7-f253-4f0d-bc14-1d473d560c39","Type":"ContainerStarted","Data":"fb1b9320ad39d866cc66642aac9014a658dce59fae4b7ad00e24199c9cc70f54"}
Apr 16 19:19:02.364433 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:02.364390 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dp4cw" event={"ID":"d82ed6e1-d7aa-4d47-bcb6-f4539431d578","Type":"ContainerStarted","Data":"daca67e5e341349a8c1c7748dcf1bd0fbea16e8d84b1259b069b399c385f50b5"}
Apr 16 19:19:02.366365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:02.366328 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86db68fc76-grwtr" event={"ID":"fff8270c-4771-4655-8abd-7341281f3173","Type":"ContainerStarted","Data":"ebbad7bff376f08d17d86584071f59b54a55d83c2ddf56ee030de0826e0b90f7"}
Apr 16 19:19:02.366365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:02.366360 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86db68fc76-grwtr" event={"ID":"fff8270c-4771-4655-8abd-7341281f3173","Type":"ContainerStarted","Data":"5296eeb4f51446133b1773e90190ef4bff6012fc844f0903394035f68e372dd5"}
Apr 16 19:19:02.366546 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:02.366495 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-86db68fc76-grwtr"
Apr 16 19:19:02.388353 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:02.388300 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-86db68fc76-grwtr" podStartSLOduration=69.388282283 podStartE2EDuration="1m9.388282283s" podCreationTimestamp="2026-04-16 19:17:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:19:02.387426885 +0000 UTC m=+69.882017506" watchObservedRunningTime="2026-04-16 19:19:02.388282283 +0000 UTC m=+69.882872903"
Apr 16 19:19:04.373608 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:04.373516 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dp4cw" event={"ID":"d82ed6e1-d7aa-4d47-bcb6-f4539431d578","Type":"ContainerStarted","Data":"d26d036aee7631f460b0698db8054f59f89299f8f983d2b30c17e29288b33f3b"}
Apr 16 19:19:04.375121 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:04.375097 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8xgc6" event={"ID":"513c0bb7-f253-4f0d-bc14-1d473d560c39","Type":"ContainerStarted","Data":"e7b3cfe1a4c753705b655f4713a4e966d95ddb174f63f74e08181439459e1fd8"}
Apr 16 19:19:04.375259 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:04.375126 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8xgc6" event={"ID":"513c0bb7-f253-4f0d-bc14-1d473d560c39","Type":"ContainerStarted","Data":"287f946005ed585c57f3212ab5a4833e88daa209f1ab9e0ae4f918d45717a8ed"}
Apr 16 19:19:04.375259 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:04.375196 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-8xgc6"
Apr 16 19:19:04.390400 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:04.390359 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dp4cw" podStartSLOduration=33.302843834 podStartE2EDuration="35.39034659s" podCreationTimestamp="2026-04-16 19:18:29 +0000 UTC" firstStartedPulling="2026-04-16 19:19:01.771517896 +0000 UTC m=+69.266108492" lastFinishedPulling="2026-04-16 19:19:03.859020638 +0000 UTC m=+71.353611248" observedRunningTime="2026-04-16 19:19:04.389357132 +0000 UTC m=+71.883947749" watchObservedRunningTime="2026-04-16 19:19:04.39034659 +0000 UTC m=+71.884937201"
Apr 16 19:19:04.407020 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:04.406984 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8xgc6" podStartSLOduration=33.308360062 podStartE2EDuration="35.406972409s" podCreationTimestamp="2026-04-16 19:18:29 +0000 UTC" firstStartedPulling="2026-04-16 19:19:01.755214314 +0000 UTC m=+69.249804914" lastFinishedPulling="2026-04-16 19:19:03.853826665 +0000 UTC m=+71.348417261" observedRunningTime="2026-04-16 19:19:04.405929806 +0000 UTC m=+71.900520423" watchObservedRunningTime="2026-04-16 19:19:04.406972409 +0000 UTC m=+71.901563025"
Apr 16 19:19:05.747590 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:05.747461 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8"
Apr 16 19:19:05.747590 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:05.747494 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8"
Apr 16 19:19:05.748076 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:05.747902 2582 scope.go:117] "RemoveContainer" containerID="9e5adaaff50339a33ad8742e2e96fe9cab2b7b6e972eac2639c3f92347aa1ac4"
Apr 16 19:19:05.748175 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:19:05.748125 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-kqmw8_openshift-console-operator(82d1300a-6831-4de2-a99c-90a2b28f9a33)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" podUID="82d1300a-6831-4de2-a99c-90a2b28f9a33"
Apr 16 19:19:07.272860 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.272824 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8hf4w\" (UID: \"d22f5821-5636-4a50-8b36-4c7eed507c2d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w"
Apr 16 19:19:07.275185 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.275164 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d22f5821-5636-4a50-8b36-4c7eed507c2d-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-8hf4w\" (UID: \"d22f5821-5636-4a50-8b36-4c7eed507c2d\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w"
Apr 16 19:19:07.373665 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.373613 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9sbws\" (UID: \"448c73ab-a4f5-4a5c-8143-1deb13253eec\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws" Apr 16 19:19:07.375938 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.375910 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/448c73ab-a4f5-4a5c-8143-1deb13253eec-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-9sbws\" (UID: \"448c73ab-a4f5-4a5c-8143-1deb13253eec\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws" Apr 16 19:19:07.438526 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.438498 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-x2fps\"" Apr 16 19:19:07.446438 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.446417 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w" Apr 16 19:19:07.474540 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.474506 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:19:07.474665 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.474553 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:19:07.474665 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.474593 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pl62h\" (UID: \"04310661-51ad-4a3b-86cf-b9a2a0d1dda1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pl62h" Apr 16 19:19:07.475266 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.475239 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/260a217a-9aa3-43e3-9715-9255e451adff-service-ca-bundle\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:19:07.476908 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.476876 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/04310661-51ad-4a3b-86cf-b9a2a0d1dda1-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-pl62h\" (UID: \"04310661-51ad-4a3b-86cf-b9a2a0d1dda1\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-pl62h" Apr 16 19:19:07.477311 ip-10-0-128-123 kubenswrapper[2582]: 
I0416 19:19:07.477296 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/260a217a-9aa3-43e3-9715-9255e451adff-metrics-certs\") pod \"router-default-5f4947fcd8-gffmg\" (UID: \"260a217a-9aa3-43e3-9715-9255e451adff\") " pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:19:07.531084 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.531023 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-4299x\"" Apr 16 19:19:07.539186 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.539164 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws" Apr 16 19:19:07.565738 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.565710 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w"] Apr 16 19:19:07.607527 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.607500 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-ktlw7\"" Apr 16 19:19:07.615595 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.615571 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pl62h" Apr 16 19:19:07.654756 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.654724 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws"] Apr 16 19:19:07.657652 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:19:07.657628 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod448c73ab_a4f5_4a5c_8143_1deb13253eec.slice/crio-d0f2defed90a1275777f68ac78091effd61f4045b2990e32c065f486f61c44fb WatchSource:0}: Error finding container d0f2defed90a1275777f68ac78091effd61f4045b2990e32c065f486f61c44fb: Status 404 returned error can't find the container with id d0f2defed90a1275777f68ac78091effd61f4045b2990e32c065f486f61c44fb Apr 16 19:19:07.660976 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.660957 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kfs6b\"" Apr 16 19:19:07.669239 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.669206 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:19:07.735733 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.735705 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-pl62h"] Apr 16 19:19:07.738741 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:19:07.738713 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04310661_51ad_4a3b_86cf_b9a2a0d1dda1.slice/crio-d5c47b7cc9ec0fbbf5213a02cb529b3a7de5f1e9e8064073ccadc13f9cb58790 WatchSource:0}: Error finding container d5c47b7cc9ec0fbbf5213a02cb529b3a7de5f1e9e8064073ccadc13f9cb58790: Status 404 returned error can't find the container with id d5c47b7cc9ec0fbbf5213a02cb529b3a7de5f1e9e8064073ccadc13f9cb58790 Apr 16 19:19:07.791945 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:07.791873 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-5f4947fcd8-gffmg"] Apr 16 19:19:07.795599 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:19:07.795569 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod260a217a_9aa3_43e3_9715_9255e451adff.slice/crio-58995aeaf1517e5fce8742bfcb817e61d79274da8b9adef01f6b55eb0547efc6 WatchSource:0}: Error finding container 58995aeaf1517e5fce8742bfcb817e61d79274da8b9adef01f6b55eb0547efc6: Status 404 returned error can't find the container with id 58995aeaf1517e5fce8742bfcb817e61d79274da8b9adef01f6b55eb0547efc6 Apr 16 19:19:08.388463 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:08.388421 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pl62h" event={"ID":"04310661-51ad-4a3b-86cf-b9a2a0d1dda1","Type":"ContainerStarted","Data":"d5c47b7cc9ec0fbbf5213a02cb529b3a7de5f1e9e8064073ccadc13f9cb58790"} Apr 16 19:19:08.389734 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:08.389700 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w" event={"ID":"d22f5821-5636-4a50-8b36-4c7eed507c2d","Type":"ContainerStarted","Data":"cedc87f07305eaa13bda0be7196b60fb1bdcffe2323d9808000483c74e7801ad"} Apr 16 19:19:08.391669 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:08.391643 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5f4947fcd8-gffmg" event={"ID":"260a217a-9aa3-43e3-9715-9255e451adff","Type":"ContainerStarted","Data":"aec92ef25df4705d771e8dadcd11e330e3fca41b35fd7777920578cac4633eee"} Apr 16 19:19:08.391770 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:08.391672 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5f4947fcd8-gffmg" event={"ID":"260a217a-9aa3-43e3-9715-9255e451adff","Type":"ContainerStarted","Data":"58995aeaf1517e5fce8742bfcb817e61d79274da8b9adef01f6b55eb0547efc6"} Apr 16 19:19:08.393510 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:08.393476 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws" event={"ID":"448c73ab-a4f5-4a5c-8143-1deb13253eec","Type":"ContainerStarted","Data":"d0f2defed90a1275777f68ac78091effd61f4045b2990e32c065f486f61c44fb"} Apr 16 19:19:08.412924 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:08.412499 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress/router-default-5f4947fcd8-gffmg" podStartSLOduration=33.412483867 podStartE2EDuration="33.412483867s" podCreationTimestamp="2026-04-16 19:18:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:19:08.411405164 +0000 UTC m=+75.905995787" watchObservedRunningTime="2026-04-16 19:19:08.412483867 +0000 UTC m=+75.907074486" Apr 16 19:19:08.670059 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:08.669950 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:19:08.673097 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:08.673073 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:19:09.398578 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:09.398540 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pl62h" event={"ID":"04310661-51ad-4a3b-86cf-b9a2a0d1dda1","Type":"ContainerStarted","Data":"e0cd817b208026c8c53e32b86ecf83dcb0db4289e15b8892418ff7216e036291"} Apr 16 19:19:09.399040 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:09.398765 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:19:09.400140 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:09.400113 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5f4947fcd8-gffmg" Apr 16 19:19:09.414460 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:09.414421 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-pl62h" podStartSLOduration=33.359985377 podStartE2EDuration="34.4144064s" podCreationTimestamp="2026-04-16 19:18:35 +0000 UTC" firstStartedPulling="2026-04-16 19:19:07.741076577 +0000 UTC m=+75.235667172" lastFinishedPulling="2026-04-16 19:19:08.795497598 +0000 UTC m=+76.290088195" observedRunningTime="2026-04-16 19:19:09.413396704 +0000 UTC m=+76.907987322" watchObservedRunningTime="2026-04-16 19:19:09.4144064 +0000 UTC m=+76.908997066" Apr 16 19:19:10.403410 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:10.403372 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws" event={"ID":"448c73ab-a4f5-4a5c-8143-1deb13253eec","Type":"ContainerStarted","Data":"04ca5a1dc8ba1bd2d1e7905cc57ffb2381963a7d8382f87182f5a43ebf0fb54a"} Apr 16 19:19:10.405314 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:10.405274 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w" event={"ID":"d22f5821-5636-4a50-8b36-4c7eed507c2d","Type":"ContainerStarted","Data":"84b68a5570cea85182060dab08da001d8a0e25de99becf6f55042cc5473f2e91"} Apr 16 19:19:10.405314 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:10.405312 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w" event={"ID":"d22f5821-5636-4a50-8b36-4c7eed507c2d","Type":"ContainerStarted","Data":"962b1e05e5785ed9ce2972290c2f3c85156934ec9ab36835474211b6b73bc737"} Apr 16 19:19:10.435786 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:10.435729 2582 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-9sbws" podStartSLOduration=32.879684858 podStartE2EDuration="35.435714335s" podCreationTimestamp="2026-04-16 19:18:35 +0000 UTC" firstStartedPulling="2026-04-16 19:19:07.659616794 +0000 UTC m=+75.154207392" lastFinishedPulling="2026-04-16 19:19:10.215646271 +0000 UTC m=+77.710236869" observedRunningTime="2026-04-16 19:19:10.433537589 +0000 UTC m=+77.928128206" watchObservedRunningTime="2026-04-16 19:19:10.435714335 +0000 UTC m=+77.930304952" Apr 16 19:19:10.458658 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:10.458605 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-8hf4w" podStartSLOduration=32.866225339 podStartE2EDuration="35.458587961s" podCreationTimestamp="2026-04-16 19:18:35 +0000 UTC" firstStartedPulling="2026-04-16 19:19:07.620930366 +0000 UTC m=+75.115520961" lastFinishedPulling="2026-04-16 19:19:10.213292985 +0000 UTC m=+77.707883583" observedRunningTime="2026-04-16 19:19:10.453729025 +0000 UTC m=+77.948319643" watchObservedRunningTime="2026-04-16 19:19:10.458587961 +0000 UTC m=+77.953178575" Apr 16 19:19:13.109220 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.109190 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fhd58"] Apr 16 19:19:13.135110 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.135084 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fhd58" Apr 16 19:19:13.137580 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.137553 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vl99s\"" Apr 16 19:19:13.138293 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.138272 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 19:19:13.138414 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.138300 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 19:19:13.139955 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.139932 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fhd58"] Apr 16 19:19:13.222242 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.222208 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ca94893d-31a8-4ecf-9f0d-fc52580b40f4-data-volume\") pod \"insights-runtime-extractor-fhd58\" (UID: \"ca94893d-31a8-4ecf-9f0d-fc52580b40f4\") " pod="openshift-insights/insights-runtime-extractor-fhd58" Apr 16 19:19:13.222416 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.222257 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf44n\" (UniqueName: \"kubernetes.io/projected/ca94893d-31a8-4ecf-9f0d-fc52580b40f4-kube-api-access-kf44n\") pod \"insights-runtime-extractor-fhd58\" (UID: \"ca94893d-31a8-4ecf-9f0d-fc52580b40f4\") " pod="openshift-insights/insights-runtime-extractor-fhd58" Apr 16 19:19:13.222416 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.222357 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ca94893d-31a8-4ecf-9f0d-fc52580b40f4-crio-socket\") pod \"insights-runtime-extractor-fhd58\" (UID: \"ca94893d-31a8-4ecf-9f0d-fc52580b40f4\") " pod="openshift-insights/insights-runtime-extractor-fhd58" Apr 16 19:19:13.222416 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.222399 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ca94893d-31a8-4ecf-9f0d-fc52580b40f4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fhd58\" (UID: \"ca94893d-31a8-4ecf-9f0d-fc52580b40f4\") " pod="openshift-insights/insights-runtime-extractor-fhd58" Apr 16 19:19:13.222608 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.222433 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ca94893d-31a8-4ecf-9f0d-fc52580b40f4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fhd58\" (UID: \"ca94893d-31a8-4ecf-9f0d-fc52580b40f4\") " pod="openshift-insights/insights-runtime-extractor-fhd58" Apr 16 19:19:13.322860 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.322826 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ca94893d-31a8-4ecf-9f0d-fc52580b40f4-data-volume\") pod \"insights-runtime-extractor-fhd58\" (UID: \"ca94893d-31a8-4ecf-9f0d-fc52580b40f4\") " pod="openshift-insights/insights-runtime-extractor-fhd58" Apr 16 19:19:13.322860 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.322866 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kf44n\" (UniqueName: \"kubernetes.io/projected/ca94893d-31a8-4ecf-9f0d-fc52580b40f4-kube-api-access-kf44n\") pod \"insights-runtime-extractor-fhd58\" (UID: \"ca94893d-31a8-4ecf-9f0d-fc52580b40f4\") " pod="openshift-insights/insights-runtime-extractor-fhd58" Apr 16 19:19:13.323059 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.322893 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ca94893d-31a8-4ecf-9f0d-fc52580b40f4-crio-socket\") pod \"insights-runtime-extractor-fhd58\" (UID: \"ca94893d-31a8-4ecf-9f0d-fc52580b40f4\") " pod="openshift-insights/insights-runtime-extractor-fhd58" Apr 16 19:19:13.323059 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.322918 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ca94893d-31a8-4ecf-9f0d-fc52580b40f4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fhd58\" (UID: \"ca94893d-31a8-4ecf-9f0d-fc52580b40f4\") " pod="openshift-insights/insights-runtime-extractor-fhd58" Apr 16 19:19:13.323059 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.322942 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ca94893d-31a8-4ecf-9f0d-fc52580b40f4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fhd58\" (UID: \"ca94893d-31a8-4ecf-9f0d-fc52580b40f4\") " pod="openshift-insights/insights-runtime-extractor-fhd58" Apr 16 19:19:13.323059 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.323032 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/ca94893d-31a8-4ecf-9f0d-fc52580b40f4-crio-socket\") pod \"insights-runtime-extractor-fhd58\" (UID: \"ca94893d-31a8-4ecf-9f0d-fc52580b40f4\") " pod="openshift-insights/insights-runtime-extractor-fhd58" Apr 16 19:19:13.323291 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.323269 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ca94893d-31a8-4ecf-9f0d-fc52580b40f4-data-volume\") pod \"insights-runtime-extractor-fhd58\" (UID: \"ca94893d-31a8-4ecf-9f0d-fc52580b40f4\") " pod="openshift-insights/insights-runtime-extractor-fhd58" Apr 16 19:19:13.323473 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.323445 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ca94893d-31a8-4ecf-9f0d-fc52580b40f4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fhd58\" (UID: \"ca94893d-31a8-4ecf-9f0d-fc52580b40f4\") " pod="openshift-insights/insights-runtime-extractor-fhd58" Apr 16 19:19:13.325292 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.325273 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ca94893d-31a8-4ecf-9f0d-fc52580b40f4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fhd58\" (UID: \"ca94893d-31a8-4ecf-9f0d-fc52580b40f4\") " pod="openshift-insights/insights-runtime-extractor-fhd58" Apr 16 19:19:13.334902 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.334843 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf44n\" (UniqueName: \"kubernetes.io/projected/ca94893d-31a8-4ecf-9f0d-fc52580b40f4-kube-api-access-kf44n\") pod \"insights-runtime-extractor-fhd58\" (UID: \"ca94893d-31a8-4ecf-9f0d-fc52580b40f4\") " pod="openshift-insights/insights-runtime-extractor-fhd58" Apr 16 19:19:13.453958 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.453929 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fhd58" Apr 16 19:19:13.576752 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:13.576732 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fhd58"] Apr 16 19:19:13.591103 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:19:13.591030 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca94893d_31a8_4ecf_9f0d_fc52580b40f4.slice/crio-d0af9f48946c1dafbd94067d979ad430562d208bbd7f69b5e1b5864eae6d60c9 WatchSource:0}: Error finding container d0af9f48946c1dafbd94067d979ad430562d208bbd7f69b5e1b5864eae6d60c9: Status 404 returned error can't find the container with id d0af9f48946c1dafbd94067d979ad430562d208bbd7f69b5e1b5864eae6d60c9 Apr 16 19:19:14.380942 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:14.380910 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8xgc6" Apr 16 19:19:14.417945 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:14.417903 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fhd58" event={"ID":"ca94893d-31a8-4ecf-9f0d-fc52580b40f4","Type":"ContainerStarted","Data":"f4263331668cf897c680efe94d52cc1cfc467ca009908f952eb100f63f2c9d84"} Apr 16 19:19:14.417945 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:14.417946 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fhd58" event={"ID":"ca94893d-31a8-4ecf-9f0d-fc52580b40f4","Type":"ContainerStarted","Data":"d0af9f48946c1dafbd94067d979ad430562d208bbd7f69b5e1b5864eae6d60c9"} Apr 16 19:19:15.423472 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:15.423433 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fhd58" event={"ID":"ca94893d-31a8-4ecf-9f0d-fc52580b40f4","Type":"ContainerStarted","Data":"39f4cbae77077eb2a0919fb97b57925897fdd904cc08557e4849c574aa7e8028"} Apr 16 19:19:17.432438 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:17.432404 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fhd58" event={"ID":"ca94893d-31a8-4ecf-9f0d-fc52580b40f4","Type":"ContainerStarted","Data":"9ac9a2da8066ed29fa29573cd60256271f5d71e227c8194d29ef96a7ea55af11"} Apr 16 19:19:17.452252 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:17.452206 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fhd58" podStartSLOduration=1.455936256 podStartE2EDuration="4.452192367s" podCreationTimestamp="2026-04-16 19:19:13 +0000 UTC" firstStartedPulling="2026-04-16 19:19:13.707667447 +0000 UTC m=+81.202258042" lastFinishedPulling="2026-04-16 19:19:16.703923556 +0000 UTC m=+84.198514153" observedRunningTime="2026-04-16 19:19:17.451342735 +0000 UTC m=+84.945933352" watchObservedRunningTime="2026-04-16 19:19:17.452192367 +0000 UTC m=+84.946782984" Apr 16 19:19:19.110789 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:19.110755 2582 scope.go:117] "RemoveContainer" containerID="9e5adaaff50339a33ad8742e2e96fe9cab2b7b6e972eac2639c3f92347aa1ac4" Apr 16 19:19:19.111285 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:19:19.110970 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator 
pod=console-operator-9d4b6777b-kqmw8_openshift-console-operator(82d1300a-6831-4de2-a99c-90a2b28f9a33)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" podUID="82d1300a-6831-4de2-a99c-90a2b28f9a33" Apr 16 19:19:21.560439 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:21.560402 2582 patch_prober.go:28] interesting pod/image-registry-86db68fc76-grwtr container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 19:19:21.560825 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:21.560472 2582 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-86db68fc76-grwtr" podUID="fff8270c-4771-4655-8abd-7341281f3173" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 19:19:23.373736 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:23.373708 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:19:24.261517 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.261486 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj"] Apr 16 19:19:24.264862 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.264846 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj" Apr 16 19:19:24.267048 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.267027 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 19:19:24.267253 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.267234 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-2jhsp\"" Apr 16 19:19:24.267968 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.267951 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 19:19:24.268015 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.267979 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 19:19:24.276808 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.276786 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj"] Apr 16 19:19:24.288861 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.288841 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-xf965"] Apr 16 19:19:24.292180 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.292143 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.294184 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.294163 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-9z59t\"" Apr 16 19:19:24.294302 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.294212 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 19:19:24.294472 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.294450 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 19:19:24.294581 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.294458 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 19:19:24.312991 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.312970 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5e5dbd1b-6936-4ebc-83c5-9d234738556b-root\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.313117 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.312995 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5e5dbd1b-6936-4ebc-83c5-9d234738556b-node-exporter-tls\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.313117 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.313013 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx87t\" (UniqueName: \"kubernetes.io/projected/5e5dbd1b-6936-4ebc-83c5-9d234738556b-kube-api-access-gx87t\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.313117 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.313031 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd3c9bfd-8e4a-498b-9c73-93f8b57377f5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-86pzj\" (UID: \"fd3c9bfd-8e4a-498b-9c73-93f8b57377f5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj" Apr 16 19:19:24.313298 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.313172 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e5dbd1b-6936-4ebc-83c5-9d234738556b-metrics-client-ca\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.313298 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.313225 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5e5dbd1b-6936-4ebc-83c5-9d234738556b-node-exporter-accelerators-collector-config\") pod \"node-exporter-xf965\" (UID: 
\"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.313298 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.313258 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fd3c9bfd-8e4a-498b-9c73-93f8b57377f5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-86pzj\" (UID: \"fd3c9bfd-8e4a-498b-9c73-93f8b57377f5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj" Apr 16 19:19:24.313449 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.313310 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e5dbd1b-6936-4ebc-83c5-9d234738556b-sys\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.313449 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.313345 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxnq8\" (UniqueName: \"kubernetes.io/projected/fd3c9bfd-8e4a-498b-9c73-93f8b57377f5-kube-api-access-nxnq8\") pod \"openshift-state-metrics-9d44df66c-86pzj\" (UID: \"fd3c9bfd-8e4a-498b-9c73-93f8b57377f5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj" Apr 16 19:19:24.313449 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.313403 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5e5dbd1b-6936-4ebc-83c5-9d234738556b-node-exporter-wtmp\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.313449 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.313438 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5e5dbd1b-6936-4ebc-83c5-9d234738556b-node-exporter-textfile\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.313623 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.313465 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5e5dbd1b-6936-4ebc-83c5-9d234738556b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.313623 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.313492 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd3c9bfd-8e4a-498b-9c73-93f8b57377f5-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-86pzj\" (UID: \"fd3c9bfd-8e4a-498b-9c73-93f8b57377f5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj" Apr 16 19:19:24.414341 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.414308 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5e5dbd1b-6936-4ebc-83c5-9d234738556b-metrics-client-ca\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.414341 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.414349 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5e5dbd1b-6936-4ebc-83c5-9d234738556b-node-exporter-accelerators-collector-config\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.414867 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.414367 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fd3c9bfd-8e4a-498b-9c73-93f8b57377f5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-86pzj\" (UID: \"fd3c9bfd-8e4a-498b-9c73-93f8b57377f5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj" Apr 16 19:19:24.414867 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.414426 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e5dbd1b-6936-4ebc-83c5-9d234738556b-sys\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.414867 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.414494 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxnq8\" (UniqueName: \"kubernetes.io/projected/fd3c9bfd-8e4a-498b-9c73-93f8b57377f5-kube-api-access-nxnq8\") pod \"openshift-state-metrics-9d44df66c-86pzj\" (UID: \"fd3c9bfd-8e4a-498b-9c73-93f8b57377f5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj" Apr 16 19:19:24.414867 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.414542 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5e5dbd1b-6936-4ebc-83c5-9d234738556b-node-exporter-wtmp\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.414867 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.414577 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5e5dbd1b-6936-4ebc-83c5-9d234738556b-node-exporter-textfile\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.414867 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.414592 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e5dbd1b-6936-4ebc-83c5-9d234738556b-sys\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.414867 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.414641 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5e5dbd1b-6936-4ebc-83c5-9d234738556b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xf965\" (UID: 
\"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.414867 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.414673 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd3c9bfd-8e4a-498b-9c73-93f8b57377f5-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-86pzj\" (UID: \"fd3c9bfd-8e4a-498b-9c73-93f8b57377f5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj" Apr 16 19:19:24.414867 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.414709 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5e5dbd1b-6936-4ebc-83c5-9d234738556b-node-exporter-wtmp\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.414867 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.414726 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5e5dbd1b-6936-4ebc-83c5-9d234738556b-root\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.414867 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.414774 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5e5dbd1b-6936-4ebc-83c5-9d234738556b-root\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.414867 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.414848 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5e5dbd1b-6936-4ebc-83c5-9d234738556b-node-exporter-textfile\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.415454 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.414909 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5e5dbd1b-6936-4ebc-83c5-9d234738556b-node-exporter-tls\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.415454 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.414944 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gx87t\" (UniqueName: \"kubernetes.io/projected/5e5dbd1b-6936-4ebc-83c5-9d234738556b-kube-api-access-gx87t\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.415454 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.414978 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd3c9bfd-8e4a-498b-9c73-93f8b57377f5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-86pzj\" (UID: \"fd3c9bfd-8e4a-498b-9c73-93f8b57377f5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj" Apr 16 19:19:24.415454 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.415123 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5e5dbd1b-6936-4ebc-83c5-9d234738556b-node-exporter-accelerators-collector-config\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.415454 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.415420 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fd3c9bfd-8e4a-498b-9c73-93f8b57377f5-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-86pzj\" (UID: \"fd3c9bfd-8e4a-498b-9c73-93f8b57377f5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj" Apr 16 19:19:24.415711 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.415473 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5e5dbd1b-6936-4ebc-83c5-9d234738556b-metrics-client-ca\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.417060 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.417029 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5e5dbd1b-6936-4ebc-83c5-9d234738556b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.417207 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.417169 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5e5dbd1b-6936-4ebc-83c5-9d234738556b-node-exporter-tls\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.417379 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.417356 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fd3c9bfd-8e4a-498b-9c73-93f8b57377f5-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-86pzj\" (UID: \"fd3c9bfd-8e4a-498b-9c73-93f8b57377f5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj" Apr 16 19:19:24.417624 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.417605 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fd3c9bfd-8e4a-498b-9c73-93f8b57377f5-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-86pzj\" (UID: \"fd3c9bfd-8e4a-498b-9c73-93f8b57377f5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj" Apr 16 19:19:24.423206 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.423183 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx87t\" (UniqueName: \"kubernetes.io/projected/5e5dbd1b-6936-4ebc-83c5-9d234738556b-kube-api-access-gx87t\") pod \"node-exporter-xf965\" (UID: \"5e5dbd1b-6936-4ebc-83c5-9d234738556b\") " pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.424234 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.424213 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxnq8\" (UniqueName: 
\"kubernetes.io/projected/fd3c9bfd-8e4a-498b-9c73-93f8b57377f5-kube-api-access-nxnq8\") pod \"openshift-state-metrics-9d44df66c-86pzj\" (UID: \"fd3c9bfd-8e4a-498b-9c73-93f8b57377f5\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj" Apr 16 19:19:24.573423 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.573341 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj" Apr 16 19:19:24.601297 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.601269 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-xf965" Apr 16 19:19:24.614595 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:19:24.614549 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e5dbd1b_6936_4ebc_83c5_9d234738556b.slice/crio-e59f3dfae663984d67d1c685239bee4e1394595359ab59ab96ec073546f8850a WatchSource:0}: Error finding container e59f3dfae663984d67d1c685239bee4e1394595359ab59ab96ec073546f8850a: Status 404 returned error can't find the container with id e59f3dfae663984d67d1c685239bee4e1394595359ab59ab96ec073546f8850a Apr 16 19:19:24.701296 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:24.701266 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj"] Apr 16 19:19:24.704408 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:19:24.704353 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd3c9bfd_8e4a_498b_9c73_93f8b57377f5.slice/crio-5c7c0b1f28dc3f061fbed672a266c49e38c4cfbba27b912cc24ea954a24602ea WatchSource:0}: Error finding container 5c7c0b1f28dc3f061fbed672a266c49e38c4cfbba27b912cc24ea954a24602ea: Status 404 returned error can't find the container with id 5c7c0b1f28dc3f061fbed672a266c49e38c4cfbba27b912cc24ea954a24602ea Apr 16 19:19:25.377893 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.377861 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:19:25.381871 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.381848 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.399437 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.398458 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 19:19:25.399437 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.399024 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 19:19:25.399437 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.399254 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 19:19:25.399666 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.399625 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 19:19:25.399731 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.399665 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 19:19:25.400103 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.399879 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-zv6fv\"" Apr 16 19:19:25.400103 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.400042 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 19:19:25.401073 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.400906 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 19:19:25.401073 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.401046 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 19:19:25.401314 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.401294 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 19:19:25.423129 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.423062 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:19:25.424113 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.424071 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppgtq\" (UniqueName: \"kubernetes.io/projected/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-kube-api-access-ppgtq\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.424256 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.424137 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.424256 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.424204 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.424256 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.424231 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-web-config\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.424407 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.424283 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.424407 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.424364 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.424501 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.424412 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-config-out\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.424501 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.424467 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.424599 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.424515 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.424599 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.424544 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.424599 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.424577 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.424740 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.424620 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.424740 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.424719 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-config-volume\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.457437 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.457368 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xf965" event={"ID":"5e5dbd1b-6936-4ebc-83c5-9d234738556b","Type":"ContainerStarted","Data":"e59f3dfae663984d67d1c685239bee4e1394595359ab59ab96ec073546f8850a"} Apr 16 19:19:25.459471 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.459444 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj" event={"ID":"fd3c9bfd-8e4a-498b-9c73-93f8b57377f5","Type":"ContainerStarted","Data":"717248f94362cdab32ba86cab3b6e041adbfcce45c662af07959df0fcbc1cd09"} Apr 16 19:19:25.459588 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.459478 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj" event={"ID":"fd3c9bfd-8e4a-498b-9c73-93f8b57377f5","Type":"ContainerStarted","Data":"2814c3a9557a20422a65f5bb01e69b354d446225d084a58edcf143c5cbcbfd15"} Apr 16 19:19:25.459588 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.459497 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj" event={"ID":"fd3c9bfd-8e4a-498b-9c73-93f8b57377f5","Type":"ContainerStarted","Data":"5c7c0b1f28dc3f061fbed672a266c49e38c4cfbba27b912cc24ea954a24602ea"} Apr 16 19:19:25.526181 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.525467 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-config-volume\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.526181 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.525515 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ppgtq\" (UniqueName: \"kubernetes.io/projected/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-kube-api-access-ppgtq\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.526181 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.525549 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.526181 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.525579 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.526181 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.525601 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-web-config\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.526181 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.525633 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.526181 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.525689 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.526181 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.525716 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-config-out\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.526181 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.525766 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.526181 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.525792 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.526181 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.525824 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.526181 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.525857 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.526181 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.525885 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.526181 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:19:25.526073 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-alertmanager-trusted-ca-bundle podName:1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb nodeName:}" failed. No retries permitted until 2026-04-16 19:19:26.026054717 +0000 UTC m=+93.520645312 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb") : configmap references non-existent config key: ca-bundle.crt Apr 16 19:19:25.527703 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:19:25.527007 2582 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 16 19:19:25.527703 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:19:25.527073 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-main-tls podName:1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb nodeName:}" failed. No retries permitted until 2026-04-16 19:19:26.027060113 +0000 UTC m=+93.521650713 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb") : secret "alertmanager-main-tls" not found Apr 16 19:19:25.527703 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.527194 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.527703 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.527661 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.530530 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.530502 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-config-volume\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.531665 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.531638 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.531788 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.531761 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.532236 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.532211 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.533132 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.532891 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-web-config\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:19:25.533132 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.533074 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-config-out\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 
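The two SetUp errors above are the transient startup variety rather than hard failures: the trusted-CA configmap exists but its injected ca-bundle.crt key has evidently not landed yet, and the serving-cert secret alertmanager-main-tls has not been created yet. The kubelet therefore mounts every volume that is ready (the run of SetUp succeeded records above) and schedules a retry after the logged 500ms backoff; both retries succeed in the records that follow. A minimal client-go sketch that re-runs the same two preconditions from outside the node, assuming an admin kubeconfig in the default location; the namespace, object names, and key are copied from the log, and everything else is hypothetical diagnostic code, not kubelet source:

package main

import (
	"context"
	"fmt"
	"os"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

// Re-runs the two checks that failed in the log. Names copied from the
// messages above; this mirrors the preconditions, not the kubelet's code.
func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile) // assumes ~/.kube/config
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	ns := "openshift-monitoring"
	ctx := context.Background()

	// Check 1: the configmap volume needs the key "ca-bundle.crt" to exist.
	cm, err := cs.CoreV1().ConfigMaps(ns).Get(ctx, "alertmanager-trusted-ca-bundle", metav1.GetOptions{})
	switch {
	case err != nil:
		fmt.Println("configmap not found:", err)
	default:
		if _, ok := cm.Data["ca-bundle.crt"]; !ok {
			fmt.Println("configmap present but key ca-bundle.crt not injected yet")
		} else {
			fmt.Println("configmap ok")
		}
	}

	// Check 2: the secret volume needs the secret itself to exist.
	if _, err := cs.CoreV1().Secrets(ns).Get(ctx, "alertmanager-main-tls", metav1.GetOptions{}); err != nil {
		fmt.Println("secret not found:", err)
	} else {
		fmt.Println("secret ok")
	}
}

If either check keeps failing after the operators settle, the mount stays in backoff and the pod sits in ContainerCreating; that, rather than a single logged retry like this one, is the point at which these errors are worth chasing.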
Apr 16 19:19:25.533877 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.533851 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:19:25.534346 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.534290 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:19:25.536786 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:25.536743 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppgtq\" (UniqueName: \"kubernetes.io/projected/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-kube-api-access-ppgtq\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:19:26.031282 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:26.031254 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:19:26.031405 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:26.031317 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:19:26.032318 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:26.032275 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:19:26.033640 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:26.033618 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:19:26.294636 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:26.294596 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:19:26.422232 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:26.422141 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 19:19:26.424724 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:19:26.424695 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1065d5ff_722f_4ff6_ab0d_b67bd5e87dbb.slice/crio-7af680162d74e838bbe83259f86971035bb41b72444eb940e23c4de26e35059b WatchSource:0}: Error finding container 7af680162d74e838bbe83259f86971035bb41b72444eb940e23c4de26e35059b: Status 404 returned error can't find the container with id 7af680162d74e838bbe83259f86971035bb41b72444eb940e23c4de26e35059b
Apr 16 19:19:26.464015 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:26.463986 2582 generic.go:358] "Generic (PLEG): container finished" podID="5e5dbd1b-6936-4ebc-83c5-9d234738556b" containerID="c293aaa421db456be146fb6c0891fbaebea2a5fb98f829c123ebc3c45087b483" exitCode=0
Apr 16 19:19:26.464198 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:26.464051 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xf965" event={"ID":"5e5dbd1b-6936-4ebc-83c5-9d234738556b","Type":"ContainerDied","Data":"c293aaa421db456be146fb6c0891fbaebea2a5fb98f829c123ebc3c45087b483"}
Apr 16 19:19:26.465193 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:26.465168 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb","Type":"ContainerStarted","Data":"7af680162d74e838bbe83259f86971035bb41b72444eb940e23c4de26e35059b"}
Apr 16 19:19:26.466933 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:26.466908 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj" event={"ID":"fd3c9bfd-8e4a-498b-9c73-93f8b57377f5","Type":"ContainerStarted","Data":"d4111d731dfd97d262304351a591b1775a4775ef3566062cb85195f56ab74ac1"}
Apr 16 19:19:26.510103 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:26.510042 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-86pzj" podStartSLOduration=1.372126729 podStartE2EDuration="2.510023814s" podCreationTimestamp="2026-04-16 19:19:24 +0000 UTC" firstStartedPulling="2026-04-16 19:19:24.83545486 +0000 UTC m=+92.330045455" lastFinishedPulling="2026-04-16 19:19:25.973351923 +0000 UTC m=+93.467942540" observedRunningTime="2026-04-16 19:19:26.508532417 +0000 UTC m=+94.003123054" watchObservedRunningTime="2026-04-16 19:19:26.510023814 +0000 UTC m=+94.004614433"
Apr 16 19:19:27.472688 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:27.472661 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xf965" event={"ID":"5e5dbd1b-6936-4ebc-83c5-9d234738556b","Type":"ContainerStarted","Data":"7bf2e9683be7873a61d0f3bd208f62cc2e1f66b3e22cec9befdf8337638b6595"}
Apr 16 19:19:27.472961 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:27.472694 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xf965" event={"ID":"5e5dbd1b-6936-4ebc-83c5-9d234738556b","Type":"ContainerStarted","Data":"c5bb847d3ceb32f82b810f0fd0bde6a3aa1af5d19c2ac47c7c33786d488c1207"}
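The two duration fields in the "Observed pod startup duration" record above reconcile exactly: podStartSLOduration is podStartE2EDuration minus the image-pull window, with the pull window taken from the monotonic m=+ offsets printed next to firstStartedPulling and lastFinishedPulling. A short sketch reproducing the openshift-state-metrics numbers (all values copied from the record above; the node-exporter record that follows checks out the same way):

package main

import "fmt"

// Recomputes the kubelet's startup-duration fields for
// openshift-state-metrics-9d44df66c-86pzj from the record above.
// The SLO duration excludes the time spent pulling images.
func main() {
	const (
		podStartE2E         = 2.510023814  // podStartE2EDuration, seconds
		firstStartedPulling = 92.330045455 // monotonic m=+ offset, seconds
		lastFinishedPulling = 93.467942540 // monotonic m=+ offset, seconds
	)
	pullWindow := lastFinishedPulling - firstStartedPulling
	fmt.Printf("image pull window:   %.9fs\n", pullWindow)             // 1.137897085s
	fmt.Printf("podStartSLOduration: %.9fs\n", podStartE2E-pullWindow) // 1.372126729s, as logged
}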
Apr 16 19:19:27.495688 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:27.495646 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-xf965" podStartSLOduration=2.483143151 podStartE2EDuration="3.495631319s" podCreationTimestamp="2026-04-16 19:19:24 +0000 UTC" firstStartedPulling="2026-04-16 19:19:24.616561608 +0000 UTC m=+92.111152210" lastFinishedPulling="2026-04-16 19:19:25.629049782 +0000 UTC m=+93.123640378" observedRunningTime="2026-04-16 19:19:27.494492879 +0000 UTC m=+94.989083507" watchObservedRunningTime="2026-04-16 19:19:27.495631319 +0000 UTC m=+94.990221936"
Apr 16 19:19:28.477135 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.477100 2582 generic.go:358] "Generic (PLEG): container finished" podID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerID="67cc6c387bc5dd8c87fe2d99dc552fbb41c9128ab2bda3caf096233b1adb3d02" exitCode=0
Apr 16 19:19:28.477534 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.477193 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb","Type":"ContainerDied","Data":"67cc6c387bc5dd8c87fe2d99dc552fbb41c9128ab2bda3caf096233b1adb3d02"}
Apr 16 19:19:28.713315 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.713280 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-797486cb67-lw5s7"]
Apr 16 19:19:28.716570 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.716552 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-797486cb67-lw5s7"
Apr 16 19:19:28.721163 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.721132 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 16 19:19:28.721281 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.721234 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 16 19:19:28.721442 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.721424 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-amalemprq2ukr\""
Apr 16 19:19:28.721523 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.721466 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 19:19:28.721523 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.721493 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 16 19:19:28.721634 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.721492 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-s6dn9\""
Apr 16 19:19:28.735199 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.735122 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-797486cb67-lw5s7"]
Apr 16 19:19:28.756341 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.756305 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12ea81e7-90ab-476f-805c-836751743647-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " 
pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.756341 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.756344 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/12ea81e7-90ab-476f-805c-836751743647-secret-metrics-server-client-certs\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.756548 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.756364 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/12ea81e7-90ab-476f-805c-836751743647-audit-log\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.756548 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.756428 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/12ea81e7-90ab-476f-805c-836751743647-metrics-server-audit-profiles\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.756548 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.756461 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/12ea81e7-90ab-476f-805c-836751743647-secret-metrics-server-tls\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.756548 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.756529 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ea81e7-90ab-476f-805c-836751743647-client-ca-bundle\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.756683 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.756561 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqtg8\" (UniqueName: \"kubernetes.io/projected/12ea81e7-90ab-476f-805c-836751743647-kube-api-access-mqtg8\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.857651 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.857611 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12ea81e7-90ab-476f-805c-836751743647-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.857651 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.857652 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/12ea81e7-90ab-476f-805c-836751743647-secret-metrics-server-client-certs\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.857894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.857671 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/12ea81e7-90ab-476f-805c-836751743647-audit-log\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.857894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.857692 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/12ea81e7-90ab-476f-805c-836751743647-metrics-server-audit-profiles\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.857894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.857716 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/12ea81e7-90ab-476f-805c-836751743647-secret-metrics-server-tls\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.857894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.857792 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ea81e7-90ab-476f-805c-836751743647-client-ca-bundle\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.857894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.857831 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqtg8\" (UniqueName: \"kubernetes.io/projected/12ea81e7-90ab-476f-805c-836751743647-kube-api-access-mqtg8\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.858214 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.858193 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/12ea81e7-90ab-476f-805c-836751743647-audit-log\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.858454 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.858430 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12ea81e7-90ab-476f-805c-836751743647-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.858776 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.858753 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: 
\"kubernetes.io/configmap/12ea81e7-90ab-476f-805c-836751743647-metrics-server-audit-profiles\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.860260 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.860243 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/12ea81e7-90ab-476f-805c-836751743647-secret-metrics-server-tls\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.860714 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.860690 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/12ea81e7-90ab-476f-805c-836751743647-secret-metrics-server-client-certs\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.860876 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.860856 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12ea81e7-90ab-476f-805c-836751743647-client-ca-bundle\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:28.866384 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:28.866362 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqtg8\" (UniqueName: \"kubernetes.io/projected/12ea81e7-90ab-476f-805c-836751743647-kube-api-access-mqtg8\") pod \"metrics-server-797486cb67-lw5s7\" (UID: \"12ea81e7-90ab-476f-805c-836751743647\") " pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:29.025922 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.025834 2582 util.go:30] "No sandbox for pod can be found. 
Apr 16 19:19:29.025922 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.025834 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-797486cb67-lw5s7"
Apr 16 19:19:29.150574 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.150517 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-797486cb67-lw5s7"]
Apr 16 19:19:29.153580 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:19:29.153548 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12ea81e7_90ab_476f_805c_836751743647.slice/crio-cbf8eaf329dd85aef94b281276a6fb5bcf1341d066a3e2ed4170337d995ad004 WatchSource:0}: Error finding container cbf8eaf329dd85aef94b281276a6fb5bcf1341d066a3e2ed4170337d995ad004: Status 404 returned error can't find the container with id cbf8eaf329dd85aef94b281276a6fb5bcf1341d066a3e2ed4170337d995ad004
Apr 16 19:19:29.482993 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.482955 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" event={"ID":"12ea81e7-90ab-476f-805c-836751743647","Type":"ContainerStarted","Data":"cbf8eaf329dd85aef94b281276a6fb5bcf1341d066a3e2ed4170337d995ad004"}
Apr 16 19:19:29.546814 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.546774 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5"]
Apr 16 19:19:29.550362 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.550341 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5"
Apr 16 19:19:29.553257 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.553231 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 19:19:29.553367 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.553271 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-947xp\""
Apr 16 19:19:29.553470 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.553457 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 19:19:29.553659 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.553646 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 19:19:29.553742 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.553727 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 19:19:29.554279 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.554264 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 19:19:29.558963 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.558945 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 19:19:29.567828 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.567809 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5"]
Apr 16 19:19:29.664489 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.664451 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ba226b86-8449-4c16-881e-753110579cfe-federate-client-tls\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.664658 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.664505 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ba226b86-8449-4c16-881e-753110579cfe-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.664658 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.664569 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ba226b86-8449-4c16-881e-753110579cfe-secret-telemeter-client\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.664658 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.664641 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ba226b86-8449-4c16-881e-753110579cfe-telemeter-client-tls\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.664813 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.664692 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba226b86-8449-4c16-881e-753110579cfe-metrics-client-ca\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.664813 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.664728 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba226b86-8449-4c16-881e-753110579cfe-telemeter-trusted-ca-bundle\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.664813 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.664787 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq5xf\" (UniqueName: \"kubernetes.io/projected/ba226b86-8449-4c16-881e-753110579cfe-kube-api-access-fq5xf\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.664955 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.664841 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba226b86-8449-4c16-881e-753110579cfe-serving-certs-ca-bundle\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 
19:19:29.765509 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.765420 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ba226b86-8449-4c16-881e-753110579cfe-telemeter-client-tls\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.765509 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.765466 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba226b86-8449-4c16-881e-753110579cfe-metrics-client-ca\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.765509 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.765494 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba226b86-8449-4c16-881e-753110579cfe-telemeter-trusted-ca-bundle\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.765773 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.765523 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fq5xf\" (UniqueName: \"kubernetes.io/projected/ba226b86-8449-4c16-881e-753110579cfe-kube-api-access-fq5xf\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.765773 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.765561 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba226b86-8449-4c16-881e-753110579cfe-serving-certs-ca-bundle\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.765773 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.765616 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ba226b86-8449-4c16-881e-753110579cfe-federate-client-tls\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.765773 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.765648 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ba226b86-8449-4c16-881e-753110579cfe-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.765773 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.765701 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ba226b86-8449-4c16-881e-753110579cfe-secret-telemeter-client\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " 
pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.766379 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.766303 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ba226b86-8449-4c16-881e-753110579cfe-metrics-client-ca\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.766529 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.766391 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba226b86-8449-4c16-881e-753110579cfe-telemeter-trusted-ca-bundle\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.766850 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.766827 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba226b86-8449-4c16-881e-753110579cfe-serving-certs-ca-bundle\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.768858 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.768833 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ba226b86-8449-4c16-881e-753110579cfe-telemeter-client-tls\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.768932 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.768873 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ba226b86-8449-4c16-881e-753110579cfe-federate-client-tls\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.768932 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.768912 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ba226b86-8449-4c16-881e-753110579cfe-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.768932 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.768926 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ba226b86-8449-4c16-881e-753110579cfe-secret-telemeter-client\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.775848 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.775827 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq5xf\" (UniqueName: \"kubernetes.io/projected/ba226b86-8449-4c16-881e-753110579cfe-kube-api-access-fq5xf\") pod \"telemeter-client-9d45d7bd4-hgrc5\" (UID: \"ba226b86-8449-4c16-881e-753110579cfe\") " 
pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:29.860240 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:29.860195 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" Apr 16 19:19:30.357328 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.357278 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-kk8zc" Apr 16 19:19:30.636322 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.636293 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:19:30.640396 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.640371 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.644852 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.644702 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 19:19:30.644852 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.644752 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 19:19:30.644852 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.644775 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 19:19:30.645795 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.645714 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 19:19:30.646533 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.646514 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 19:19:30.646665 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.646651 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-sghz7\"" Apr 16 19:19:30.647313 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.647299 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 19:19:30.651908 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.651893 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 19:19:30.652458 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.652408 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 19:19:30.652759 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.652501 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8dih7s3s9oa5u\"" Apr 16 19:19:30.652759 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.652515 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 19:19:30.653209 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.653192 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 19:19:30.653746 ip-10-0-128-123 
kubenswrapper[2582]: I0416 19:19:30.653711 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 19:19:30.655767 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.655688 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 19:19:30.671554 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.671528 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:19:30.678884 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.678843 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/33cd2620-ef7c-434f-8baf-6b44b39be4c8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.679018 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.678889 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.679018 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.678921 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.679018 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.678960 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/33cd2620-ef7c-434f-8baf-6b44b39be4c8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.679018 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.678985 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.679464 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.679031 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/33cd2620-ef7c-434f-8baf-6b44b39be4c8-config-out\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.679464 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.679055 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.679464 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.679095 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.679464 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.679124 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.679464 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.679165 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-web-config\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.679464 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.679189 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brgfp\" (UniqueName: \"kubernetes.io/projected/33cd2620-ef7c-434f-8baf-6b44b39be4c8-kube-api-access-brgfp\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.679464 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.679227 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-config\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.679464 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.679266 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.679464 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.679440 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.679890 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.679519 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.679890 
ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.679558 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.679890 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.679647 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.679890 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.679695 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.766549 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.766527 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5"] Apr 16 19:19:30.772354 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:19:30.772325 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba226b86_8449_4c16_881e_753110579cfe.slice/crio-dc7bb68810d74241d2ac335b8295d2a7dd74cd601adaf99f4ea317a7ae679608 WatchSource:0}: Error finding container dc7bb68810d74241d2ac335b8295d2a7dd74cd601adaf99f4ea317a7ae679608: Status 404 returned error can't find the container with id dc7bb68810d74241d2ac335b8295d2a7dd74cd601adaf99f4ea317a7ae679608 Apr 16 19:19:30.781894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.781164 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.781894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.781205 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-web-config\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.781894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.781222 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brgfp\" (UniqueName: \"kubernetes.io/projected/33cd2620-ef7c-434f-8baf-6b44b39be4c8-kube-api-access-brgfp\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.781894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.781256 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-config\") pod 
\"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.781894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.781280 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.781894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.781306 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.781894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.781385 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.781894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.781424 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.781894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.781464 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.781894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.781491 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.781894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.781541 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/33cd2620-ef7c-434f-8baf-6b44b39be4c8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.781894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.781576 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.781894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.781604 2582 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.781894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.781655 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/33cd2620-ef7c-434f-8baf-6b44b39be4c8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.781894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.781680 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.781894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.781711 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/33cd2620-ef7c-434f-8baf-6b44b39be4c8-config-out\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.781894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.781739 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.785523 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.781785 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.785523 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.781940 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.785523 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.782388 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/33cd2620-ef7c-434f-8baf-6b44b39be4c8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.785523 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.782766 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.788112 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.787344 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.788112 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.787882 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.789460 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.788496 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.789460 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.788685 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/33cd2620-ef7c-434f-8baf-6b44b39be4c8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.789460 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.788718 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.789460 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.788744 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.789460 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.789201 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.789460 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.789362 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.789784 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.789459 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/33cd2620-ef7c-434f-8baf-6b44b39be4c8-config-out\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.789784 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.789581 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.790323 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.790284 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-config\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.790520 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.790497 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-web-config\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.790584 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.790542 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.791870 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.791845 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:30.792166 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:30.792131 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brgfp\" (UniqueName: \"kubernetes.io/projected/33cd2620-ef7c-434f-8baf-6b44b39be4c8-kube-api-access-brgfp\") pod \"prometheus-k8s-0\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:31.035205 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:31.035141 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:31.171186 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:31.171096 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:19:31.174344 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:19:31.174316 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33cd2620_ef7c_434f_8baf_6b44b39be4c8.slice/crio-adf1982795a8599ea8a065681f077692f8beaaa39e86ef293f9c2b0b3ed05511 WatchSource:0}: Error finding container adf1982795a8599ea8a065681f077692f8beaaa39e86ef293f9c2b0b3ed05511: Status 404 returned error can't find the container with id adf1982795a8599ea8a065681f077692f8beaaa39e86ef293f9c2b0b3ed05511 Apr 16 19:19:31.493456 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:31.493423 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" event={"ID":"ba226b86-8449-4c16-881e-753110579cfe","Type":"ContainerStarted","Data":"dc7bb68810d74241d2ac335b8295d2a7dd74cd601adaf99f4ea317a7ae679608"} Apr 16 19:19:31.495007 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:31.494965 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" event={"ID":"12ea81e7-90ab-476f-805c-836751743647","Type":"ContainerStarted","Data":"e6585c7ce320acff42fa9dd9e69d68c5834d31f9f16a26ec66d524da2a0fd0ac"} Apr 16 19:19:31.498013 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:31.497989 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb","Type":"ContainerStarted","Data":"3030b7ecb96ef732254a644f00e87cc38c37c2d9b13214086e5273761129d310"} Apr 16 19:19:31.498013 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:31.498016 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb","Type":"ContainerStarted","Data":"9bb6d471c371f01d389c774d6dafa001d17930ab5de6ed7f38f3e84f2d59c5e4"} Apr 16 19:19:31.498213 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:31.498026 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb","Type":"ContainerStarted","Data":"cbfdb923f2b0bf11fdaf0a390f5997029af58518f15426955a4e0271158ea65a"} Apr 16 19:19:31.498213 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:31.498036 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb","Type":"ContainerStarted","Data":"1b6630a011135a067c0eccef4b8570f2c24164d0ad82bd974c4f711dd32904ff"} Apr 16 19:19:31.498213 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:31.498043 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb","Type":"ContainerStarted","Data":"8eaa8ec8874cfcadb62a2e66a562b255ba745cc0ac4aab9a178d61f3b36d4ea8"} Apr 16 19:19:31.499579 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:31.499551 2582 generic.go:358] "Generic (PLEG): container finished" podID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerID="4c707484b60b3313034a617f8746cf8aa92664a3c420d31ae6085a8fe286fa24" exitCode=0 Apr 16 19:19:31.499694 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:31.499592 2582 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"33cd2620-ef7c-434f-8baf-6b44b39be4c8","Type":"ContainerDied","Data":"4c707484b60b3313034a617f8746cf8aa92664a3c420d31ae6085a8fe286fa24"} Apr 16 19:19:31.499694 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:31.499613 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"33cd2620-ef7c-434f-8baf-6b44b39be4c8","Type":"ContainerStarted","Data":"adf1982795a8599ea8a065681f077692f8beaaa39e86ef293f9c2b0b3ed05511"} Apr 16 19:19:31.545374 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:31.545322 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" podStartSLOduration=2.061668789 podStartE2EDuration="3.545305793s" podCreationTimestamp="2026-04-16 19:19:28 +0000 UTC" firstStartedPulling="2026-04-16 19:19:29.156105816 +0000 UTC m=+96.650696421" lastFinishedPulling="2026-04-16 19:19:30.639742818 +0000 UTC m=+98.134333425" observedRunningTime="2026-04-16 19:19:31.543483172 +0000 UTC m=+99.038073790" watchObservedRunningTime="2026-04-16 19:19:31.545305793 +0000 UTC m=+99.039896414" Apr 16 19:19:32.110887 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:32.110818 2582 scope.go:117] "RemoveContainer" containerID="9e5adaaff50339a33ad8742e2e96fe9cab2b7b6e972eac2639c3f92347aa1ac4" Apr 16 19:19:32.505450 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:32.505417 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/2.log" Apr 16 19:19:32.505622 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:32.505545 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" event={"ID":"82d1300a-6831-4de2-a99c-90a2b28f9a33","Type":"ContainerStarted","Data":"deb6845c0b909a0d7e76b486ef05f710fc1daab5fcf60e0b14b881c843f8d4ac"} Apr 16 19:19:32.505967 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:32.505946 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" Apr 16 19:19:32.509532 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:32.509504 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb","Type":"ContainerStarted","Data":"8052c3f5a2706df3152a860095f783467f2bda8e65eed00a5653db9ac752cc1b"} Apr 16 19:19:32.555346 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:32.555301 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" podStartSLOduration=50.956645633 podStartE2EDuration="57.555287903s" podCreationTimestamp="2026-04-16 19:18:35 +0000 UTC" firstStartedPulling="2026-04-16 19:18:35.895603698 +0000 UTC m=+43.390194294" lastFinishedPulling="2026-04-16 19:18:42.494245952 +0000 UTC m=+49.988836564" observedRunningTime="2026-04-16 19:19:32.5246459 +0000 UTC m=+100.019236541" watchObservedRunningTime="2026-04-16 19:19:32.555287903 +0000 UTC m=+100.049878517" Apr 16 19:19:32.555938 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:32.555896 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.286916305 podStartE2EDuration="7.555885457s" 
podCreationTimestamp="2026-04-16 19:19:25 +0000 UTC" firstStartedPulling="2026-04-16 19:19:26.426964543 +0000 UTC m=+93.921555138" lastFinishedPulling="2026-04-16 19:19:31.695933689 +0000 UTC m=+99.190524290" observedRunningTime="2026-04-16 19:19:32.55444769 +0000 UTC m=+100.049038312" watchObservedRunningTime="2026-04-16 19:19:32.555885457 +0000 UTC m=+100.050476073" Apr 16 19:19:32.765039 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:32.764960 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-kqmw8" Apr 16 19:19:33.514314 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:33.514278 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" event={"ID":"ba226b86-8449-4c16-881e-753110579cfe","Type":"ContainerStarted","Data":"80e34027599b88da8e292a1f3c0b869505f7ea3fd9cfc0aefb5f30e04f1803d5"} Apr 16 19:19:33.514314 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:33.514318 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" event={"ID":"ba226b86-8449-4c16-881e-753110579cfe","Type":"ContainerStarted","Data":"8325058d6ee1b7fbedc011f4a01b46e7b00c6b4dca0b26e97cc4a65723d8f42c"} Apr 16 19:19:33.514314 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:33.514329 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" event={"ID":"ba226b86-8449-4c16-881e-753110579cfe","Type":"ContainerStarted","Data":"3b54caffe2a76125c3dbb8d2d02f7404715314276e757347b2d737635222a7df"} Apr 16 19:19:33.548368 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:33.548310 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-9d45d7bd4-hgrc5" podStartSLOduration=2.181375135 podStartE2EDuration="4.548293705s" podCreationTimestamp="2026-04-16 19:19:29 +0000 UTC" firstStartedPulling="2026-04-16 19:19:30.774380855 +0000 UTC m=+98.268971450" lastFinishedPulling="2026-04-16 19:19:33.141299423 +0000 UTC m=+100.635890020" observedRunningTime="2026-04-16 19:19:33.547518294 +0000 UTC m=+101.042108911" watchObservedRunningTime="2026-04-16 19:19:33.548293705 +0000 UTC m=+101.042884323" Apr 16 19:19:35.198791 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:35.198757 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-86db68fc76-grwtr"] Apr 16 19:19:36.525367 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:36.525334 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"33cd2620-ef7c-434f-8baf-6b44b39be4c8","Type":"ContainerStarted","Data":"3e03a40e36bfcc5516aa09df567e991d091695a43b0d5c52406837dad4ea26bb"} Apr 16 19:19:36.525367 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:36.525369 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"33cd2620-ef7c-434f-8baf-6b44b39be4c8","Type":"ContainerStarted","Data":"35d8978b99d1910976ae758e3415cf3450f18f7ef95491faaa70ff91c758d001"} Apr 16 19:19:37.531828 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:37.531803 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"33cd2620-ef7c-434f-8baf-6b44b39be4c8","Type":"ContainerStarted","Data":"602fef8860e23884627a98b7062925db5cd0ee1b6fd85ad24d2f5170413b4f2e"} Apr 16 19:19:37.532098 ip-10-0-128-123 kubenswrapper[2582]: 
I0416 19:19:37.531835 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"33cd2620-ef7c-434f-8baf-6b44b39be4c8","Type":"ContainerStarted","Data":"5cc58e70539715ac110857df30400c7e663618c4fd2a94c800f775719d38c759"} Apr 16 19:19:37.532098 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:37.531845 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"33cd2620-ef7c-434f-8baf-6b44b39be4c8","Type":"ContainerStarted","Data":"dfc587a1d60607c3801621b24c59dd279bd59b606a12322137f316a8bea65c09"} Apr 16 19:19:38.538211 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:38.538146 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"33cd2620-ef7c-434f-8baf-6b44b39be4c8","Type":"ContainerStarted","Data":"008a50a43f03b73bd90d9aab0d31cf3150d6006e468c1c522fc7ada266eebe45"} Apr 16 19:19:38.565909 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:38.565854 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.753256663 podStartE2EDuration="8.565837105s" podCreationTimestamp="2026-04-16 19:19:30 +0000 UTC" firstStartedPulling="2026-04-16 19:19:31.500942541 +0000 UTC m=+98.995533136" lastFinishedPulling="2026-04-16 19:19:37.31352298 +0000 UTC m=+104.808113578" observedRunningTime="2026-04-16 19:19:38.564518609 +0000 UTC m=+106.059109247" watchObservedRunningTime="2026-04-16 19:19:38.565837105 +0000 UTC m=+106.060427725" Apr 16 19:19:41.035991 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:41.035955 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:19:49.026025 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:49.025982 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:49.026513 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:49.026062 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:19:53.277949 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:19:53.277914 2582 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfefe6f16_f5b3_4e5d_bd2e_e4cd50f953ae.slice/crio-conmon-815fb8daf0f5bb2930fc2e20db17911bbe7b946e6e99534ba84439e85dafc1de.scope\": RecentStats: unable to find data in memory cache]" Apr 16 19:19:53.585583 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:53.585504 2582 generic.go:358] "Generic (PLEG): container finished" podID="fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae" containerID="815fb8daf0f5bb2930fc2e20db17911bbe7b946e6e99534ba84439e85dafc1de" exitCode=0 Apr 16 19:19:53.585722 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:53.585578 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-67nzv" event={"ID":"fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae","Type":"ContainerDied","Data":"815fb8daf0f5bb2930fc2e20db17911bbe7b946e6e99534ba84439e85dafc1de"} Apr 16 19:19:53.585907 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:19:53.585893 2582 scope.go:117] "RemoveContainer" containerID="815fb8daf0f5bb2930fc2e20db17911bbe7b946e6e99534ba84439e85dafc1de" Apr 16 19:19:54.589739 ip-10-0-128-123 kubenswrapper[2582]: I0416 
19:19:54.589694 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-67nzv" event={"ID":"fefe6f16-f5b3-4e5d-bd2e-e4cd50f953ae","Type":"ContainerStarted","Data":"5e249adc9827124f7811d79cf816ac4efaf232af2b6cb767fd84deadbd5bc288"} Apr 16 19:20:00.221094 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.221028 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-86db68fc76-grwtr" podUID="fff8270c-4771-4655-8abd-7341281f3173" containerName="registry" containerID="cri-o://ebbad7bff376f08d17d86584071f59b54a55d83c2ddf56ee030de0826e0b90f7" gracePeriod=30 Apr 16 19:20:00.462449 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.462426 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:20:00.561516 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.561485 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fff8270c-4771-4655-8abd-7341281f3173-ca-trust-extracted\") pod \"fff8270c-4771-4655-8abd-7341281f3173\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " Apr 16 19:20:00.561722 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.561530 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fff8270c-4771-4655-8abd-7341281f3173-image-registry-private-configuration\") pod \"fff8270c-4771-4655-8abd-7341281f3173\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " Apr 16 19:20:00.561722 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.561584 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls\") pod \"fff8270c-4771-4655-8abd-7341281f3173\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " Apr 16 19:20:00.561722 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.561632 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fff8270c-4771-4655-8abd-7341281f3173-registry-certificates\") pod \"fff8270c-4771-4655-8abd-7341281f3173\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " Apr 16 19:20:00.561722 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.561659 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gks9j\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-kube-api-access-gks9j\") pod \"fff8270c-4771-4655-8abd-7341281f3173\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " Apr 16 19:20:00.561722 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.561713 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fff8270c-4771-4655-8abd-7341281f3173-installation-pull-secrets\") pod \"fff8270c-4771-4655-8abd-7341281f3173\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " Apr 16 19:20:00.561964 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.561850 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-bound-sa-token\") pod 
\"fff8270c-4771-4655-8abd-7341281f3173\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " Apr 16 19:20:00.561964 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.561904 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fff8270c-4771-4655-8abd-7341281f3173-trusted-ca\") pod \"fff8270c-4771-4655-8abd-7341281f3173\" (UID: \"fff8270c-4771-4655-8abd-7341281f3173\") " Apr 16 19:20:00.562251 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.562178 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fff8270c-4771-4655-8abd-7341281f3173-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fff8270c-4771-4655-8abd-7341281f3173" (UID: "fff8270c-4771-4655-8abd-7341281f3173"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:20:00.562364 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.562269 2582 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fff8270c-4771-4655-8abd-7341281f3173-registry-certificates\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:00.562632 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.562591 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fff8270c-4771-4655-8abd-7341281f3173-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fff8270c-4771-4655-8abd-7341281f3173" (UID: "fff8270c-4771-4655-8abd-7341281f3173"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:20:00.564293 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.564265 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-kube-api-access-gks9j" (OuterVolumeSpecName: "kube-api-access-gks9j") pod "fff8270c-4771-4655-8abd-7341281f3173" (UID: "fff8270c-4771-4655-8abd-7341281f3173"). InnerVolumeSpecName "kube-api-access-gks9j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:20:00.564401 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.564295 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fff8270c-4771-4655-8abd-7341281f3173" (UID: "fff8270c-4771-4655-8abd-7341281f3173"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:20:00.564401 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.564299 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fff8270c-4771-4655-8abd-7341281f3173-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "fff8270c-4771-4655-8abd-7341281f3173" (UID: "fff8270c-4771-4655-8abd-7341281f3173"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:20:00.564511 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.564478 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fff8270c-4771-4655-8abd-7341281f3173" (UID: "fff8270c-4771-4655-8abd-7341281f3173"). 
InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:20:00.564694 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.564676 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fff8270c-4771-4655-8abd-7341281f3173-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fff8270c-4771-4655-8abd-7341281f3173" (UID: "fff8270c-4771-4655-8abd-7341281f3173"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:20:00.570243 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.570218 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fff8270c-4771-4655-8abd-7341281f3173-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fff8270c-4771-4655-8abd-7341281f3173" (UID: "fff8270c-4771-4655-8abd-7341281f3173"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:20:00.608098 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.608062 2582 generic.go:358] "Generic (PLEG): container finished" podID="fff8270c-4771-4655-8abd-7341281f3173" containerID="ebbad7bff376f08d17d86584071f59b54a55d83c2ddf56ee030de0826e0b90f7" exitCode=0 Apr 16 19:20:00.608249 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.608134 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-86db68fc76-grwtr" Apr 16 19:20:00.608249 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.608168 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86db68fc76-grwtr" event={"ID":"fff8270c-4771-4655-8abd-7341281f3173","Type":"ContainerDied","Data":"ebbad7bff376f08d17d86584071f59b54a55d83c2ddf56ee030de0826e0b90f7"} Apr 16 19:20:00.608249 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.608219 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86db68fc76-grwtr" event={"ID":"fff8270c-4771-4655-8abd-7341281f3173","Type":"ContainerDied","Data":"5296eeb4f51446133b1773e90190ef4bff6012fc844f0903394035f68e372dd5"} Apr 16 19:20:00.608249 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.608238 2582 scope.go:117] "RemoveContainer" containerID="ebbad7bff376f08d17d86584071f59b54a55d83c2ddf56ee030de0826e0b90f7" Apr 16 19:20:00.616373 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.616357 2582 scope.go:117] "RemoveContainer" containerID="ebbad7bff376f08d17d86584071f59b54a55d83c2ddf56ee030de0826e0b90f7" Apr 16 19:20:00.616644 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:00.616625 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebbad7bff376f08d17d86584071f59b54a55d83c2ddf56ee030de0826e0b90f7\": container with ID starting with ebbad7bff376f08d17d86584071f59b54a55d83c2ddf56ee030de0826e0b90f7 not found: ID does not exist" containerID="ebbad7bff376f08d17d86584071f59b54a55d83c2ddf56ee030de0826e0b90f7" Apr 16 19:20:00.616701 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.616652 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebbad7bff376f08d17d86584071f59b54a55d83c2ddf56ee030de0826e0b90f7"} err="failed to get container status \"ebbad7bff376f08d17d86584071f59b54a55d83c2ddf56ee030de0826e0b90f7\": rpc error: code = NotFound desc = could not find container 
\"ebbad7bff376f08d17d86584071f59b54a55d83c2ddf56ee030de0826e0b90f7\": container with ID starting with ebbad7bff376f08d17d86584071f59b54a55d83c2ddf56ee030de0826e0b90f7 not found: ID does not exist" Apr 16 19:20:00.632952 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.632927 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-86db68fc76-grwtr"] Apr 16 19:20:00.637100 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.637077 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-86db68fc76-grwtr"] Apr 16 19:20:00.663482 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.663457 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gks9j\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-kube-api-access-gks9j\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:00.663482 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.663481 2582 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fff8270c-4771-4655-8abd-7341281f3173-installation-pull-secrets\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:00.663633 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.663493 2582 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-bound-sa-token\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:00.663633 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.663503 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fff8270c-4771-4655-8abd-7341281f3173-trusted-ca\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:00.663633 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.663511 2582 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fff8270c-4771-4655-8abd-7341281f3173-ca-trust-extracted\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:00.663633 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.663519 2582 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fff8270c-4771-4655-8abd-7341281f3173-image-registry-private-configuration\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:00.663633 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:00.663529 2582 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fff8270c-4771-4655-8abd-7341281f3173-registry-tls\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:01.114147 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:01.114107 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fff8270c-4771-4655-8abd-7341281f3173" path="/var/lib/kubelet/pods/fff8270c-4771-4655-8abd-7341281f3173/volumes" Apr 16 19:20:01.219429 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:01.219388 2582 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfefe6f16_f5b3_4e5d_bd2e_e4cd50f953ae.slice/crio-conmon-815fb8daf0f5bb2930fc2e20db17911bbe7b946e6e99534ba84439e85dafc1de.scope\": RecentStats: unable to find data in memory 
cache]" Apr 16 19:20:02.884544 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:02.884496 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs\") pod \"network-metrics-daemon-lvp6d\" (UID: \"0fa55098-1c0e-4cf5-963c-602d47a411cc\") " pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:20:02.886976 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:02.886948 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fa55098-1c0e-4cf5-963c-602d47a411cc-metrics-certs\") pod \"network-metrics-daemon-lvp6d\" (UID: \"0fa55098-1c0e-4cf5-963c-602d47a411cc\") " pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:20:02.933600 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:02.933565 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-t5lg8\"" Apr 16 19:20:02.940231 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:02.940207 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvp6d" Apr 16 19:20:03.059399 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:03.059358 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lvp6d"] Apr 16 19:20:03.061990 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:20:03.061963 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fa55098_1c0e_4cf5_963c_602d47a411cc.slice/crio-0b75c946e414c097846df398082464ed05d060840e3b6cd9fe51489be3f7d043 WatchSource:0}: Error finding container 0b75c946e414c097846df398082464ed05d060840e3b6cd9fe51489be3f7d043: Status 404 returned error can't find the container with id 0b75c946e414c097846df398082464ed05d060840e3b6cd9fe51489be3f7d043 Apr 16 19:20:03.286225 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:03.286191 2582 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfefe6f16_f5b3_4e5d_bd2e_e4cd50f953ae.slice/crio-conmon-815fb8daf0f5bb2930fc2e20db17911bbe7b946e6e99534ba84439e85dafc1de.scope\": RecentStats: unable to find data in memory cache]" Apr 16 19:20:03.619039 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:03.618952 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lvp6d" event={"ID":"0fa55098-1c0e-4cf5-963c-602d47a411cc","Type":"ContainerStarted","Data":"0b75c946e414c097846df398082464ed05d060840e3b6cd9fe51489be3f7d043"} Apr 16 19:20:05.627877 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:05.627841 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lvp6d" event={"ID":"0fa55098-1c0e-4cf5-963c-602d47a411cc","Type":"ContainerStarted","Data":"3bc3dce21e7c2f75201d967f45d153c4b5e8683002956154c05a7ab3a6606a23"} Apr 16 19:20:05.627877 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:05.627878 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lvp6d" event={"ID":"0fa55098-1c0e-4cf5-963c-602d47a411cc","Type":"ContainerStarted","Data":"955526a414577fb2c73ac633cefd3acdb45848cd223f1d61858b27e8e90da3f0"} Apr 16 19:20:05.647814 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:05.647761 2582 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lvp6d" podStartSLOduration=130.9586424 podStartE2EDuration="2m12.647744951s" podCreationTimestamp="2026-04-16 19:17:53 +0000 UTC" firstStartedPulling="2026-04-16 19:20:03.063719109 +0000 UTC m=+130.558309704" lastFinishedPulling="2026-04-16 19:20:04.752821657 +0000 UTC m=+132.247412255" observedRunningTime="2026-04-16 19:20:05.646391722 +0000 UTC m=+133.140982341" watchObservedRunningTime="2026-04-16 19:20:05.647744951 +0000 UTC m=+133.142335567" Apr 16 19:20:09.031692 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:09.031660 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:20:09.035464 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:09.035440 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-797486cb67-lw5s7" Apr 16 19:20:13.326774 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:13.326738 2582 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfefe6f16_f5b3_4e5d_bd2e_e4cd50f953ae.slice/crio-conmon-815fb8daf0f5bb2930fc2e20db17911bbe7b946e6e99534ba84439e85dafc1de.scope\": RecentStats: unable to find data in memory cache]" Apr 16 19:20:13.652996 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:13.652901 2582 generic.go:358] "Generic (PLEG): container finished" podID="c83523cc-27c2-4924-9113-67ff5b311e42" containerID="673e375e7c29260bd1f132adb45ea21ef1b3d385bd090b1901983e293ddab8de" exitCode=0 Apr 16 19:20:13.652996 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:13.652945 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dtbpz" event={"ID":"c83523cc-27c2-4924-9113-67ff5b311e42","Type":"ContainerDied","Data":"673e375e7c29260bd1f132adb45ea21ef1b3d385bd090b1901983e293ddab8de"} Apr 16 19:20:13.653304 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:13.653289 2582 scope.go:117] "RemoveContainer" containerID="673e375e7c29260bd1f132adb45ea21ef1b3d385bd090b1901983e293ddab8de" Apr 16 19:20:14.657625 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:14.657588 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-dtbpz" event={"ID":"c83523cc-27c2-4924-9113-67ff5b311e42","Type":"ContainerStarted","Data":"9fc13c25f4bfbf5607a04094c2103ae5be0e4019c5af837e136b8e7b77487960"} Apr 16 19:20:16.192656 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:16.192621 2582 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfefe6f16_f5b3_4e5d_bd2e_e4cd50f953ae.slice/crio-conmon-815fb8daf0f5bb2930fc2e20db17911bbe7b946e6e99534ba84439e85dafc1de.scope\": RecentStats: unable to find data in memory cache]" Apr 16 19:20:18.671890 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:18.671852 2582 generic.go:358] "Generic (PLEG): container finished" podID="907cddc0-db0e-4159-aa65-8778fb6d6a30" containerID="d0693c9ed3941455d440d482cd9d21cb9b7f2e934ad48241fe3ace84cbc86702" exitCode=0 Apr 16 19:20:18.672274 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:18.671925 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-operator-585dfdc468-ksczz" event={"ID":"907cddc0-db0e-4159-aa65-8778fb6d6a30","Type":"ContainerDied","Data":"d0693c9ed3941455d440d482cd9d21cb9b7f2e934ad48241fe3ace84cbc86702"}
Apr 16 19:20:18.672318 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:18.672289 2582 scope.go:117] "RemoveContainer" containerID="d0693c9ed3941455d440d482cd9d21cb9b7f2e934ad48241fe3ace84cbc86702"
Apr 16 19:20:19.677592 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:19.677557 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-ksczz" event={"ID":"907cddc0-db0e-4159-aa65-8778fb6d6a30","Type":"ContainerStarted","Data":"89afd46f9a48c7e0342f88d7254af514edf93c29a801218c42fe5ca92571e31a"}
Apr 16 19:20:20.681711 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:20.681681 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb/init-config-reloader/0.log"
Apr 16 19:20:20.868845 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:20.868820 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb/alertmanager/0.log"
Apr 16 19:20:21.068825 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:21.068796 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb/config-reloader/0.log"
Apr 16 19:20:21.269416 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:21.269388 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb/kube-rbac-proxy-web/0.log"
Apr 16 19:20:21.468886 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:21.468781 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb/kube-rbac-proxy/0.log"
Apr 16 19:20:21.669080 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:21.669034 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb/kube-rbac-proxy-metric/0.log"
Apr 16 19:20:21.868779 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:21.868749 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb/prom-label-proxy/0.log"
Apr 16 19:20:22.069441 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:22.069393 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-9sbws_448c73ab-a4f5-4a5c-8143-1deb13253eec/cluster-monitoring-operator/0.log"
Apr 16 19:20:22.868590 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:22.868563 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-797486cb67-lw5s7_12ea81e7-90ab-476f-805c-836751743647/metrics-server/0.log"
Apr 16 19:20:23.370880 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:23.370845 2582 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfefe6f16_f5b3_4e5d_bd2e_e4cd50f953ae.slice/crio-conmon-815fb8daf0f5bb2930fc2e20db17911bbe7b946e6e99534ba84439e85dafc1de.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 19:20:24.469330 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:24.469301 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xf965_5e5dbd1b-6936-4ebc-83c5-9d234738556b/init-textfile/0.log"
Apr 16 19:20:24.669690 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:24.669648 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xf965_5e5dbd1b-6936-4ebc-83c5-9d234738556b/node-exporter/0.log"
Apr 16 19:20:24.869143 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:24.869114 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xf965_5e5dbd1b-6936-4ebc-83c5-9d234738556b/kube-rbac-proxy/0.log"
Apr 16 19:20:25.068915 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:25.068884 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-86pzj_fd3c9bfd-8e4a-498b-9c73-93f8b57377f5/kube-rbac-proxy-main/0.log"
Apr 16 19:20:25.269240 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:25.269193 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-86pzj_fd3c9bfd-8e4a-498b-9c73-93f8b57377f5/kube-rbac-proxy-self/0.log"
Apr 16 19:20:25.469022 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:25.468996 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-86pzj_fd3c9bfd-8e4a-498b-9c73-93f8b57377f5/openshift-state-metrics/0.log"
Apr 16 19:20:25.669766 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:25.669642 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_33cd2620-ef7c-434f-8baf-6b44b39be4c8/init-config-reloader/0.log"
Apr 16 19:20:25.870329 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:25.870295 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_33cd2620-ef7c-434f-8baf-6b44b39be4c8/prometheus/0.log"
Apr 16 19:20:26.069599 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:26.069573 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_33cd2620-ef7c-434f-8baf-6b44b39be4c8/config-reloader/0.log"
Apr 16 19:20:26.268923 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:26.268894 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_33cd2620-ef7c-434f-8baf-6b44b39be4c8/thanos-sidecar/0.log"
Apr 16 19:20:26.468723 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:26.468641 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_33cd2620-ef7c-434f-8baf-6b44b39be4c8/kube-rbac-proxy-web/0.log"
Apr 16 19:20:26.668525 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:26.668495 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_33cd2620-ef7c-434f-8baf-6b44b39be4c8/kube-rbac-proxy/0.log"
Apr 16 19:20:26.869730 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:26.869702 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_33cd2620-ef7c-434f-8baf-6b44b39be4c8/kube-rbac-proxy-thanos/0.log"
Apr 16 19:20:27.668963 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:27.668930 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-9d45d7bd4-hgrc5_ba226b86-8449-4c16-881e-753110579cfe/telemeter-client/0.log"
Apr 16 19:20:27.869564 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:27.869537 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-9d45d7bd4-hgrc5_ba226b86-8449-4c16-881e-753110579cfe/reload/0.log"
Apr 16 19:20:28.068998 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:28.068966 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-9d45d7bd4-hgrc5_ba226b86-8449-4c16-881e-753110579cfe/kube-rbac-proxy/0.log"
Apr 16 19:20:29.468960 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:29.468932 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-pl62h_04310661-51ad-4a3b-86cf-b9a2a0d1dda1/networking-console-plugin/0.log"
Apr 16 19:20:29.668652 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:29.668626 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/2.log"
Apr 16 19:20:29.870299 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:29.870268 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/3.log"
Apr 16 19:20:30.669824 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:30.669795 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5f4947fcd8-gffmg_260a217a-9aa3-43e3-9715-9255e451adff/router/0.log"
Apr 16 19:20:31.035318 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:31.035280 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:20:31.051448 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:31.051421 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:20:31.068613 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:31.068583 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dp4cw_d82ed6e1-d7aa-4d47-bcb6-f4539431d578/serve-healthcheck-canary/0.log"
Apr 16 19:20:31.219685 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:31.219649 2582 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfefe6f16_f5b3_4e5d_bd2e_e4cd50f953ae.slice/crio-conmon-815fb8daf0f5bb2930fc2e20db17911bbe7b946e6e99534ba84439e85dafc1de.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 19:20:31.731982 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:31.731951 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 19:20:33.380111 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:33.380065 2582 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfefe6f16_f5b3_4e5d_bd2e_e4cd50f953ae.slice/crio-conmon-815fb8daf0f5bb2930fc2e20db17911bbe7b946e6e99534ba84439e85dafc1de.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 19:20:34.248174 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:34.247232 2582 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfefe6f16_f5b3_4e5d_bd2e_e4cd50f953ae.slice/crio-conmon-815fb8daf0f5bb2930fc2e20db17911bbe7b946e6e99534ba84439e85dafc1de.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 19:20:34.255369 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:34.255339 2582 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfefe6f16_f5b3_4e5d_bd2e_e4cd50f953ae.slice/crio-conmon-815fb8daf0f5bb2930fc2e20db17911bbe7b946e6e99534ba84439e85dafc1de.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 19:20:43.389452 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:43.389375 2582 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfefe6f16_f5b3_4e5d_bd2e_e4cd50f953ae.slice/crio-conmon-815fb8daf0f5bb2930fc2e20db17911bbe7b946e6e99534ba84439e85dafc1de.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 19:20:44.724812 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:44.724775 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 19:20:44.725280 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:44.725239 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="alertmanager" containerID="cri-o://8eaa8ec8874cfcadb62a2e66a562b255ba745cc0ac4aab9a178d61f3b36d4ea8" gracePeriod=120
Apr 16 19:20:44.725352 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:44.725291 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="kube-rbac-proxy-metric" containerID="cri-o://3030b7ecb96ef732254a644f00e87cc38c37c2d9b13214086e5273761129d310" gracePeriod=120
Apr 16 19:20:44.725407 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:44.725345 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="config-reloader" containerID="cri-o://1b6630a011135a067c0eccef4b8570f2c24164d0ad82bd974c4f711dd32904ff" gracePeriod=120
Apr 16 19:20:44.725407 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:44.725352 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="kube-rbac-proxy" containerID="cri-o://9bb6d471c371f01d389c774d6dafa001d17930ab5de6ed7f38f3e84f2d59c5e4" gracePeriod=120
Apr 16 19:20:44.725501 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:44.725314 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="kube-rbac-proxy-web" containerID="cri-o://cbfdb923f2b0bf11fdaf0a390f5997029af58518f15426955a4e0271158ea65a" gracePeriod=120
Apr 16 19:20:44.725501 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:44.725363 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="prom-label-proxy" containerID="cri-o://8052c3f5a2706df3152a860095f783467f2bda8e65eed00a5653db9ac752cc1b" gracePeriod=120
Apr 16 19:20:45.768188 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:45.768140 2582 generic.go:358] "Generic (PLEG): container finished" podID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerID="8052c3f5a2706df3152a860095f783467f2bda8e65eed00a5653db9ac752cc1b" exitCode=0
Apr 16 19:20:45.768188 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:45.768182 2582 generic.go:358] "Generic (PLEG): container finished" podID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerID="9bb6d471c371f01d389c774d6dafa001d17930ab5de6ed7f38f3e84f2d59c5e4" exitCode=0
Apr 16 19:20:45.768188 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:45.768178 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb","Type":"ContainerDied","Data":"8052c3f5a2706df3152a860095f783467f2bda8e65eed00a5653db9ac752cc1b"}
Apr 16 19:20:45.768612 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:45.768216 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb","Type":"ContainerDied","Data":"9bb6d471c371f01d389c774d6dafa001d17930ab5de6ed7f38f3e84f2d59c5e4"}
Apr 16 19:20:45.768612 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:45.768226 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb","Type":"ContainerDied","Data":"1b6630a011135a067c0eccef4b8570f2c24164d0ad82bd974c4f711dd32904ff"}
Apr 16 19:20:45.768612 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:45.768189 2582 generic.go:358] "Generic (PLEG): container finished" podID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerID="1b6630a011135a067c0eccef4b8570f2c24164d0ad82bd974c4f711dd32904ff" exitCode=0
Apr 16 19:20:45.768612 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:45.768242 2582 generic.go:358] "Generic (PLEG): container finished" podID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerID="8eaa8ec8874cfcadb62a2e66a562b255ba745cc0ac4aab9a178d61f3b36d4ea8" exitCode=0
Apr 16 19:20:45.768612 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:45.768281 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb","Type":"ContainerDied","Data":"8eaa8ec8874cfcadb62a2e66a562b255ba745cc0ac4aab9a178d61f3b36d4ea8"}
Apr 16 19:20:45.974834 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:45.974806 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.064171 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.064063 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-main-tls\") pod \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") "
Apr 16 19:20:46.064171 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.064110 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-web-config\") pod \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") "
Apr 16 19:20:46.064171 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.064136 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-metrics-client-ca\") pod \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") "
Apr 16 19:20:46.064451 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.064280 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-alertmanager-main-db\") pod \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") "
Apr 16 19:20:46.064451 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.064325 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-kube-rbac-proxy\") pod \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") "
Apr 16 19:20:46.064451 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.064351 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-tls-assets\") pod \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") "
Apr 16 19:20:46.064451 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.064368 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-cluster-tls-config\") pod \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") "
Apr 16 19:20:46.064451 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.064419 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-config-out\") pod \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") "
Apr 16 19:20:46.064765 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.064474 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") "
Apr 16 19:20:46.064765 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.064503 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-alertmanager-trusted-ca-bundle\") pod \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") "
Apr 16 19:20:46.064765 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.064533 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-kube-rbac-proxy-web\") pod \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") "
Apr 16 19:20:46.064765 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.064556 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-config-volume\") pod \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") "
Apr 16 19:20:46.064765 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.064590 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" (UID: "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:20:46.064765 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.064601 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppgtq\" (UniqueName: \"kubernetes.io/projected/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-kube-api-access-ppgtq\") pod \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\" (UID: \"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb\") "
Apr 16 19:20:46.064765 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.064596 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" (UID: "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:20:46.065120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.064929 2582 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-alertmanager-main-db\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\""
Apr 16 19:20:46.065120 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.064953 2582 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-metrics-client-ca\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\""
Apr 16 19:20:46.065635 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.065605 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" (UID: "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:20:46.067327 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.067286 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" (UID: "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:20:46.067640 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.067613 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" (UID: "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:20:46.068477 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.068437 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" (UID: "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:20:46.068593 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.068492 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-config-out" (OuterVolumeSpecName: "config-out") pod "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" (UID: "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:20:46.068593 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.068509 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" (UID: "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:20:46.068939 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.068908 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-config-volume" (OuterVolumeSpecName: "config-volume") pod "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" (UID: "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:20:46.068939 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.068923 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" (UID: "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:20:46.069090 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.069079 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-kube-api-access-ppgtq" (OuterVolumeSpecName: "kube-api-access-ppgtq") pod "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" (UID: "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb"). InnerVolumeSpecName "kube-api-access-ppgtq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:20:46.071742 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.071705 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" (UID: "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:20:46.078776 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.078756 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-web-config" (OuterVolumeSpecName: "web-config") pod "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" (UID: "1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:20:46.165683 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.165651 2582 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-web-config\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\""
Apr 16 19:20:46.165683 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.165679 2582 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\""
Apr 16 19:20:46.165683 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.165689 2582 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-tls-assets\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\""
Apr 16 19:20:46.165890 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.165699 2582 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-cluster-tls-config\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\""
Apr 16 19:20:46.165890 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.165708 2582 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-config-out\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\""
Apr 16 19:20:46.165890 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.165717 2582 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\""
Apr 16 19:20:46.165890 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.165726 2582 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\""
Apr 16 19:20:46.165890 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.165735 2582 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\""
Apr 16 19:20:46.165890 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.165744 2582 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-config-volume\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\""
Apr 16 19:20:46.165890 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.165753 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ppgtq\" (UniqueName: \"kubernetes.io/projected/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-kube-api-access-ppgtq\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\""
Apr 16 19:20:46.165890 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.165761 2582 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb-secret-alertmanager-main-tls\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\""
Apr 16 19:20:46.194073 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:46.194043 2582 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfefe6f16_f5b3_4e5d_bd2e_e4cd50f953ae.slice/crio-conmon-815fb8daf0f5bb2930fc2e20db17911bbe7b946e6e99534ba84439e85dafc1de.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 19:20:46.773609 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.773569 2582 generic.go:358] "Generic (PLEG): container finished" podID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerID="3030b7ecb96ef732254a644f00e87cc38c37c2d9b13214086e5273761129d310" exitCode=0
Apr 16 19:20:46.773609 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.773603 2582 generic.go:358] "Generic (PLEG): container finished" podID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerID="cbfdb923f2b0bf11fdaf0a390f5997029af58518f15426955a4e0271158ea65a" exitCode=0
Apr 16 19:20:46.774091 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.773652 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb","Type":"ContainerDied","Data":"3030b7ecb96ef732254a644f00e87cc38c37c2d9b13214086e5273761129d310"}
Apr 16 19:20:46.774091 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.773685 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.774091 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.773691 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb","Type":"ContainerDied","Data":"cbfdb923f2b0bf11fdaf0a390f5997029af58518f15426955a4e0271158ea65a"}
Apr 16 19:20:46.774091 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.773706 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb","Type":"ContainerDied","Data":"7af680162d74e838bbe83259f86971035bb41b72444eb940e23c4de26e35059b"}
Apr 16 19:20:46.774091 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.773725 2582 scope.go:117] "RemoveContainer" containerID="8052c3f5a2706df3152a860095f783467f2bda8e65eed00a5653db9ac752cc1b"
Apr 16 19:20:46.782465 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.782437 2582 scope.go:117] "RemoveContainer" containerID="3030b7ecb96ef732254a644f00e87cc38c37c2d9b13214086e5273761129d310"
Apr 16 19:20:46.792798 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.791561 2582 scope.go:117] "RemoveContainer" containerID="9bb6d471c371f01d389c774d6dafa001d17930ab5de6ed7f38f3e84f2d59c5e4"
Apr 16 19:20:46.798803 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.798783 2582 scope.go:117] "RemoveContainer" containerID="cbfdb923f2b0bf11fdaf0a390f5997029af58518f15426955a4e0271158ea65a"
Apr 16 19:20:46.800829 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.800803 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 19:20:46.806651 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.806630 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 19:20:46.811032 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.811014 2582 scope.go:117] "RemoveContainer" containerID="1b6630a011135a067c0eccef4b8570f2c24164d0ad82bd974c4f711dd32904ff"
Apr 16 19:20:46.818093 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.818064 2582 scope.go:117] "RemoveContainer" containerID="8eaa8ec8874cfcadb62a2e66a562b255ba745cc0ac4aab9a178d61f3b36d4ea8"
Apr 16 19:20:46.824855 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.824836 2582 scope.go:117] "RemoveContainer" containerID="67cc6c387bc5dd8c87fe2d99dc552fbb41c9128ab2bda3caf096233b1adb3d02"
Apr 16 19:20:46.831525 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.831503 2582 scope.go:117] "RemoveContainer" containerID="8052c3f5a2706df3152a860095f783467f2bda8e65eed00a5653db9ac752cc1b"
Apr 16 19:20:46.831764 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:46.831745 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8052c3f5a2706df3152a860095f783467f2bda8e65eed00a5653db9ac752cc1b\": container with ID starting with 8052c3f5a2706df3152a860095f783467f2bda8e65eed00a5653db9ac752cc1b not found: ID does not exist" containerID="8052c3f5a2706df3152a860095f783467f2bda8e65eed00a5653db9ac752cc1b"
Apr 16 19:20:46.831817 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.831775 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8052c3f5a2706df3152a860095f783467f2bda8e65eed00a5653db9ac752cc1b"} err="failed to get container status \"8052c3f5a2706df3152a860095f783467f2bda8e65eed00a5653db9ac752cc1b\": rpc error: code = NotFound desc = could not find container \"8052c3f5a2706df3152a860095f783467f2bda8e65eed00a5653db9ac752cc1b\": container with ID starting with 8052c3f5a2706df3152a860095f783467f2bda8e65eed00a5653db9ac752cc1b not found: ID does not exist"
Apr 16 19:20:46.831817 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.831795 2582 scope.go:117] "RemoveContainer" containerID="3030b7ecb96ef732254a644f00e87cc38c37c2d9b13214086e5273761129d310"
Apr 16 19:20:46.831990 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:46.831975 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3030b7ecb96ef732254a644f00e87cc38c37c2d9b13214086e5273761129d310\": container with ID starting with 3030b7ecb96ef732254a644f00e87cc38c37c2d9b13214086e5273761129d310 not found: ID does not exist" containerID="3030b7ecb96ef732254a644f00e87cc38c37c2d9b13214086e5273761129d310"
Apr 16 19:20:46.832032 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.831993 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3030b7ecb96ef732254a644f00e87cc38c37c2d9b13214086e5273761129d310"} err="failed to get container status \"3030b7ecb96ef732254a644f00e87cc38c37c2d9b13214086e5273761129d310\": rpc error: code = NotFound desc = could not find container \"3030b7ecb96ef732254a644f00e87cc38c37c2d9b13214086e5273761129d310\": container with ID starting with 3030b7ecb96ef732254a644f00e87cc38c37c2d9b13214086e5273761129d310 not found: ID does not exist"
Apr 16 19:20:46.832032 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.832006 2582 scope.go:117] "RemoveContainer" containerID="9bb6d471c371f01d389c774d6dafa001d17930ab5de6ed7f38f3e84f2d59c5e4"
Apr 16 19:20:46.832254 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:46.832238 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bb6d471c371f01d389c774d6dafa001d17930ab5de6ed7f38f3e84f2d59c5e4\": container with ID starting with 9bb6d471c371f01d389c774d6dafa001d17930ab5de6ed7f38f3e84f2d59c5e4 not found: ID does not exist" containerID="9bb6d471c371f01d389c774d6dafa001d17930ab5de6ed7f38f3e84f2d59c5e4"
Apr 16 19:20:46.832310 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.832258 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bb6d471c371f01d389c774d6dafa001d17930ab5de6ed7f38f3e84f2d59c5e4"} err="failed to get container status \"9bb6d471c371f01d389c774d6dafa001d17930ab5de6ed7f38f3e84f2d59c5e4\": rpc error: code = NotFound desc = could not find container \"9bb6d471c371f01d389c774d6dafa001d17930ab5de6ed7f38f3e84f2d59c5e4\": container with ID starting with 9bb6d471c371f01d389c774d6dafa001d17930ab5de6ed7f38f3e84f2d59c5e4 not found: ID does not exist"
Apr 16 19:20:46.832310 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.832272 2582 scope.go:117] "RemoveContainer" containerID="cbfdb923f2b0bf11fdaf0a390f5997029af58518f15426955a4e0271158ea65a"
Apr 16 19:20:46.832506 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:46.832490 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbfdb923f2b0bf11fdaf0a390f5997029af58518f15426955a4e0271158ea65a\": container with ID starting with cbfdb923f2b0bf11fdaf0a390f5997029af58518f15426955a4e0271158ea65a not found: ID does not exist" containerID="cbfdb923f2b0bf11fdaf0a390f5997029af58518f15426955a4e0271158ea65a"
Apr 16 19:20:46.832554 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.832519 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbfdb923f2b0bf11fdaf0a390f5997029af58518f15426955a4e0271158ea65a"} err="failed to get container status \"cbfdb923f2b0bf11fdaf0a390f5997029af58518f15426955a4e0271158ea65a\": rpc error: code = NotFound desc = could not find container \"cbfdb923f2b0bf11fdaf0a390f5997029af58518f15426955a4e0271158ea65a\": container with ID starting with cbfdb923f2b0bf11fdaf0a390f5997029af58518f15426955a4e0271158ea65a not found: ID does not exist"
Apr 16 19:20:46.832554 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.832534 2582 scope.go:117] "RemoveContainer" containerID="1b6630a011135a067c0eccef4b8570f2c24164d0ad82bd974c4f711dd32904ff"
Apr 16 19:20:46.832737 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:46.832721 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b6630a011135a067c0eccef4b8570f2c24164d0ad82bd974c4f711dd32904ff\": container with ID starting with 1b6630a011135a067c0eccef4b8570f2c24164d0ad82bd974c4f711dd32904ff not found: ID does not exist" containerID="1b6630a011135a067c0eccef4b8570f2c24164d0ad82bd974c4f711dd32904ff"
Apr 16 19:20:46.832801 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.832741 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b6630a011135a067c0eccef4b8570f2c24164d0ad82bd974c4f711dd32904ff"} err="failed to get container status \"1b6630a011135a067c0eccef4b8570f2c24164d0ad82bd974c4f711dd32904ff\": rpc error: code = NotFound desc = could not find container \"1b6630a011135a067c0eccef4b8570f2c24164d0ad82bd974c4f711dd32904ff\": container with ID starting with 1b6630a011135a067c0eccef4b8570f2c24164d0ad82bd974c4f711dd32904ff not found: ID does not exist"
Apr 16 19:20:46.832801 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.832753 2582 scope.go:117] "RemoveContainer" containerID="8eaa8ec8874cfcadb62a2e66a562b255ba745cc0ac4aab9a178d61f3b36d4ea8"
Apr 16 19:20:46.832992 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:46.832973 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eaa8ec8874cfcadb62a2e66a562b255ba745cc0ac4aab9a178d61f3b36d4ea8\": container with ID starting with 8eaa8ec8874cfcadb62a2e66a562b255ba745cc0ac4aab9a178d61f3b36d4ea8 not found: ID does not exist" containerID="8eaa8ec8874cfcadb62a2e66a562b255ba745cc0ac4aab9a178d61f3b36d4ea8"
Apr 16 19:20:46.833050 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.832995 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eaa8ec8874cfcadb62a2e66a562b255ba745cc0ac4aab9a178d61f3b36d4ea8"} err="failed to get container status \"8eaa8ec8874cfcadb62a2e66a562b255ba745cc0ac4aab9a178d61f3b36d4ea8\": rpc error: code = NotFound desc = could not find container \"8eaa8ec8874cfcadb62a2e66a562b255ba745cc0ac4aab9a178d61f3b36d4ea8\": container with ID starting with 8eaa8ec8874cfcadb62a2e66a562b255ba745cc0ac4aab9a178d61f3b36d4ea8 not found: ID does not exist"
Apr 16 19:20:46.833050 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.833006 2582 scope.go:117] "RemoveContainer" containerID="67cc6c387bc5dd8c87fe2d99dc552fbb41c9128ab2bda3caf096233b1adb3d02"
Apr 16 19:20:46.833248 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:46.833230 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67cc6c387bc5dd8c87fe2d99dc552fbb41c9128ab2bda3caf096233b1adb3d02\": container with ID starting with 67cc6c387bc5dd8c87fe2d99dc552fbb41c9128ab2bda3caf096233b1adb3d02 not found: ID does not exist" containerID="67cc6c387bc5dd8c87fe2d99dc552fbb41c9128ab2bda3caf096233b1adb3d02"
Apr 16 19:20:46.833297 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.833253 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67cc6c387bc5dd8c87fe2d99dc552fbb41c9128ab2bda3caf096233b1adb3d02"} err="failed to get container status \"67cc6c387bc5dd8c87fe2d99dc552fbb41c9128ab2bda3caf096233b1adb3d02\": rpc error: code = NotFound desc = could not find container \"67cc6c387bc5dd8c87fe2d99dc552fbb41c9128ab2bda3caf096233b1adb3d02\": container with ID starting with 67cc6c387bc5dd8c87fe2d99dc552fbb41c9128ab2bda3caf096233b1adb3d02 not found: ID does not exist"
Apr 16 19:20:46.833297 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.833272 2582 scope.go:117] "RemoveContainer" containerID="8052c3f5a2706df3152a860095f783467f2bda8e65eed00a5653db9ac752cc1b"
Apr 16 19:20:46.833513 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.833493 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8052c3f5a2706df3152a860095f783467f2bda8e65eed00a5653db9ac752cc1b"} err="failed to get container status \"8052c3f5a2706df3152a860095f783467f2bda8e65eed00a5653db9ac752cc1b\": rpc error: code = NotFound desc = could not find container \"8052c3f5a2706df3152a860095f783467f2bda8e65eed00a5653db9ac752cc1b\": container with ID starting with 8052c3f5a2706df3152a860095f783467f2bda8e65eed00a5653db9ac752cc1b not found: ID does not exist"
Apr 16 19:20:46.833558 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.833514 2582 scope.go:117] "RemoveContainer" containerID="3030b7ecb96ef732254a644f00e87cc38c37c2d9b13214086e5273761129d310"
Apr 16 19:20:46.833705 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.833687 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3030b7ecb96ef732254a644f00e87cc38c37c2d9b13214086e5273761129d310"} err="failed to get container status \"3030b7ecb96ef732254a644f00e87cc38c37c2d9b13214086e5273761129d310\": rpc error: code = NotFound desc = could not find container \"3030b7ecb96ef732254a644f00e87cc38c37c2d9b13214086e5273761129d310\": container with ID starting with 3030b7ecb96ef732254a644f00e87cc38c37c2d9b13214086e5273761129d310 not found: ID does not exist"
Apr 16 19:20:46.833705 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.833707 2582 scope.go:117] "RemoveContainer" containerID="9bb6d471c371f01d389c774d6dafa001d17930ab5de6ed7f38f3e84f2d59c5e4"
Apr 16 19:20:46.833911 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.833894 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bb6d471c371f01d389c774d6dafa001d17930ab5de6ed7f38f3e84f2d59c5e4"} err="failed to get container status \"9bb6d471c371f01d389c774d6dafa001d17930ab5de6ed7f38f3e84f2d59c5e4\": rpc error: code = NotFound desc = could not find container \"9bb6d471c371f01d389c774d6dafa001d17930ab5de6ed7f38f3e84f2d59c5e4\": container with ID starting with 9bb6d471c371f01d389c774d6dafa001d17930ab5de6ed7f38f3e84f2d59c5e4 not found: ID does not exist"
Apr 16 19:20:46.833911 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.833910 2582 scope.go:117] "RemoveContainer" containerID="cbfdb923f2b0bf11fdaf0a390f5997029af58518f15426955a4e0271158ea65a"
Apr 16 19:20:46.834136 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.834117 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbfdb923f2b0bf11fdaf0a390f5997029af58518f15426955a4e0271158ea65a"} err="failed to get container status \"cbfdb923f2b0bf11fdaf0a390f5997029af58518f15426955a4e0271158ea65a\": rpc error: code = NotFound desc = could not find container \"cbfdb923f2b0bf11fdaf0a390f5997029af58518f15426955a4e0271158ea65a\": container with ID starting with cbfdb923f2b0bf11fdaf0a390f5997029af58518f15426955a4e0271158ea65a not found: ID does not exist"
Apr 16 19:20:46.834243 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.834136 2582 scope.go:117] "RemoveContainer" containerID="1b6630a011135a067c0eccef4b8570f2c24164d0ad82bd974c4f711dd32904ff"
Apr 16 19:20:46.834375 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.834359 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b6630a011135a067c0eccef4b8570f2c24164d0ad82bd974c4f711dd32904ff"} err="failed to get container status \"1b6630a011135a067c0eccef4b8570f2c24164d0ad82bd974c4f711dd32904ff\": rpc error: code = NotFound desc = could not find container \"1b6630a011135a067c0eccef4b8570f2c24164d0ad82bd974c4f711dd32904ff\": container with ID starting with 1b6630a011135a067c0eccef4b8570f2c24164d0ad82bd974c4f711dd32904ff not found: ID does not exist"
Apr 16 19:20:46.834433 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.834375 2582 scope.go:117] "RemoveContainer" containerID="8eaa8ec8874cfcadb62a2e66a562b255ba745cc0ac4aab9a178d61f3b36d4ea8"
Apr 16 19:20:46.834560 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.834543 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eaa8ec8874cfcadb62a2e66a562b255ba745cc0ac4aab9a178d61f3b36d4ea8"} err="failed to get container status \"8eaa8ec8874cfcadb62a2e66a562b255ba745cc0ac4aab9a178d61f3b36d4ea8\": rpc error: code = NotFound desc = could not find container \"8eaa8ec8874cfcadb62a2e66a562b255ba745cc0ac4aab9a178d61f3b36d4ea8\": container with ID starting with 8eaa8ec8874cfcadb62a2e66a562b255ba745cc0ac4aab9a178d61f3b36d4ea8 not found: ID does not exist"
Apr 16 19:20:46.834620 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.834561 2582 scope.go:117] "RemoveContainer" containerID="67cc6c387bc5dd8c87fe2d99dc552fbb41c9128ab2bda3caf096233b1adb3d02"
Apr 16 19:20:46.834746 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.834727 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67cc6c387bc5dd8c87fe2d99dc552fbb41c9128ab2bda3caf096233b1adb3d02"} err="failed to get container status \"67cc6c387bc5dd8c87fe2d99dc552fbb41c9128ab2bda3caf096233b1adb3d02\": rpc error: code = NotFound desc = could not find container \"67cc6c387bc5dd8c87fe2d99dc552fbb41c9128ab2bda3caf096233b1adb3d02\": container with ID starting with 67cc6c387bc5dd8c87fe2d99dc552fbb41c9128ab2bda3caf096233b1adb3d02 not found: ID does not exist"
Apr 16 19:20:46.838415 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838392 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 19:20:46.838824 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838809 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="init-config-reloader"
Apr 16 19:20:46.838871 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838827 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="init-config-reloader"
Apr 16 19:20:46.838871 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838838 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="config-reloader"
Apr 16 19:20:46.838871 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838844 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="config-reloader"
Apr 16 19:20:46.838871 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838852 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="kube-rbac-proxy-web"
Apr 16 19:20:46.838871 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838857 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="kube-rbac-proxy-web"
Apr 16 19:20:46.838871 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838868 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="alertmanager"
Apr 16 19:20:46.838871 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838873 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="alertmanager"
Apr 16 19:20:46.839066 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838880 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="prom-label-proxy"
Apr 16 19:20:46.839066 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838885 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="prom-label-proxy"
Apr 16 19:20:46.839066 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838900 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="kube-rbac-proxy"
Apr 16 19:20:46.839066 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838905 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="kube-rbac-proxy"
Apr 16 19:20:46.839066 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838915 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fff8270c-4771-4655-8abd-7341281f3173" containerName="registry"
Apr 16 19:20:46.839066 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838920 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff8270c-4771-4655-8abd-7341281f3173" containerName="registry"
Apr 16 19:20:46.839066 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838927 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="kube-rbac-proxy-metric"
Apr 16 19:20:46.839066 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838932 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="kube-rbac-proxy-metric"
Apr 16 19:20:46.839066 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838977 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="prom-label-proxy"
Apr 16 19:20:46.839066 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838987 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="alertmanager"
Apr 16 19:20:46.839066 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.838996 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="kube-rbac-proxy-web"
Apr 16 19:20:46.839066 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.839002 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="fff8270c-4771-4655-8abd-7341281f3173" containerName="registry"
Apr 16 19:20:46.839066 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.839007 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="config-reloader"
Apr 16 19:20:46.839066 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.839013 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="kube-rbac-proxy"
Apr 16 19:20:46.839066 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.839019 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" containerName="kube-rbac-proxy-metric"
Apr 16 19:20:46.842540 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.842525 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.845181 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.845136 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 19:20:46.845181 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.845136 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 19:20:46.845340 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.845216 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 19:20:46.845340 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.845277 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-zv6fv\""
Apr 16 19:20:46.845442 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.845382 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 19:20:46.845555 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.845541 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 19:20:46.845642 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.845625 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 19:20:46.845697 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.845684 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 19:20:46.845743 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.845703 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 19:20:46.850678 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.850659 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 19:20:46.854080 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.854061 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 19:20:46.871758 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.871730 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.871882 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.871769 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.871882 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.871796 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-web-config\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.871882 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.871822 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.871882 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.871852 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-config-volume\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.872078 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.871943 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.872078 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.871984 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.872078 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.872006 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.872078 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.872024 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgndf\" (UniqueName: \"kubernetes.io/projected/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-kube-api-access-jgndf\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.872078 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.872062 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.872301 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.872143 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.872301 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.872198 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-config-out\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.872301 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.872221 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.972725 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.972686 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.972930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.972737 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.972930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.972762 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.972930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.972783 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgndf\" (UniqueName: \"kubernetes.io/projected/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-kube-api-access-jgndf\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.972930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.972802 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.972930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.972829 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.972930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.972851 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-config-out\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.972930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.972870 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.973336 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.973012 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.973336 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.973184 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.973336 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.973221 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-web-config\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.973336 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.973249 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.973336 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.973287 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-config-volume\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.973336 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.973325 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.973906 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.973717 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.974168 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.974126 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.975771 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.975747 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-config-out\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.976428 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.976383 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.976560 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.976539 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-web-config\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.976627 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.976584 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-config-volume\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.976627 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.976605 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.976627 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.976585 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.976744 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.976670 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 19:20:46.976843 ip-10-0-128-123 kubenswrapper[2582]: I0416
19:20:46.976825 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:20:46.977854 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.977831 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:20:46.981197 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:46.981177 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgndf\" (UniqueName: \"kubernetes.io/projected/b1bf3a4d-389a-44ac-9ef1-cef2367546aa-kube-api-access-jgndf\") pod \"alertmanager-main-0\" (UID: \"b1bf3a4d-389a-44ac-9ef1-cef2367546aa\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:20:47.114548 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:47.114458 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb" path="/var/lib/kubelet/pods/1065d5ff-722f-4ff6-ab0d-b67bd5e87dbb/volumes" Apr 16 19:20:47.152293 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:47.152266 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 19:20:47.355642 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:20:47.355607 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1bf3a4d_389a_44ac_9ef1_cef2367546aa.slice/crio-d4bb529a768ed1fdae89b5e259e777c9c5dabb49ebf12d6adbecbc1bbee6ff4a WatchSource:0}: Error finding container d4bb529a768ed1fdae89b5e259e777c9c5dabb49ebf12d6adbecbc1bbee6ff4a: Status 404 returned error can't find the container with id d4bb529a768ed1fdae89b5e259e777c9c5dabb49ebf12d6adbecbc1bbee6ff4a Apr 16 19:20:47.355857 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:47.355837 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 19:20:47.781691 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:47.781651 2582 generic.go:358] "Generic (PLEG): container finished" podID="b1bf3a4d-389a-44ac-9ef1-cef2367546aa" containerID="45dd250d4551e432d00c4dc63607df5a96da5dca263f420683335c425b725a06" exitCode=0 Apr 16 19:20:47.782080 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:47.781694 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1bf3a4d-389a-44ac-9ef1-cef2367546aa","Type":"ContainerDied","Data":"45dd250d4551e432d00c4dc63607df5a96da5dca263f420683335c425b725a06"} Apr 16 19:20:47.782080 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:47.781732 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1bf3a4d-389a-44ac-9ef1-cef2367546aa","Type":"ContainerStarted","Data":"d4bb529a768ed1fdae89b5e259e777c9c5dabb49ebf12d6adbecbc1bbee6ff4a"} Apr 16 19:20:48.796738 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:48.796703 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"b1bf3a4d-389a-44ac-9ef1-cef2367546aa","Type":"ContainerStarted","Data":"d3bb938f679fe173ea4ab03269ab4475c4c4e1c0f6078fd1732685de71c82dac"} Apr 16 19:20:48.796738 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:48.796741 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1bf3a4d-389a-44ac-9ef1-cef2367546aa","Type":"ContainerStarted","Data":"b9dd4217bdf170037b5b09dce07f29c0d88ff75614dd6617dc8fc6a385377d51"} Apr 16 19:20:48.797323 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:48.796755 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1bf3a4d-389a-44ac-9ef1-cef2367546aa","Type":"ContainerStarted","Data":"e20d343507ce5dd0a300d2da376ac656e6638fc1375f5ecd912cd181b80033c2"} Apr 16 19:20:48.797323 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:48.796767 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1bf3a4d-389a-44ac-9ef1-cef2367546aa","Type":"ContainerStarted","Data":"6871db0ed91ffd36581e2e2bbae2e22261af9b69725889a538e5b4516f838a02"} Apr 16 19:20:48.797323 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:48.796777 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1bf3a4d-389a-44ac-9ef1-cef2367546aa","Type":"ContainerStarted","Data":"8b61d59a98ba3083a4e7e41c90e5f20dad17556284d3ee99ef4593271f2b7d2e"} Apr 16 19:20:48.797323 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:48.796787 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b1bf3a4d-389a-44ac-9ef1-cef2367546aa","Type":"ContainerStarted","Data":"eef7624d4c12faba8321f9af4961b3a6e2378022b50acdaf4c6489faf73e9599"} Apr 16 19:20:48.826979 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:48.826923 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.826906281 podStartE2EDuration="2.826906281s" podCreationTimestamp="2026-04-16 19:20:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:20:48.825032407 +0000 UTC m=+176.319623048" watchObservedRunningTime="2026-04-16 19:20:48.826906281 +0000 UTC m=+176.321496897" Apr 16 19:20:48.982412 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:48.982377 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:20:48.982823 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:48.982795 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="prometheus" containerID="cri-o://35d8978b99d1910976ae758e3415cf3450f18f7ef95491faaa70ff91c758d001" gracePeriod=600 Apr 16 19:20:48.982905 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:48.982823 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="kube-rbac-proxy-web" containerID="cri-o://5cc58e70539715ac110857df30400c7e663618c4fd2a94c800f775719d38c759" gracePeriod=600 Apr 16 19:20:48.982905 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:48.982846 2582 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/prometheus-k8s-0" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="thanos-sidecar" containerID="cri-o://dfc587a1d60607c3801621b24c59dd279bd59b606a12322137f316a8bea65c09" gracePeriod=600 Apr 16 19:20:48.983006 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:48.982858 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="config-reloader" containerID="cri-o://3e03a40e36bfcc5516aa09df567e991d091695a43b0d5c52406837dad4ea26bb" gracePeriod=600 Apr 16 19:20:48.983006 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:48.982943 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="kube-rbac-proxy" containerID="cri-o://602fef8860e23884627a98b7062925db5cd0ee1b6fd85ad24d2f5170413b4f2e" gracePeriod=600 Apr 16 19:20:48.983144 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:48.982806 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="kube-rbac-proxy-thanos" containerID="cri-o://008a50a43f03b73bd90d9aab0d31cf3150d6006e468c1c522fc7ada266eebe45" gracePeriod=600 Apr 16 19:20:49.803891 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:49.803860 2582 generic.go:358] "Generic (PLEG): container finished" podID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerID="008a50a43f03b73bd90d9aab0d31cf3150d6006e468c1c522fc7ada266eebe45" exitCode=0 Apr 16 19:20:49.803891 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:49.803887 2582 generic.go:358] "Generic (PLEG): container finished" podID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerID="602fef8860e23884627a98b7062925db5cd0ee1b6fd85ad24d2f5170413b4f2e" exitCode=0 Apr 16 19:20:49.803891 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:49.803895 2582 generic.go:358] "Generic (PLEG): container finished" podID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerID="dfc587a1d60607c3801621b24c59dd279bd59b606a12322137f316a8bea65c09" exitCode=0 Apr 16 19:20:49.804273 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:49.803902 2582 generic.go:358] "Generic (PLEG): container finished" podID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerID="3e03a40e36bfcc5516aa09df567e991d091695a43b0d5c52406837dad4ea26bb" exitCode=0 Apr 16 19:20:49.804273 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:49.803910 2582 generic.go:358] "Generic (PLEG): container finished" podID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerID="35d8978b99d1910976ae758e3415cf3450f18f7ef95491faaa70ff91c758d001" exitCode=0 Apr 16 19:20:49.804273 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:49.803930 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"33cd2620-ef7c-434f-8baf-6b44b39be4c8","Type":"ContainerDied","Data":"008a50a43f03b73bd90d9aab0d31cf3150d6006e468c1c522fc7ada266eebe45"} Apr 16 19:20:49.804273 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:49.803963 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"33cd2620-ef7c-434f-8baf-6b44b39be4c8","Type":"ContainerDied","Data":"602fef8860e23884627a98b7062925db5cd0ee1b6fd85ad24d2f5170413b4f2e"} Apr 16 19:20:49.804273 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:49.803975 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"33cd2620-ef7c-434f-8baf-6b44b39be4c8","Type":"ContainerDied","Data":"dfc587a1d60607c3801621b24c59dd279bd59b606a12322137f316a8bea65c09"} Apr 16 19:20:49.804273 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:49.803987 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"33cd2620-ef7c-434f-8baf-6b44b39be4c8","Type":"ContainerDied","Data":"3e03a40e36bfcc5516aa09df567e991d091695a43b0d5c52406837dad4ea26bb"} Apr 16 19:20:49.804273 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:49.804011 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"33cd2620-ef7c-434f-8baf-6b44b39be4c8","Type":"ContainerDied","Data":"35d8978b99d1910976ae758e3415cf3450f18f7ef95491faaa70ff91c758d001"} Apr 16 19:20:50.246930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.246906 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.302920 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.302887 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/33cd2620-ef7c-434f-8baf-6b44b39be4c8-config-out\") pod \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " Apr 16 19:20:50.303092 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.302934 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-prometheus-k8s-tls\") pod \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " Apr 16 19:20:50.303092 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.302989 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " Apr 16 19:20:50.303092 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.303015 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-configmap-metrics-client-ca\") pod \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " Apr 16 19:20:50.303092 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.303047 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/33cd2620-ef7c-434f-8baf-6b44b39be4c8-prometheus-k8s-db\") pod \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " Apr 16 19:20:50.303092 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.303078 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-web-config\") pod \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " Apr 16 19:20:50.303392 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.303103 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-grpc-tls\") pod \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " Apr 16 19:20:50.303392 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.303140 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-prometheus-k8s-rulefiles-0\") pod \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " Apr 16 19:20:50.303392 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.303183 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/33cd2620-ef7c-434f-8baf-6b44b39be4c8-tls-assets\") pod \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " Apr 16 19:20:50.303392 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.303209 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-thanos-prometheus-http-client-file\") pod \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " Apr 16 19:20:50.303392 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.303261 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-prometheus-trusted-ca-bundle\") pod \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " Apr 16 19:20:50.303392 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.303288 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-config\") pod \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " Apr 16 19:20:50.303392 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.303335 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-metrics-client-certs\") pod \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " Apr 16 19:20:50.303392 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.303362 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-configmap-kubelet-serving-ca-bundle\") pod \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " Apr 16 19:20:50.303773 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.303422 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brgfp\" (UniqueName: \"kubernetes.io/projected/33cd2620-ef7c-434f-8baf-6b44b39be4c8-kube-api-access-brgfp\") pod \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " Apr 16 19:20:50.303773 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.303453 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-configmap-serving-certs-ca-bundle\") pod \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " Apr 16 19:20:50.303773 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.303478 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-kube-rbac-proxy\") pod \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " Apr 16 19:20:50.303773 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.303507 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\" (UID: \"33cd2620-ef7c-434f-8baf-6b44b39be4c8\") " Apr 16 19:20:50.303773 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.303555 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "33cd2620-ef7c-434f-8baf-6b44b39be4c8" (UID: "33cd2620-ef7c-434f-8baf-6b44b39be4c8"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:20:50.304010 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.303786 2582 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-configmap-metrics-client-ca\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:50.304865 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.304832 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "33cd2620-ef7c-434f-8baf-6b44b39be4c8" (UID: "33cd2620-ef7c-434f-8baf-6b44b39be4c8"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:20:50.304993 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.304969 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33cd2620-ef7c-434f-8baf-6b44b39be4c8-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "33cd2620-ef7c-434f-8baf-6b44b39be4c8" (UID: "33cd2620-ef7c-434f-8baf-6b44b39be4c8"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:20:50.305842 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.305687 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "33cd2620-ef7c-434f-8baf-6b44b39be4c8" (UID: "33cd2620-ef7c-434f-8baf-6b44b39be4c8"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:20:50.305842 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.305698 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "33cd2620-ef7c-434f-8baf-6b44b39be4c8" (UID: "33cd2620-ef7c-434f-8baf-6b44b39be4c8"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:20:50.305842 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.305768 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "33cd2620-ef7c-434f-8baf-6b44b39be4c8" (UID: "33cd2620-ef7c-434f-8baf-6b44b39be4c8"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:20:50.306433 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.306283 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33cd2620-ef7c-434f-8baf-6b44b39be4c8-config-out" (OuterVolumeSpecName: "config-out") pod "33cd2620-ef7c-434f-8baf-6b44b39be4c8" (UID: "33cd2620-ef7c-434f-8baf-6b44b39be4c8"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:20:50.306701 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.306664 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "33cd2620-ef7c-434f-8baf-6b44b39be4c8" (UID: "33cd2620-ef7c-434f-8baf-6b44b39be4c8"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:20:50.307850 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.307815 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "33cd2620-ef7c-434f-8baf-6b44b39be4c8" (UID: "33cd2620-ef7c-434f-8baf-6b44b39be4c8"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:20:50.308883 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.308845 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "33cd2620-ef7c-434f-8baf-6b44b39be4c8" (UID: "33cd2620-ef7c-434f-8baf-6b44b39be4c8"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:20:50.309224 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.309124 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33cd2620-ef7c-434f-8baf-6b44b39be4c8-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "33cd2620-ef7c-434f-8baf-6b44b39be4c8" (UID: "33cd2620-ef7c-434f-8baf-6b44b39be4c8"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:20:50.309224 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.309169 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "33cd2620-ef7c-434f-8baf-6b44b39be4c8" (UID: "33cd2620-ef7c-434f-8baf-6b44b39be4c8"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:20:50.309224 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.309203 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "33cd2620-ef7c-434f-8baf-6b44b39be4c8" (UID: "33cd2620-ef7c-434f-8baf-6b44b39be4c8"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:20:50.309531 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.309507 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "33cd2620-ef7c-434f-8baf-6b44b39be4c8" (UID: "33cd2620-ef7c-434f-8baf-6b44b39be4c8"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:20:50.309631 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.309534 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-config" (OuterVolumeSpecName: "config") pod "33cd2620-ef7c-434f-8baf-6b44b39be4c8" (UID: "33cd2620-ef7c-434f-8baf-6b44b39be4c8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:20:50.309703 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.309679 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "33cd2620-ef7c-434f-8baf-6b44b39be4c8" (UID: "33cd2620-ef7c-434f-8baf-6b44b39be4c8"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:20:50.309853 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.309828 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33cd2620-ef7c-434f-8baf-6b44b39be4c8-kube-api-access-brgfp" (OuterVolumeSpecName: "kube-api-access-brgfp") pod "33cd2620-ef7c-434f-8baf-6b44b39be4c8" (UID: "33cd2620-ef7c-434f-8baf-6b44b39be4c8"). InnerVolumeSpecName "kube-api-access-brgfp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:20:50.318681 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.318657 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-web-config" (OuterVolumeSpecName: "web-config") pod "33cd2620-ef7c-434f-8baf-6b44b39be4c8" (UID: "33cd2620-ef7c-434f-8baf-6b44b39be4c8"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:20:50.405027 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.404995 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-brgfp\" (UniqueName: \"kubernetes.io/projected/33cd2620-ef7c-434f-8baf-6b44b39be4c8-kube-api-access-brgfp\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:50.405027 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.405026 2582 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:50.405027 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.405036 2582 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-kube-rbac-proxy\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:50.405285 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.405046 2582 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:50.405285 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.405057 2582 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/33cd2620-ef7c-434f-8baf-6b44b39be4c8-config-out\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:50.405285 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.405067 2582 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-prometheus-k8s-tls\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:50.405285 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.405077 2582 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:50.405285 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.405087 2582 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/33cd2620-ef7c-434f-8baf-6b44b39be4c8-prometheus-k8s-db\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:50.405285 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.405097 2582 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-web-config\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:50.405285 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.405105 2582 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-grpc-tls\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:50.405285 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.405113 2582 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:50.405285 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.405121 2582 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/33cd2620-ef7c-434f-8baf-6b44b39be4c8-tls-assets\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:50.405285 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.405131 2582 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-thanos-prometheus-http-client-file\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:50.405285 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.405141 2582 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-prometheus-trusted-ca-bundle\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:50.405285 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.405163 2582 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-config\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:50.405285 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.405176 2582 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/33cd2620-ef7c-434f-8baf-6b44b39be4c8-secret-metrics-client-certs\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:50.405285 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.405190 2582 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33cd2620-ef7c-434f-8baf-6b44b39be4c8-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\"" Apr 16 19:20:50.810130 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.810097 2582 generic.go:358] "Generic (PLEG): container finished" podID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerID="5cc58e70539715ac110857df30400c7e663618c4fd2a94c800f775719d38c759" exitCode=0 Apr 16 19:20:50.810541 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.810188 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"33cd2620-ef7c-434f-8baf-6b44b39be4c8","Type":"ContainerDied","Data":"5cc58e70539715ac110857df30400c7e663618c4fd2a94c800f775719d38c759"} Apr 16 19:20:50.810541 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.810242 2582 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.810541 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.810257 2582 scope.go:117] "RemoveContainer" containerID="008a50a43f03b73bd90d9aab0d31cf3150d6006e468c1c522fc7ada266eebe45" Apr 16 19:20:50.810541 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.810242 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"33cd2620-ef7c-434f-8baf-6b44b39be4c8","Type":"ContainerDied","Data":"adf1982795a8599ea8a065681f077692f8beaaa39e86ef293f9c2b0b3ed05511"} Apr 16 19:20:50.818399 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.818230 2582 scope.go:117] "RemoveContainer" containerID="602fef8860e23884627a98b7062925db5cd0ee1b6fd85ad24d2f5170413b4f2e" Apr 16 19:20:50.825313 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.825295 2582 scope.go:117] "RemoveContainer" containerID="5cc58e70539715ac110857df30400c7e663618c4fd2a94c800f775719d38c759" Apr 16 19:20:50.831775 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.831758 2582 scope.go:117] "RemoveContainer" containerID="dfc587a1d60607c3801621b24c59dd279bd59b606a12322137f316a8bea65c09" Apr 16 19:20:50.834318 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.834296 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:20:50.838488 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.838464 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:20:50.839205 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.839183 2582 scope.go:117] "RemoveContainer" containerID="3e03a40e36bfcc5516aa09df567e991d091695a43b0d5c52406837dad4ea26bb" Apr 16 19:20:50.845804 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.845787 2582 scope.go:117] "RemoveContainer" containerID="35d8978b99d1910976ae758e3415cf3450f18f7ef95491faaa70ff91c758d001" Apr 16 19:20:50.852750 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.852731 2582 scope.go:117] "RemoveContainer" containerID="4c707484b60b3313034a617f8746cf8aa92664a3c420d31ae6085a8fe286fa24" Apr 16 19:20:50.859317 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.859302 2582 scope.go:117] "RemoveContainer" containerID="008a50a43f03b73bd90d9aab0d31cf3150d6006e468c1c522fc7ada266eebe45" Apr 16 19:20:50.859573 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:50.859554 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"008a50a43f03b73bd90d9aab0d31cf3150d6006e468c1c522fc7ada266eebe45\": container with ID starting with 008a50a43f03b73bd90d9aab0d31cf3150d6006e468c1c522fc7ada266eebe45 not found: ID does not exist" containerID="008a50a43f03b73bd90d9aab0d31cf3150d6006e468c1c522fc7ada266eebe45" Apr 16 19:20:50.859628 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.859582 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"008a50a43f03b73bd90d9aab0d31cf3150d6006e468c1c522fc7ada266eebe45"} err="failed to get container status \"008a50a43f03b73bd90d9aab0d31cf3150d6006e468c1c522fc7ada266eebe45\": rpc error: code = NotFound desc = could not find container \"008a50a43f03b73bd90d9aab0d31cf3150d6006e468c1c522fc7ada266eebe45\": container with ID starting with 008a50a43f03b73bd90d9aab0d31cf3150d6006e468c1c522fc7ada266eebe45 not found: ID does not exist" Apr 16 19:20:50.859628 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.859598 2582 scope.go:117] 
"RemoveContainer" containerID="602fef8860e23884627a98b7062925db5cd0ee1b6fd85ad24d2f5170413b4f2e" Apr 16 19:20:50.859835 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:50.859816 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"602fef8860e23884627a98b7062925db5cd0ee1b6fd85ad24d2f5170413b4f2e\": container with ID starting with 602fef8860e23884627a98b7062925db5cd0ee1b6fd85ad24d2f5170413b4f2e not found: ID does not exist" containerID="602fef8860e23884627a98b7062925db5cd0ee1b6fd85ad24d2f5170413b4f2e" Apr 16 19:20:50.859876 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.859843 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"602fef8860e23884627a98b7062925db5cd0ee1b6fd85ad24d2f5170413b4f2e"} err="failed to get container status \"602fef8860e23884627a98b7062925db5cd0ee1b6fd85ad24d2f5170413b4f2e\": rpc error: code = NotFound desc = could not find container \"602fef8860e23884627a98b7062925db5cd0ee1b6fd85ad24d2f5170413b4f2e\": container with ID starting with 602fef8860e23884627a98b7062925db5cd0ee1b6fd85ad24d2f5170413b4f2e not found: ID does not exist" Apr 16 19:20:50.859876 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.859862 2582 scope.go:117] "RemoveContainer" containerID="5cc58e70539715ac110857df30400c7e663618c4fd2a94c800f775719d38c759" Apr 16 19:20:50.860096 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:50.860078 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cc58e70539715ac110857df30400c7e663618c4fd2a94c800f775719d38c759\": container with ID starting with 5cc58e70539715ac110857df30400c7e663618c4fd2a94c800f775719d38c759 not found: ID does not exist" containerID="5cc58e70539715ac110857df30400c7e663618c4fd2a94c800f775719d38c759" Apr 16 19:20:50.860131 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.860103 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc58e70539715ac110857df30400c7e663618c4fd2a94c800f775719d38c759"} err="failed to get container status \"5cc58e70539715ac110857df30400c7e663618c4fd2a94c800f775719d38c759\": rpc error: code = NotFound desc = could not find container \"5cc58e70539715ac110857df30400c7e663618c4fd2a94c800f775719d38c759\": container with ID starting with 5cc58e70539715ac110857df30400c7e663618c4fd2a94c800f775719d38c759 not found: ID does not exist" Apr 16 19:20:50.860200 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.860130 2582 scope.go:117] "RemoveContainer" containerID="dfc587a1d60607c3801621b24c59dd279bd59b606a12322137f316a8bea65c09" Apr 16 19:20:50.860383 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:50.860366 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfc587a1d60607c3801621b24c59dd279bd59b606a12322137f316a8bea65c09\": container with ID starting with dfc587a1d60607c3801621b24c59dd279bd59b606a12322137f316a8bea65c09 not found: ID does not exist" containerID="dfc587a1d60607c3801621b24c59dd279bd59b606a12322137f316a8bea65c09" Apr 16 19:20:50.860445 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.860391 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfc587a1d60607c3801621b24c59dd279bd59b606a12322137f316a8bea65c09"} err="failed to get container status \"dfc587a1d60607c3801621b24c59dd279bd59b606a12322137f316a8bea65c09\": rpc error: code = NotFound 
desc = could not find container \"dfc587a1d60607c3801621b24c59dd279bd59b606a12322137f316a8bea65c09\": container with ID starting with dfc587a1d60607c3801621b24c59dd279bd59b606a12322137f316a8bea65c09 not found: ID does not exist" Apr 16 19:20:50.860445 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.860412 2582 scope.go:117] "RemoveContainer" containerID="3e03a40e36bfcc5516aa09df567e991d091695a43b0d5c52406837dad4ea26bb" Apr 16 19:20:50.860657 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:50.860640 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e03a40e36bfcc5516aa09df567e991d091695a43b0d5c52406837dad4ea26bb\": container with ID starting with 3e03a40e36bfcc5516aa09df567e991d091695a43b0d5c52406837dad4ea26bb not found: ID does not exist" containerID="3e03a40e36bfcc5516aa09df567e991d091695a43b0d5c52406837dad4ea26bb" Apr 16 19:20:50.860701 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.860661 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e03a40e36bfcc5516aa09df567e991d091695a43b0d5c52406837dad4ea26bb"} err="failed to get container status \"3e03a40e36bfcc5516aa09df567e991d091695a43b0d5c52406837dad4ea26bb\": rpc error: code = NotFound desc = could not find container \"3e03a40e36bfcc5516aa09df567e991d091695a43b0d5c52406837dad4ea26bb\": container with ID starting with 3e03a40e36bfcc5516aa09df567e991d091695a43b0d5c52406837dad4ea26bb not found: ID does not exist" Apr 16 19:20:50.860701 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.860686 2582 scope.go:117] "RemoveContainer" containerID="35d8978b99d1910976ae758e3415cf3450f18f7ef95491faaa70ff91c758d001" Apr 16 19:20:50.860893 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:50.860876 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35d8978b99d1910976ae758e3415cf3450f18f7ef95491faaa70ff91c758d001\": container with ID starting with 35d8978b99d1910976ae758e3415cf3450f18f7ef95491faaa70ff91c758d001 not found: ID does not exist" containerID="35d8978b99d1910976ae758e3415cf3450f18f7ef95491faaa70ff91c758d001" Apr 16 19:20:50.860931 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.860907 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35d8978b99d1910976ae758e3415cf3450f18f7ef95491faaa70ff91c758d001"} err="failed to get container status \"35d8978b99d1910976ae758e3415cf3450f18f7ef95491faaa70ff91c758d001\": rpc error: code = NotFound desc = could not find container \"35d8978b99d1910976ae758e3415cf3450f18f7ef95491faaa70ff91c758d001\": container with ID starting with 35d8978b99d1910976ae758e3415cf3450f18f7ef95491faaa70ff91c758d001 not found: ID does not exist" Apr 16 19:20:50.860931 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.860921 2582 scope.go:117] "RemoveContainer" containerID="4c707484b60b3313034a617f8746cf8aa92664a3c420d31ae6085a8fe286fa24" Apr 16 19:20:50.861141 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:20:50.861125 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c707484b60b3313034a617f8746cf8aa92664a3c420d31ae6085a8fe286fa24\": container with ID starting with 4c707484b60b3313034a617f8746cf8aa92664a3c420d31ae6085a8fe286fa24 not found: ID does not exist" containerID="4c707484b60b3313034a617f8746cf8aa92664a3c420d31ae6085a8fe286fa24" Apr 16 19:20:50.861215 ip-10-0-128-123 
kubenswrapper[2582]: I0416 19:20:50.861164 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c707484b60b3313034a617f8746cf8aa92664a3c420d31ae6085a8fe286fa24"} err="failed to get container status \"4c707484b60b3313034a617f8746cf8aa92664a3c420d31ae6085a8fe286fa24\": rpc error: code = NotFound desc = could not find container \"4c707484b60b3313034a617f8746cf8aa92664a3c420d31ae6085a8fe286fa24\": container with ID starting with 4c707484b60b3313034a617f8746cf8aa92664a3c420d31ae6085a8fe286fa24 not found: ID does not exist" Apr 16 19:20:50.870487 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870465 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:20:50.870778 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870767 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="kube-rbac-proxy-web" Apr 16 19:20:50.870819 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870780 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="kube-rbac-proxy-web" Apr 16 19:20:50.870819 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870795 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="kube-rbac-proxy-thanos" Apr 16 19:20:50.870819 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870800 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="kube-rbac-proxy-thanos" Apr 16 19:20:50.870819 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870809 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="init-config-reloader" Apr 16 19:20:50.870819 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870815 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="init-config-reloader" Apr 16 19:20:50.870965 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870830 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="prometheus" Apr 16 19:20:50.870965 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870836 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="prometheus" Apr 16 19:20:50.870965 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870843 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="config-reloader" Apr 16 19:20:50.870965 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870849 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="config-reloader" Apr 16 19:20:50.870965 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870855 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="thanos-sidecar" Apr 16 19:20:50.870965 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870861 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="thanos-sidecar" Apr 16 19:20:50.870965 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870868 2582 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="kube-rbac-proxy" Apr 16 19:20:50.870965 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870873 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="kube-rbac-proxy" Apr 16 19:20:50.870965 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870924 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="thanos-sidecar" Apr 16 19:20:50.870965 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870930 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="prometheus" Apr 16 19:20:50.870965 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870936 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="kube-rbac-proxy-web" Apr 16 19:20:50.870965 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870944 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="kube-rbac-proxy-thanos" Apr 16 19:20:50.870965 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870950 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="kube-rbac-proxy" Apr 16 19:20:50.870965 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.870957 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" containerName="config-reloader" Apr 16 19:20:50.874828 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.874813 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.877299 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.877252 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 19:20:50.877397 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.877275 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 19:20:50.877485 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.877462 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 19:20:50.877546 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.877463 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 19:20:50.877747 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.877730 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 19:20:50.877821 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.877751 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 19:20:50.877883 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.877752 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 19:20:50.877883 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.877840 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 19:20:50.877883 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.877818 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-sghz7\"" Apr 16 19:20:50.878064 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.878050 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 19:20:50.878446 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.878428 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 19:20:50.878529 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.878473 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8dih7s3s9oa5u\"" Apr 16 19:20:50.886066 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.886030 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 19:20:50.886745 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.886656 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 19:20:50.888711 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.888689 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:20:50.908820 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.908796 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/247ec332-5c00-47e4-b12b-08a0fef6a5fe-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.908947 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.908842 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/247ec332-5c00-47e4-b12b-08a0fef6a5fe-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.908947 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.908875 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/247ec332-5c00-47e4-b12b-08a0fef6a5fe-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.908947 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.908900 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.908947 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.908924 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.909171 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.908969 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.909171 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.908995 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qc6n\" (UniqueName: \"kubernetes.io/projected/247ec332-5c00-47e4-b12b-08a0fef6a5fe-kube-api-access-4qc6n\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.909171 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.909050 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247ec332-5c00-47e4-b12b-08a0fef6a5fe-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.909171 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.909095 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-config\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.909171 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.909122 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247ec332-5c00-47e4-b12b-08a0fef6a5fe-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.909402 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.909176 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/247ec332-5c00-47e4-b12b-08a0fef6a5fe-config-out\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.909402 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.909214 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.909402 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.909239 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-web-config\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.909402 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.909286 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.909402 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.909322 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.909402 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.909351 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.909658 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.909405 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/247ec332-5c00-47e4-b12b-08a0fef6a5fe-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:50.909658 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:50.909436 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/247ec332-5c00-47e4-b12b-08a0fef6a5fe-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.010831 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.010799 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247ec332-5c00-47e4-b12b-08a0fef6a5fe-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.010831 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.010837 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-config\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.011077 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.010859 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247ec332-5c00-47e4-b12b-08a0fef6a5fe-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.011077 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.010879 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/247ec332-5c00-47e4-b12b-08a0fef6a5fe-config-out\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.011077 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.010908 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.011077 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.010932 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-web-config\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.011077 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.010978 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.011077 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.011007 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.011077 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.011028 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.011077 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.011051 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/247ec332-5c00-47e4-b12b-08a0fef6a5fe-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.011077 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.011073 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/247ec332-5c00-47e4-b12b-08a0fef6a5fe-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.011575 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.011104 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247ec332-5c00-47e4-b12b-08a0fef6a5fe-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.011575 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.011138 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/247ec332-5c00-47e4-b12b-08a0fef6a5fe-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.011575 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.011197 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/247ec332-5c00-47e4-b12b-08a0fef6a5fe-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.011575 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.011224 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.011575 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.011260 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.011575 
ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.011296 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.011575 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.011325 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qc6n\" (UniqueName: \"kubernetes.io/projected/247ec332-5c00-47e4-b12b-08a0fef6a5fe-kube-api-access-4qc6n\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.011908 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.011696 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247ec332-5c00-47e4-b12b-08a0fef6a5fe-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.012189 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.012051 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247ec332-5c00-47e4-b12b-08a0fef6a5fe-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.012574 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.012324 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/247ec332-5c00-47e4-b12b-08a0fef6a5fe-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.014395 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.013938 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/247ec332-5c00-47e4-b12b-08a0fef6a5fe-config-out\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.014395 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.014065 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-web-config\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.014395 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.014118 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.014395 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.014247 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-config\") pod \"prometheus-k8s-0\" (UID: 
\"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.014395 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.014292 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.014713 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.014484 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/247ec332-5c00-47e4-b12b-08a0fef6a5fe-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.014713 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.014539 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.014713 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.014611 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/247ec332-5c00-47e4-b12b-08a0fef6a5fe-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.015092 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.015072 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/247ec332-5c00-47e4-b12b-08a0fef6a5fe-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.016719 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.016698 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.016810 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.016793 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.017020 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.017002 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.017127 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.017111 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" 
(UniqueName: \"kubernetes.io/secret/247ec332-5c00-47e4-b12b-08a0fef6a5fe-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.017521 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.017500 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/247ec332-5c00-47e4-b12b-08a0fef6a5fe-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.019700 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.019683 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qc6n\" (UniqueName: \"kubernetes.io/projected/247ec332-5c00-47e4-b12b-08a0fef6a5fe-kube-api-access-4qc6n\") pod \"prometheus-k8s-0\" (UID: \"247ec332-5c00-47e4-b12b-08a0fef6a5fe\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.114943 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.114865 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33cd2620-ef7c-434f-8baf-6b44b39be4c8" path="/var/lib/kubelet/pods/33cd2620-ef7c-434f-8baf-6b44b39be4c8/volumes" Apr 16 19:20:51.189675 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.189642 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:20:51.342580 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.342450 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 19:20:51.344790 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:20:51.344763 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod247ec332_5c00_47e4_b12b_08a0fef6a5fe.slice/crio-a249fd3ed515a590469c3ad2542286a3138e19f1d7108ff02ab386cc17d8b283 WatchSource:0}: Error finding container a249fd3ed515a590469c3ad2542286a3138e19f1d7108ff02ab386cc17d8b283: Status 404 returned error can't find the container with id a249fd3ed515a590469c3ad2542286a3138e19f1d7108ff02ab386cc17d8b283 Apr 16 19:20:51.815848 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.815818 2582 generic.go:358] "Generic (PLEG): container finished" podID="247ec332-5c00-47e4-b12b-08a0fef6a5fe" containerID="3f6ce29a20f0ea19d10c3bea7ab096a41989064ebfd9ae2ddb9bef6633d723b6" exitCode=0 Apr 16 19:20:51.816237 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.815861 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"247ec332-5c00-47e4-b12b-08a0fef6a5fe","Type":"ContainerDied","Data":"3f6ce29a20f0ea19d10c3bea7ab096a41989064ebfd9ae2ddb9bef6633d723b6"} Apr 16 19:20:51.816237 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:51.815881 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"247ec332-5c00-47e4-b12b-08a0fef6a5fe","Type":"ContainerStarted","Data":"a249fd3ed515a590469c3ad2542286a3138e19f1d7108ff02ab386cc17d8b283"} Apr 16 19:20:52.823097 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:52.823062 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"247ec332-5c00-47e4-b12b-08a0fef6a5fe","Type":"ContainerStarted","Data":"6fed366b59310fa44ffe730fffe103c635c213f2cc9f83fcc9d6e855eb63d73d"} Apr 16 19:20:52.823097 
ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:52.823101 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"247ec332-5c00-47e4-b12b-08a0fef6a5fe","Type":"ContainerStarted","Data":"1d6ca5296093f3f12648441588b5f105add089a7adaa30a67e33d7269180e42b"} Apr 16 19:20:52.823484 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:52.823111 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"247ec332-5c00-47e4-b12b-08a0fef6a5fe","Type":"ContainerStarted","Data":"52a2ebfc45a26d5c8186e260115af443f915fcef48f54f5316b3789858556274"} Apr 16 19:20:52.823484 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:52.823119 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"247ec332-5c00-47e4-b12b-08a0fef6a5fe","Type":"ContainerStarted","Data":"f25b3a24bcccc2bec60a3fa723668541d40b008e8fe5d7db4885a5c38e2aad81"} Apr 16 19:20:52.823484 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:52.823127 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"247ec332-5c00-47e4-b12b-08a0fef6a5fe","Type":"ContainerStarted","Data":"7651ee48c94fd272368c31fcc0ef2efcb21abc22451630156a6c4e51bea347a7"} Apr 16 19:20:52.823484 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:52.823137 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"247ec332-5c00-47e4-b12b-08a0fef6a5fe","Type":"ContainerStarted","Data":"e03150a6fb6934baa30b56158492f0f1d89ae043816b910211c71389ad561101"} Apr 16 19:20:52.857565 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:52.857518 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.857500065 podStartE2EDuration="2.857500065s" podCreationTimestamp="2026-04-16 19:20:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:20:52.855665217 +0000 UTC m=+180.350255835" watchObservedRunningTime="2026-04-16 19:20:52.857500065 +0000 UTC m=+180.352090683" Apr 16 19:20:56.190642 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:20:56.190585 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:21:51.190172 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:21:51.190108 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:21:51.205645 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:21:51.205615 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:21:52.023765 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:21:52.023738 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 19:22:52.970535 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:22:52.970503 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/2.log" Apr 16 19:22:52.971423 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:22:52.971402 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/2.log" Apr 16 19:22:52.979681 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:22:52.979660 2582 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 19:27:30.821583 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:30.821548 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-66b64c949f-tr8wk"] Apr 16 19:27:30.824886 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:30.824867 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-tr8wk" Apr 16 19:27:30.828308 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:30.828283 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 19:27:30.828432 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:30.828293 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 19:27:30.828432 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:30.828340 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 16 19:27:30.828432 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:30.828351 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-42x6b\"" Apr 16 19:27:30.828681 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:30.828667 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 16 19:27:30.847725 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:30.847693 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-66b64c949f-tr8wk"] Apr 16 19:27:30.858024 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:30.857999 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf4nz\" (UniqueName: \"kubernetes.io/projected/f8f2f20f-2ab8-4d48-bd0b-cf681c274c68-kube-api-access-gf4nz\") pod \"opendatahub-operator-controller-manager-66b64c949f-tr8wk\" (UID: \"f8f2f20f-2ab8-4d48-bd0b-cf681c274c68\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-tr8wk" Apr 16 19:27:30.858167 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:30.858093 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f8f2f20f-2ab8-4d48-bd0b-cf681c274c68-webhook-cert\") pod \"opendatahub-operator-controller-manager-66b64c949f-tr8wk\" (UID: \"f8f2f20f-2ab8-4d48-bd0b-cf681c274c68\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-tr8wk" Apr 16 19:27:30.858167 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:30.858134 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8f2f20f-2ab8-4d48-bd0b-cf681c274c68-apiservice-cert\") pod \"opendatahub-operator-controller-manager-66b64c949f-tr8wk\" (UID: \"f8f2f20f-2ab8-4d48-bd0b-cf681c274c68\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-tr8wk" Apr 16 19:27:30.959116 ip-10-0-128-123 
kubenswrapper[2582]: I0416 19:27:30.959063 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gf4nz\" (UniqueName: \"kubernetes.io/projected/f8f2f20f-2ab8-4d48-bd0b-cf681c274c68-kube-api-access-gf4nz\") pod \"opendatahub-operator-controller-manager-66b64c949f-tr8wk\" (UID: \"f8f2f20f-2ab8-4d48-bd0b-cf681c274c68\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-tr8wk" Apr 16 19:27:30.959356 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:30.959195 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f8f2f20f-2ab8-4d48-bd0b-cf681c274c68-webhook-cert\") pod \"opendatahub-operator-controller-manager-66b64c949f-tr8wk\" (UID: \"f8f2f20f-2ab8-4d48-bd0b-cf681c274c68\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-tr8wk" Apr 16 19:27:30.959356 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:30.959220 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8f2f20f-2ab8-4d48-bd0b-cf681c274c68-apiservice-cert\") pod \"opendatahub-operator-controller-manager-66b64c949f-tr8wk\" (UID: \"f8f2f20f-2ab8-4d48-bd0b-cf681c274c68\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-tr8wk" Apr 16 19:27:30.961603 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:30.961579 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f8f2f20f-2ab8-4d48-bd0b-cf681c274c68-webhook-cert\") pod \"opendatahub-operator-controller-manager-66b64c949f-tr8wk\" (UID: \"f8f2f20f-2ab8-4d48-bd0b-cf681c274c68\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-tr8wk" Apr 16 19:27:30.961712 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:30.961602 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8f2f20f-2ab8-4d48-bd0b-cf681c274c68-apiservice-cert\") pod \"opendatahub-operator-controller-manager-66b64c949f-tr8wk\" (UID: \"f8f2f20f-2ab8-4d48-bd0b-cf681c274c68\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-tr8wk" Apr 16 19:27:30.968978 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:30.968947 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf4nz\" (UniqueName: \"kubernetes.io/projected/f8f2f20f-2ab8-4d48-bd0b-cf681c274c68-kube-api-access-gf4nz\") pod \"opendatahub-operator-controller-manager-66b64c949f-tr8wk\" (UID: \"f8f2f20f-2ab8-4d48-bd0b-cf681c274c68\") " pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-tr8wk" Apr 16 19:27:31.135816 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:31.135735 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-tr8wk" Apr 16 19:27:31.268119 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:31.268096 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-66b64c949f-tr8wk"] Apr 16 19:27:31.270821 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:27:31.270792 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8f2f20f_2ab8_4d48_bd0b_cf681c274c68.slice/crio-d31a4237b47b59d1cd783f38eef94ccd28f47552ef84d3173ef15b44fe4d08c1 WatchSource:0}: Error finding container d31a4237b47b59d1cd783f38eef94ccd28f47552ef84d3173ef15b44fe4d08c1: Status 404 returned error can't find the container with id d31a4237b47b59d1cd783f38eef94ccd28f47552ef84d3173ef15b44fe4d08c1 Apr 16 19:27:31.272558 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:31.272539 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:27:32.059799 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:32.059760 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-tr8wk" event={"ID":"f8f2f20f-2ab8-4d48-bd0b-cf681c274c68","Type":"ContainerStarted","Data":"d31a4237b47b59d1cd783f38eef94ccd28f47552ef84d3173ef15b44fe4d08c1"} Apr 16 19:27:34.069537 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.069489 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-tr8wk" event={"ID":"f8f2f20f-2ab8-4d48-bd0b-cf681c274c68","Type":"ContainerStarted","Data":"0b79ecaea3acd6142fd7c0bd2eb7442c5881eb0a3c1ecb26fe19b6c2d78d4283"} Apr 16 19:27:34.069969 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.069694 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-tr8wk" Apr 16 19:27:34.099475 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.099418 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-tr8wk" podStartSLOduration=1.702660974 podStartE2EDuration="4.099403982s" podCreationTimestamp="2026-04-16 19:27:30 +0000 UTC" firstStartedPulling="2026-04-16 19:27:31.272660044 +0000 UTC m=+578.767250639" lastFinishedPulling="2026-04-16 19:27:33.669403049 +0000 UTC m=+581.163993647" observedRunningTime="2026-04-16 19:27:34.09718218 +0000 UTC m=+581.591772787" watchObservedRunningTime="2026-04-16 19:27:34.099403982 +0000 UTC m=+581.593994599" Apr 16 19:27:34.210080 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.210047 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x"] Apr 16 19:27:34.213570 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.213552 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x" Apr 16 19:27:34.218453 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.218430 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 19:27:34.218590 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.218527 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-bmv5f\"" Apr 16 19:27:34.219469 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.219449 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 19:27:34.219743 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.219726 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 19:27:34.219743 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.219738 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 19:27:34.220983 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.220968 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:27:34.240947 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.240918 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x"] Apr 16 19:27:34.292041 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.292008 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dc52\" (UniqueName: \"kubernetes.io/projected/85547f04-5633-4ab8-b014-a1e326a9ed35-kube-api-access-6dc52\") pod \"lws-controller-manager-66b4cb6588-bnz8x\" (UID: \"85547f04-5633-4ab8-b014-a1e326a9ed35\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x" Apr 16 19:27:34.292272 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.292076 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/85547f04-5633-4ab8-b014-a1e326a9ed35-manager-config\") pod \"lws-controller-manager-66b4cb6588-bnz8x\" (UID: \"85547f04-5633-4ab8-b014-a1e326a9ed35\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x" Apr 16 19:27:34.292272 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.292101 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85547f04-5633-4ab8-b014-a1e326a9ed35-cert\") pod \"lws-controller-manager-66b4cb6588-bnz8x\" (UID: \"85547f04-5633-4ab8-b014-a1e326a9ed35\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x" Apr 16 19:27:34.292272 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.292136 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/85547f04-5633-4ab8-b014-a1e326a9ed35-metrics-cert\") pod \"lws-controller-manager-66b4cb6588-bnz8x\" (UID: \"85547f04-5633-4ab8-b014-a1e326a9ed35\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x" Apr 16 19:27:34.393563 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.393461 2582 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/85547f04-5633-4ab8-b014-a1e326a9ed35-manager-config\") pod \"lws-controller-manager-66b4cb6588-bnz8x\" (UID: \"85547f04-5633-4ab8-b014-a1e326a9ed35\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x" Apr 16 19:27:34.393563 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.393514 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85547f04-5633-4ab8-b014-a1e326a9ed35-cert\") pod \"lws-controller-manager-66b4cb6588-bnz8x\" (UID: \"85547f04-5633-4ab8-b014-a1e326a9ed35\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x" Apr 16 19:27:34.393563 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.393555 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/85547f04-5633-4ab8-b014-a1e326a9ed35-metrics-cert\") pod \"lws-controller-manager-66b4cb6588-bnz8x\" (UID: \"85547f04-5633-4ab8-b014-a1e326a9ed35\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x" Apr 16 19:27:34.393782 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.393656 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dc52\" (UniqueName: \"kubernetes.io/projected/85547f04-5633-4ab8-b014-a1e326a9ed35-kube-api-access-6dc52\") pod \"lws-controller-manager-66b4cb6588-bnz8x\" (UID: \"85547f04-5633-4ab8-b014-a1e326a9ed35\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x" Apr 16 19:27:34.394254 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.394223 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/85547f04-5633-4ab8-b014-a1e326a9ed35-manager-config\") pod \"lws-controller-manager-66b4cb6588-bnz8x\" (UID: \"85547f04-5633-4ab8-b014-a1e326a9ed35\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x" Apr 16 19:27:34.396178 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.396134 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85547f04-5633-4ab8-b014-a1e326a9ed35-cert\") pod \"lws-controller-manager-66b4cb6588-bnz8x\" (UID: \"85547f04-5633-4ab8-b014-a1e326a9ed35\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x" Apr 16 19:27:34.396281 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.396234 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/85547f04-5633-4ab8-b014-a1e326a9ed35-metrics-cert\") pod \"lws-controller-manager-66b4cb6588-bnz8x\" (UID: \"85547f04-5633-4ab8-b014-a1e326a9ed35\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x" Apr 16 19:27:34.407086 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.407060 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dc52\" (UniqueName: \"kubernetes.io/projected/85547f04-5633-4ab8-b014-a1e326a9ed35-kube-api-access-6dc52\") pod \"lws-controller-manager-66b4cb6588-bnz8x\" (UID: \"85547f04-5633-4ab8-b014-a1e326a9ed35\") " pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x" Apr 16 19:27:34.522949 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.522916 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x" Apr 16 19:27:34.660167 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:34.660078 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x"] Apr 16 19:27:34.663199 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:27:34.663171 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85547f04_5633_4ab8_b014_a1e326a9ed35.slice/crio-f2a3c7099a23b3b52e35a5214e81a1ad2453a01502f082b3eed04c8caa556ae8 WatchSource:0}: Error finding container f2a3c7099a23b3b52e35a5214e81a1ad2453a01502f082b3eed04c8caa556ae8: Status 404 returned error can't find the container with id f2a3c7099a23b3b52e35a5214e81a1ad2453a01502f082b3eed04c8caa556ae8 Apr 16 19:27:35.075132 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:35.075098 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x" event={"ID":"85547f04-5633-4ab8-b014-a1e326a9ed35","Type":"ContainerStarted","Data":"f2a3c7099a23b3b52e35a5214e81a1ad2453a01502f082b3eed04c8caa556ae8"} Apr 16 19:27:37.085957 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:37.085924 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x" event={"ID":"85547f04-5633-4ab8-b014-a1e326a9ed35","Type":"ContainerStarted","Data":"ef329481e1952d652af162a8427885a75b3780e6aa436a0c25f13e9bbb58b1a2"} Apr 16 19:27:37.086387 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:37.085999 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x" Apr 16 19:27:37.105873 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:37.105822 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x" podStartSLOduration=0.765265275 podStartE2EDuration="3.105807117s" podCreationTimestamp="2026-04-16 19:27:34 +0000 UTC" firstStartedPulling="2026-04-16 19:27:34.665065875 +0000 UTC m=+582.159656474" lastFinishedPulling="2026-04-16 19:27:37.005607716 +0000 UTC m=+584.500198316" observedRunningTime="2026-04-16 19:27:37.104896351 +0000 UTC m=+584.599486969" watchObservedRunningTime="2026-04-16 19:27:37.105807117 +0000 UTC m=+584.600397733" Apr 16 19:27:45.077654 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:45.077621 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-66b64c949f-tr8wk" Apr 16 19:27:47.932566 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:47.932528 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-d894ddccb-g2dzt"] Apr 16 19:27:47.938798 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:47.938776 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-d894ddccb-g2dzt"
Apr 16 19:27:47.949331 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:47.949310 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 16 19:27:47.949817 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:47.949799 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 16 19:27:47.950125 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:47.950108 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-fsk68\""
Apr 16 19:27:47.954343 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:47.954320 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-d894ddccb-g2dzt"]
Apr 16 19:27:48.018715 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:48.018678 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p8l9\" (UniqueName: \"kubernetes.io/projected/41d447d7-da03-4891-b7de-4f79a67dd23e-kube-api-access-5p8l9\") pod \"kube-auth-proxy-d894ddccb-g2dzt\" (UID: \"41d447d7-da03-4891-b7de-4f79a67dd23e\") " pod="openshift-ingress/kube-auth-proxy-d894ddccb-g2dzt"
Apr 16 19:27:48.018882 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:48.018744 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/41d447d7-da03-4891-b7de-4f79a67dd23e-tmp\") pod \"kube-auth-proxy-d894ddccb-g2dzt\" (UID: \"41d447d7-da03-4891-b7de-4f79a67dd23e\") " pod="openshift-ingress/kube-auth-proxy-d894ddccb-g2dzt"
Apr 16 19:27:48.018882 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:48.018857 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/41d447d7-da03-4891-b7de-4f79a67dd23e-tls-certs\") pod \"kube-auth-proxy-d894ddccb-g2dzt\" (UID: \"41d447d7-da03-4891-b7de-4f79a67dd23e\") " pod="openshift-ingress/kube-auth-proxy-d894ddccb-g2dzt"
Apr 16 19:27:48.092795 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:48.092765 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-66b4cb6588-bnz8x"
Apr 16 19:27:48.120206 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:48.120173 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/41d447d7-da03-4891-b7de-4f79a67dd23e-tmp\") pod \"kube-auth-proxy-d894ddccb-g2dzt\" (UID: \"41d447d7-da03-4891-b7de-4f79a67dd23e\") " pod="openshift-ingress/kube-auth-proxy-d894ddccb-g2dzt"
Apr 16 19:27:48.120396 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:48.120273 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/41d447d7-da03-4891-b7de-4f79a67dd23e-tls-certs\") pod \"kube-auth-proxy-d894ddccb-g2dzt\" (UID: \"41d447d7-da03-4891-b7de-4f79a67dd23e\") " pod="openshift-ingress/kube-auth-proxy-d894ddccb-g2dzt"
Apr 16 19:27:48.120396 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:48.120334 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5p8l9\" (UniqueName: \"kubernetes.io/projected/41d447d7-da03-4891-b7de-4f79a67dd23e-kube-api-access-5p8l9\") pod \"kube-auth-proxy-d894ddccb-g2dzt\" (UID: \"41d447d7-da03-4891-b7de-4f79a67dd23e\") " pod="openshift-ingress/kube-auth-proxy-d894ddccb-g2dzt"
Apr 16 19:27:48.122367 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:48.122341 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/41d447d7-da03-4891-b7de-4f79a67dd23e-tmp\") pod \"kube-auth-proxy-d894ddccb-g2dzt\" (UID: \"41d447d7-da03-4891-b7de-4f79a67dd23e\") " pod="openshift-ingress/kube-auth-proxy-d894ddccb-g2dzt"
Apr 16 19:27:48.122677 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:48.122656 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/41d447d7-da03-4891-b7de-4f79a67dd23e-tls-certs\") pod \"kube-auth-proxy-d894ddccb-g2dzt\" (UID: \"41d447d7-da03-4891-b7de-4f79a67dd23e\") " pod="openshift-ingress/kube-auth-proxy-d894ddccb-g2dzt"
Apr 16 19:27:48.143105 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:48.143075 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p8l9\" (UniqueName: \"kubernetes.io/projected/41d447d7-da03-4891-b7de-4f79a67dd23e-kube-api-access-5p8l9\") pod \"kube-auth-proxy-d894ddccb-g2dzt\" (UID: \"41d447d7-da03-4891-b7de-4f79a67dd23e\") " pod="openshift-ingress/kube-auth-proxy-d894ddccb-g2dzt"
Apr 16 19:27:48.249442 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:48.249415 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-d894ddccb-g2dzt"
Apr 16 19:27:48.402628 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:48.402598 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-d894ddccb-g2dzt"]
Apr 16 19:27:48.404364 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:27:48.404334 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41d447d7_da03_4891_b7de_4f79a67dd23e.slice/crio-f906d15e804a7d7d0e48de867ccdcd131afd37a0a8e902121620c658943ebd69 WatchSource:0}: Error finding container f906d15e804a7d7d0e48de867ccdcd131afd37a0a8e902121620c658943ebd69: Status 404 returned error can't find the container with id f906d15e804a7d7d0e48de867ccdcd131afd37a0a8e902121620c658943ebd69
Apr 16 19:27:49.130500 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:49.130460 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-d894ddccb-g2dzt" event={"ID":"41d447d7-da03-4891-b7de-4f79a67dd23e","Type":"ContainerStarted","Data":"f906d15e804a7d7d0e48de867ccdcd131afd37a0a8e902121620c658943ebd69"}
Apr 16 19:27:52.142457 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:52.142416 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-d894ddccb-g2dzt" event={"ID":"41d447d7-da03-4891-b7de-4f79a67dd23e","Type":"ContainerStarted","Data":"13b1b58c632dec9c9adbe60ec78a82c9f3ae7fb20a2ff5b7973a4e7707c93e8a"}
Apr 16 19:27:52.162372 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:52.162329 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-d894ddccb-g2dzt" podStartSLOduration=2.030696655 podStartE2EDuration="5.162315208s" podCreationTimestamp="2026-04-16 19:27:47 +0000 UTC" firstStartedPulling="2026-04-16 19:27:48.405965208 +0000 UTC m=+595.900555804" lastFinishedPulling="2026-04-16 19:27:51.537583761 +0000 UTC m=+599.032174357" observedRunningTime="2026-04-16 19:27:52.161539357 +0000 UTC m=+599.656129991" watchObservedRunningTime="2026-04-16 19:27:52.162315208 +0000 UTC m=+599.656905826"
Apr 16 19:27:52.998447 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:52.998418 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/2.log"
Apr 16 19:27:52.999029 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:27:52.999005 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/2.log"
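The startup-latency record above is internally consistent: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that same span minus the image-pull window (lastFinishedPulling minus firstStartedPulling), i.e. startup time excluding pulls. A minimal Go sketch of the arithmetic, with the timestamp strings copied from the record:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Layout matching the "+0000 UTC" timestamps in the log; the
        // fractional seconds are optional when parsing.
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        parse := func(s string) time.Time {
            t, err := time.Parse(layout, s)
            if err != nil {
                panic(err)
            }
            return t
        }
        created := parse("2026-04-16 19:27:47 +0000 UTC")
        observedRunning := parse("2026-04-16 19:27:52.162315208 +0000 UTC")
        firstPull := parse("2026-04-16 19:27:48.405965208 +0000 UTC")
        lastPull := parse("2026-04-16 19:27:51.537583761 +0000 UTC")

        e2e := observedRunning.Sub(created)  // podStartE2EDuration
        slo := e2e - lastPull.Sub(firstPull) // podStartSLOduration: e2e minus pull time
        fmt.Println(e2e, slo)                // 5.162315208s 2.030696655s
    }

(The tracker itself works from monotonic readings, visible as the m=+... suffixes, which is why the last digits of podStartSLOduration can differ by a nanosecond or two from the wall-clock arithmetic.)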
Apr 16 19:29:42.090357 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:42.090276 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-f2g22"]
Apr 16 19:29:42.093716 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:42.093698 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-f2g22"
Apr 16 19:29:42.096260 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:42.096235 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\""
Apr 16 19:29:42.096260 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:42.096253 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 19:29:42.096446 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:42.096275 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\""
Apr 16 19:29:42.096446 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:42.096275 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-c46nd\""
Apr 16 19:29:42.097015 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:42.096996 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 19:29:42.103929 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:42.103905 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-f2g22"]
Apr 16 19:29:42.143914 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:42.143882 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ede8d40-96b1-4398-a6ce-a0bc9a42317f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-f2g22\" (UID: \"2ede8d40-96b1-4398-a6ce-a0bc9a42317f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-f2g22"
Apr 16 19:29:42.144088 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:42.144069 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2ede8d40-96b1-4398-a6ce-a0bc9a42317f-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-f2g22\" (UID: \"2ede8d40-96b1-4398-a6ce-a0bc9a42317f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-f2g22"
Apr 16 19:29:42.144191 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:42.144138 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r629v\" (UniqueName: \"kubernetes.io/projected/2ede8d40-96b1-4398-a6ce-a0bc9a42317f-kube-api-access-r629v\") pod \"kuadrant-console-plugin-6cb54b5c86-f2g22\" (UID: \"2ede8d40-96b1-4398-a6ce-a0bc9a42317f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-f2g22"
Apr 16 19:29:42.245421 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:42.245383 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r629v\" (UniqueName: \"kubernetes.io/projected/2ede8d40-96b1-4398-a6ce-a0bc9a42317f-kube-api-access-r629v\") pod \"kuadrant-console-plugin-6cb54b5c86-f2g22\" (UID: \"2ede8d40-96b1-4398-a6ce-a0bc9a42317f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-f2g22"
Apr 16 19:29:42.245617 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:42.245460 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ede8d40-96b1-4398-a6ce-a0bc9a42317f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-f2g22\" (UID: \"2ede8d40-96b1-4398-a6ce-a0bc9a42317f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-f2g22"
Apr 16 19:29:42.245617 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:42.245494 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2ede8d40-96b1-4398-a6ce-a0bc9a42317f-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-f2g22\" (UID: \"2ede8d40-96b1-4398-a6ce-a0bc9a42317f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-f2g22"
Apr 16 19:29:42.245617 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:29:42.245608 2582 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found
Apr 16 19:29:42.245784 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:29:42.245689 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ede8d40-96b1-4398-a6ce-a0bc9a42317f-plugin-serving-cert podName:2ede8d40-96b1-4398-a6ce-a0bc9a42317f nodeName:}" failed. No retries permitted until 2026-04-16 19:29:42.745666185 +0000 UTC m=+710.240256786 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/2ede8d40-96b1-4398-a6ce-a0bc9a42317f-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-f2g22" (UID: "2ede8d40-96b1-4398-a6ce-a0bc9a42317f") : secret "plugin-serving-cert" not found
Apr 16 19:29:42.246055 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:42.246036 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2ede8d40-96b1-4398-a6ce-a0bc9a42317f-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-f2g22\" (UID: \"2ede8d40-96b1-4398-a6ce-a0bc9a42317f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-f2g22"
Apr 16 19:29:42.268757 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:42.268730 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r629v\" (UniqueName: \"kubernetes.io/projected/2ede8d40-96b1-4398-a6ce-a0bc9a42317f-kube-api-access-r629v\") pod \"kuadrant-console-plugin-6cb54b5c86-f2g22\" (UID: \"2ede8d40-96b1-4398-a6ce-a0bc9a42317f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-f2g22"
Apr 16 19:29:42.750676 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:42.750644 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ede8d40-96b1-4398-a6ce-a0bc9a42317f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-f2g22\" (UID: \"2ede8d40-96b1-4398-a6ce-a0bc9a42317f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-f2g22"
Apr 16 19:29:42.753111 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:42.753083 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2ede8d40-96b1-4398-a6ce-a0bc9a42317f-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-f2g22\" (UID: \"2ede8d40-96b1-4398-a6ce-a0bc9a42317f\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-f2g22"
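The mount failure above is transient: the secret had not been created yet, so the volume manager scheduled a retry 500ms later (durationBeforeRetry), and the retry at 19:29:42.750 succeeded once the secret existed. The shape is ordinary exponential backoff around an idempotent operation; a generic sketch of that pattern (illustrative only, not kubelet's nestedpendingoperations code, which tracks backoff per volume):

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // retryWithBackoff retries op with exponentially growing delays between
    // attempts, starting at initial and capped at maxDelay, the same shape
    // as the 500ms durationBeforeRetry in the record above.
    func retryWithBackoff(op func() error, initial, maxDelay time.Duration, attempts int) error {
        delay := initial
        var err error
        for i := 0; i < attempts; i++ {
            if err = op(); err == nil {
                return nil
            }
            time.Sleep(delay)
            if delay *= 2; delay > maxDelay {
                delay = maxDelay
            }
        }
        return err
    }

    func main() {
        calls := 0
        err := retryWithBackoff(func() error {
            calls++
            if calls < 3 {
                // Fails until the secret appears, as in the log.
                return errors.New(`secret "plugin-serving-cert" not found`)
            }
            return nil
        }, 500*time.Millisecond, 2*time.Minute, 5)
        fmt.Println(err, calls) // <nil> 3
    }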
Apr 16 19:29:43.011071 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:43.010989 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-f2g22"
Apr 16 19:29:43.133519 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:43.133497 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-f2g22"]
Apr 16 19:29:43.135892 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:29:43.135862 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ede8d40_96b1_4398_a6ce_a0bc9a42317f.slice/crio-06d8672e654b612f1700de353456463c07ddaa69873895d323866b9e52598270 WatchSource:0}: Error finding container 06d8672e654b612f1700de353456463c07ddaa69873895d323866b9e52598270: Status 404 returned error can't find the container with id 06d8672e654b612f1700de353456463c07ddaa69873895d323866b9e52598270
Apr 16 19:29:43.524395 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:29:43.524357 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-f2g22" event={"ID":"2ede8d40-96b1-4398-a6ce-a0bc9a42317f","Type":"ContainerStarted","Data":"06d8672e654b612f1700de353456463c07ddaa69873895d323866b9e52598270"}
Apr 16 19:30:07.622841 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:07.622803 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-f2g22" event={"ID":"2ede8d40-96b1-4398-a6ce-a0bc9a42317f","Type":"ContainerStarted","Data":"f1956d6854f34a83fa6089c2231defad73ec711985c5b696b0f6b288e9fce95c"}
Apr 16 19:30:07.638656 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:07.638597 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-f2g22" podStartSLOduration=1.559563762 podStartE2EDuration="25.638578166s" podCreationTimestamp="2026-04-16 19:29:42 +0000 UTC" firstStartedPulling="2026-04-16 19:29:43.13724107 +0000 UTC m=+710.631831669" lastFinishedPulling="2026-04-16 19:30:07.216255459 +0000 UTC m=+734.710846073" observedRunningTime="2026-04-16 19:30:07.637204328 +0000 UTC m=+735.131794949" watchObservedRunningTime="2026-04-16 19:30:07.638578166 +0000 UTC m=+735.133168784"
Apr 16 19:30:29.116669 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:29.116636 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:30:29.161540 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:29.161510 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:30:29.161540 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:29.161537 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:30:29.161724 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:29.161640 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-7hhd5"
Apr 16 19:30:29.163995 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:29.163973 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 16 19:30:29.298248 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:29.298220 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv264\" (UniqueName: \"kubernetes.io/projected/a6a2335a-6958-4fa7-baac-35076649a576-kube-api-access-cv264\") pod \"limitador-limitador-78c99df468-7hhd5\" (UID: \"a6a2335a-6958-4fa7-baac-35076649a576\") " pod="kuadrant-system/limitador-limitador-78c99df468-7hhd5"
Apr 16 19:30:29.298401 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:29.298281 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a6a2335a-6958-4fa7-baac-35076649a576-config-file\") pod \"limitador-limitador-78c99df468-7hhd5\" (UID: \"a6a2335a-6958-4fa7-baac-35076649a576\") " pod="kuadrant-system/limitador-limitador-78c99df468-7hhd5"
Apr 16 19:30:29.398884 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:29.398809 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cv264\" (UniqueName: \"kubernetes.io/projected/a6a2335a-6958-4fa7-baac-35076649a576-kube-api-access-cv264\") pod \"limitador-limitador-78c99df468-7hhd5\" (UID: \"a6a2335a-6958-4fa7-baac-35076649a576\") " pod="kuadrant-system/limitador-limitador-78c99df468-7hhd5"
Apr 16 19:30:29.398884 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:29.398867 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a6a2335a-6958-4fa7-baac-35076649a576-config-file\") pod \"limitador-limitador-78c99df468-7hhd5\" (UID: \"a6a2335a-6958-4fa7-baac-35076649a576\") " pod="kuadrant-system/limitador-limitador-78c99df468-7hhd5"
Apr 16 19:30:29.399536 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:29.399510 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a6a2335a-6958-4fa7-baac-35076649a576-config-file\") pod \"limitador-limitador-78c99df468-7hhd5\" (UID: \"a6a2335a-6958-4fa7-baac-35076649a576\") " pod="kuadrant-system/limitador-limitador-78c99df468-7hhd5"
Apr 16 19:30:29.406403 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:29.406376 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv264\" (UniqueName: \"kubernetes.io/projected/a6a2335a-6958-4fa7-baac-35076649a576-kube-api-access-cv264\") pod \"limitador-limitador-78c99df468-7hhd5\" (UID: \"a6a2335a-6958-4fa7-baac-35076649a576\") " pod="kuadrant-system/limitador-limitador-78c99df468-7hhd5"
Apr 16 19:30:29.471770 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:29.471740 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-7hhd5"
Apr 16 19:30:29.602913 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:29.602890 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:30:29.605212 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:30:29.605180 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6a2335a_6958_4fa7_baac_35076649a576.slice/crio-84c17c16ba611b2b2de71c00b7c4acc5fc4fea2ce4ee4a443cf9fe61df54471e WatchSource:0}: Error finding container 84c17c16ba611b2b2de71c00b7c4acc5fc4fea2ce4ee4a443cf9fe61df54471e: Status 404 returned error can't find the container with id 84c17c16ba611b2b2de71c00b7c4acc5fc4fea2ce4ee4a443cf9fe61df54471e
Apr 16 19:30:29.706132 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:29.706046 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-7hhd5" event={"ID":"a6a2335a-6958-4fa7-baac-35076649a576","Type":"ContainerStarted","Data":"84c17c16ba611b2b2de71c00b7c4acc5fc4fea2ce4ee4a443cf9fe61df54471e"}
Apr 16 19:30:32.719924 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:32.719887 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-7hhd5" event={"ID":"a6a2335a-6958-4fa7-baac-35076649a576","Type":"ContainerStarted","Data":"a660b1b59ae9e605d4113275b5ac8e282579ec23764af8c13dc816ddc957b8d3"}
Apr 16 19:30:32.720358 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:32.720004 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-7hhd5"
Apr 16 19:30:32.738310 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:32.738267 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-7hhd5" podStartSLOduration=1.278928068 podStartE2EDuration="3.738255972s" podCreationTimestamp="2026-04-16 19:30:29 +0000 UTC" firstStartedPulling="2026-04-16 19:30:29.607003969 +0000 UTC m=+757.101594564" lastFinishedPulling="2026-04-16 19:30:32.066331871 +0000 UTC m=+759.560922468" observedRunningTime="2026-04-16 19:30:32.735873493 +0000 UTC m=+760.230464110" watchObservedRunningTime="2026-04-16 19:30:32.738255972 +0000 UTC m=+760.232846589"
Apr 16 19:30:43.724134 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:43.724100 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-7hhd5"
Apr 16 19:30:59.241651 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:30:59.241611 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:31:52.704961 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:31:52.704922 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:31:55.164165 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:31:55.164126 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:31:58.885927 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:31:58.885894 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:32:17.086601 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:32:17.086572 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:32:36.275422 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:32:36.275380 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:32:47.274601 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:32:47.274563 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:32:53.026706 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:32:53.026678 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/2.log"
Apr 16 19:32:53.028786 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:32:53.028759 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/2.log"
Apr 16 19:33:50.072301 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:33:50.072262 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:34:01.068437 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:34:01.068400 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:34:09.164604 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:34:09.164522 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:34:20.065961 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:34:20.065923 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:34:30.076165 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:34:30.076111 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:34:39.170336 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:34:39.170298 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:35:42.873176 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:35:42.873072 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:35:58.587562 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:35:58.587526 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:36:36.999493 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:36:36.999456 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:36:53.460194 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:36:53.460163 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:37:07.666842 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:37:07.666760 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:37:24.665142 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:37:24.665107 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:37:28.769260 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:37:28.769223 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:37:53.052704 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:37:53.052672 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/2.log"
Apr 16 19:37:53.054733 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:37:53.054710 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/2.log"
Apr 16 19:38:18.962088 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:38:18.962047 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:38:27.469716 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:38:27.469681 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:38:45.454210 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:38:45.454113 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:38:54.068446 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:38:54.068406 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:39:10.165228 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:39:10.165193 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:39:18.274552 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:39:18.274515 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:39:50.671574 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:39:50.671541 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:39:59.963972 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:39:59.963926 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:40:07.858880 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:40:07.858796 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:40:17.078407 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:40:17.078373 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:40:24.260598 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:40:24.260551 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:40:41.974317 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:40:41.974262 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:40:53.555751 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:40:53.555713 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:41:40.093063 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:41:40.092970 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:41:48.600257 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:41:48.600215 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:41:58.363726 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:41:58.363688 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:42:05.758342 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:42:05.758308 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:42:15.078778 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:42:15.078741 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:42:23.261997 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:42:23.261960 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:42:32.458002 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:42:32.457971 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:42:41.462086 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:42:41.462054 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:42:49.569319 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:42:49.569284 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:42:53.078353 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:42:53.078322 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/2.log"
Apr 16 19:42:53.081670 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:42:53.081646 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/2.log"
Apr 16 19:42:58.256895 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:42:58.256859 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:43:07.961894 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:43:07.961809 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:43:16.662137 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:43:16.662099 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:43:25.166426 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:43:25.166393 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:43:33.468672 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:43:33.468629 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:43:42.658226 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:43:42.658192 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:43:50.558135 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:43:50.558094 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:43:59.958888 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:43:59.958855 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:44:08.265371 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:44:08.265334 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:45:00.145533 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:00.145493 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29606145-gxsnw"]
Apr 16 19:45:00.147867 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:00.147845 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606145-gxsnw"
Apr 16 19:45:00.150591 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:00.150572 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-cczkd\""
Apr 16 19:45:00.153962 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:00.153936 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606145-gxsnw"]
Apr 16 19:45:00.249986 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:00.249946 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db74g\" (UniqueName: \"kubernetes.io/projected/09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a-kube-api-access-db74g\") pod \"maas-api-key-cleanup-29606145-gxsnw\" (UID: \"09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a\") " pod="opendatahub/maas-api-key-cleanup-29606145-gxsnw"
Apr 16 19:45:00.351118 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:00.351067 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-db74g\" (UniqueName: \"kubernetes.io/projected/09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a-kube-api-access-db74g\") pod \"maas-api-key-cleanup-29606145-gxsnw\" (UID: \"09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a\") " pod="opendatahub/maas-api-key-cleanup-29606145-gxsnw"
Apr 16 19:45:00.359857 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:00.359827 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-db74g\" (UniqueName: \"kubernetes.io/projected/09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a-kube-api-access-db74g\") pod \"maas-api-key-cleanup-29606145-gxsnw\" (UID: \"09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a\") " pod="opendatahub/maas-api-key-cleanup-29606145-gxsnw"
Apr 16 19:45:00.459383 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:00.459300 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606145-gxsnw"
Apr 16 19:45:00.785540 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:00.785501 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606145-gxsnw"]
Apr 16 19:45:00.791742 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:00.791710 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:45:01.696880 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:01.696843 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606145-gxsnw" event={"ID":"09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a","Type":"ContainerStarted","Data":"678c112c55e19558fbcd1c5b3555b01988a395ff3454bd1451f49c3c316a6d87"}
Apr 16 19:45:03.705988 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:03.705954 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606145-gxsnw" event={"ID":"09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a","Type":"ContainerStarted","Data":"cc8e9b1147ab2f340efa7cda1380c2878a8caaa68f829e2087a2f9baaef8425b"}
Apr 16 19:45:03.721925 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:03.721854 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29606145-gxsnw" podStartSLOduration=1.760994293 podStartE2EDuration="3.721840291s" podCreationTimestamp="2026-04-16 19:45:00 +0000 UTC" firstStartedPulling="2026-04-16 19:45:00.791912758 +0000 UTC m=+1628.286503368" lastFinishedPulling="2026-04-16 19:45:02.752758755 +0000 UTC m=+1630.247349366" observedRunningTime="2026-04-16 19:45:03.720808395 +0000 UTC m=+1631.215399038" watchObservedRunningTime="2026-04-16 19:45:03.721840291 +0000 UTC m=+1631.216430907"
Apr 16 19:45:23.777361 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:23.777330 2582 generic.go:358] "Generic (PLEG): container finished" podID="09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a" containerID="cc8e9b1147ab2f340efa7cda1380c2878a8caaa68f829e2087a2f9baaef8425b" exitCode=6
Apr 16 19:45:23.777753 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:23.777399 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606145-gxsnw" event={"ID":"09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a","Type":"ContainerDied","Data":"cc8e9b1147ab2f340efa7cda1380c2878a8caaa68f829e2087a2f9baaef8425b"}
Apr 16 19:45:23.777753 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:23.777721 2582 scope.go:117] "RemoveContainer" containerID="cc8e9b1147ab2f340efa7cda1380c2878a8caaa68f829e2087a2f9baaef8425b"
Apr 16 19:45:24.782879 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:24.782845 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606145-gxsnw" event={"ID":"09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a","Type":"ContainerStarted","Data":"4777f4c5a3b6ef1a0f3c17c5c41660e58613f2ddea77f970732d2de1a0b9be49"}
Apr 16 19:45:44.854689 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:44.854648 2582 generic.go:358] "Generic (PLEG): container finished" podID="09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a" containerID="4777f4c5a3b6ef1a0f3c17c5c41660e58613f2ddea77f970732d2de1a0b9be49" exitCode=6
Apr 16 19:45:44.855180 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:44.854719 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606145-gxsnw" event={"ID":"09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a","Type":"ContainerDied","Data":"4777f4c5a3b6ef1a0f3c17c5c41660e58613f2ddea77f970732d2de1a0b9be49"}
Apr 16 19:45:44.855180 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:44.854760 2582 scope.go:117] "RemoveContainer" containerID="cc8e9b1147ab2f340efa7cda1380c2878a8caaa68f829e2087a2f9baaef8425b"
Apr 16 19:45:44.855180 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:44.855080 2582 scope.go:117] "RemoveContainer" containerID="4777f4c5a3b6ef1a0f3c17c5c41660e58613f2ddea77f970732d2de1a0b9be49"
Apr 16 19:45:44.855344 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:45:44.855322 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29606145-gxsnw_opendatahub(09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a)\"" pod="opendatahub/maas-api-key-cleanup-29606145-gxsnw" podUID="09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a"
Apr 16 19:45:57.110846 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:57.110816 2582 scope.go:117] "RemoveContainer" containerID="4777f4c5a3b6ef1a0f3c17c5c41660e58613f2ddea77f970732d2de1a0b9be49"
Apr 16 19:45:57.900607 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:57.900572 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606145-gxsnw" event={"ID":"09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a","Type":"ContainerStarted","Data":"7783901633ed28c32478d0a331df9a01bf09907dce706a240f770b1793d7c973"}
Apr 16 19:45:58.139166 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:58.139111 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606145-gxsnw"]
Apr 16 19:45:58.903466 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:45:58.903428 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29606145-gxsnw" podUID="09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a" containerName="cleanup" containerID="cri-o://7783901633ed28c32478d0a331df9a01bf09907dce706a240f770b1793d7c973" gracePeriod=30
Apr 16 19:46:17.848365 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:17.848339 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606145-gxsnw"
Apr 16 19:46:17.949521 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:17.949435 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-db74g\" (UniqueName: \"kubernetes.io/projected/09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a-kube-api-access-db74g\") pod \"09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a\" (UID: \"09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a\") "
Apr 16 19:46:17.951582 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:17.951544 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a-kube-api-access-db74g" (OuterVolumeSpecName: "kube-api-access-db74g") pod "09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a" (UID: "09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a"). InnerVolumeSpecName "kube-api-access-db74g". PluginName "kubernetes.io/projected", VolumeGIDValue ""
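The cleanup container above exits with code 6 on every run, so after the second failure the kubelet puts it in CrashLoopBackOff starting at 10s ("back-off 10s" in the pod_workers error). Only that first 10s step is visible in this log; by the kubelet's documented default the delay then doubles per consecutive crash up to a five-minute cap. A sketch of that schedule (the 10s base and 5m cap are defaults assumed here, not read from this log):

    package main

    import (
        "fmt"
        "time"
    )

    // crashLoopDelays returns the restart delays CrashLoopBackOff would
    // apply for n consecutive crashes: base delay doubling up to a cap.
    func crashLoopDelays(n int) []time.Duration {
        const (
            base     = 10 * time.Second // kubelet default initial backoff
            maxDelay = 5 * time.Minute  // kubelet default cap
        )
        delays := make([]time.Duration, 0, n)
        d := base
        for i := 0; i < n; i++ {
            delays = append(delays, d)
            if d *= 2; d > maxDelay {
                d = maxDelay
            }
        }
        return delays
    }

    func main() {
        fmt.Println(crashLoopDelays(6)) // [10s 20s 40s 1m20s 2m40s 5m0s]
    }

Here the loop never progresses past the first step: the pod is a CronJob-style cleanup job that gets deleted at 19:45:58, and the kubelet kills the freshly restarted container with a 30s grace period instead.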
Apr 16 19:46:17.967717 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:17.967688 2582 generic.go:358] "Generic (PLEG): container finished" podID="09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a" containerID="7783901633ed28c32478d0a331df9a01bf09907dce706a240f770b1793d7c973" exitCode=6
Apr 16 19:46:17.967833 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:17.967743 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606145-gxsnw" event={"ID":"09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a","Type":"ContainerDied","Data":"7783901633ed28c32478d0a331df9a01bf09907dce706a240f770b1793d7c973"}
Apr 16 19:46:17.967833 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:17.967753 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29606145-gxsnw"
Apr 16 19:46:17.967833 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:17.967768 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29606145-gxsnw" event={"ID":"09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a","Type":"ContainerDied","Data":"678c112c55e19558fbcd1c5b3555b01988a395ff3454bd1451f49c3c316a6d87"}
Apr 16 19:46:17.967833 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:17.967785 2582 scope.go:117] "RemoveContainer" containerID="7783901633ed28c32478d0a331df9a01bf09907dce706a240f770b1793d7c973"
Apr 16 19:46:17.978832 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:17.977407 2582 scope.go:117] "RemoveContainer" containerID="4777f4c5a3b6ef1a0f3c17c5c41660e58613f2ddea77f970732d2de1a0b9be49"
Apr 16 19:46:17.986185 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:17.986165 2582 scope.go:117] "RemoveContainer" containerID="7783901633ed28c32478d0a331df9a01bf09907dce706a240f770b1793d7c973"
Apr 16 19:46:17.986431 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:46:17.986410 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7783901633ed28c32478d0a331df9a01bf09907dce706a240f770b1793d7c973\": container with ID starting with 7783901633ed28c32478d0a331df9a01bf09907dce706a240f770b1793d7c973 not found: ID does not exist" containerID="7783901633ed28c32478d0a331df9a01bf09907dce706a240f770b1793d7c973"
Apr 16 19:46:17.986496 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:17.986439 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7783901633ed28c32478d0a331df9a01bf09907dce706a240f770b1793d7c973"} err="failed to get container status \"7783901633ed28c32478d0a331df9a01bf09907dce706a240f770b1793d7c973\": rpc error: code = NotFound desc = could not find container \"7783901633ed28c32478d0a331df9a01bf09907dce706a240f770b1793d7c973\": container with ID starting with 7783901633ed28c32478d0a331df9a01bf09907dce706a240f770b1793d7c973 not found: ID does not exist"
Apr 16 19:46:17.986496 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:17.986457 2582 scope.go:117] "RemoveContainer" containerID="4777f4c5a3b6ef1a0f3c17c5c41660e58613f2ddea77f970732d2de1a0b9be49"
Apr 16 19:46:17.986695 ip-10-0-128-123 kubenswrapper[2582]: E0416 19:46:17.986678 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4777f4c5a3b6ef1a0f3c17c5c41660e58613f2ddea77f970732d2de1a0b9be49\": container with ID starting with 4777f4c5a3b6ef1a0f3c17c5c41660e58613f2ddea77f970732d2de1a0b9be49 not found: ID does not exist" containerID="4777f4c5a3b6ef1a0f3c17c5c41660e58613f2ddea77f970732d2de1a0b9be49"
Apr 16 19:46:17.986743 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:17.986701 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4777f4c5a3b6ef1a0f3c17c5c41660e58613f2ddea77f970732d2de1a0b9be49"} err="failed to get container status \"4777f4c5a3b6ef1a0f3c17c5c41660e58613f2ddea77f970732d2de1a0b9be49\": rpc error: code = NotFound desc = could not find container \"4777f4c5a3b6ef1a0f3c17c5c41660e58613f2ddea77f970732d2de1a0b9be49\": container with ID starting with 4777f4c5a3b6ef1a0f3c17c5c41660e58613f2ddea77f970732d2de1a0b9be49 not found: ID does not exist"
Apr 16 19:46:17.998805 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:17.998784 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606145-gxsnw"]
Apr 16 19:46:18.002035 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:18.002015 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29606145-gxsnw"]
Apr 16 19:46:18.050271 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:18.050246 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-db74g\" (UniqueName: \"kubernetes.io/projected/09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a-kube-api-access-db74g\") on node \"ip-10-0-128-123.ec2.internal\" DevicePath \"\""
Apr 16 19:46:19.114749 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:19.114717 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a" path="/var/lib/kubelet/pods/09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a/volumes"
Apr 16 19:46:26.060778 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:26.060741 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:46:31.164088 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:31.164054 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:46:56.774003 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:46:56.773963 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:47:03.668446 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:47:03.668409 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:47:12.765471 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:47:12.765431 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:47:22.661635 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:47:22.661595 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:47:31.961283 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:47:31.961244 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:47:41.762488 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:47:41.762455 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:47:51.564301 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:47:51.564217 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:47:53.104130 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:47:53.104101 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/2.log"
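Both RemoveContainer follow-ups above fail with gRPC NotFound only because the containers were already removed along with the sandbox; the kubelet logs the error and moves on, effectively treating NotFound as success for a delete. A common way to write that tolerance, sketched with grpc-go's status package (removeContainer is a stand-in for a CRI RemoveContainer call, not a real client):

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // removeContainer stands in for a CRI RemoveContainer RPC; here it
    // reports NotFound, as in the log where the container is already gone.
    func removeContainer(id string) error {
        return status.Errorf(codes.NotFound, "could not find container %q", id)
    }

    // deleteIgnoringNotFound treats NotFound as success: a container that
    // no longer exists is exactly the desired end state of a delete.
    func deleteIgnoringNotFound(id string) error {
        err := removeContainer(id)
        if status.Code(err) == codes.NotFound {
            return nil
        }
        return err
    }

    func main() {
        fmt.Println(deleteIgnoringNotFound("7783901633ed")) // <nil>
    }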
Apr 16 19:47:53.107891 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:47:53.107867 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/2.log"
Apr 16 19:48:00.765477 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:48:00.765440 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:48:10.054952 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:48:10.054919 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:48:20.847814 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:48:20.847778 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:48:30.258168 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:48:30.258115 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:49:02.166751 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:49:02.166711 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:49:46.051592 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:49:46.051558 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:49:54.149015 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:49:54.148973 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:50:01.961289 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:50:01.961254 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:50:11.672673 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:50:11.672632 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:50:19.670391 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:50:19.670351 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:50:32.454184 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:50:32.454133 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:50:41.464677 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:50:41.464579 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:50:49.655338 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:50:49.655300 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:50:57.664643 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:50:57.664606 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:51:06.566924 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:51:06.566890 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:51:14.458468 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:51:14.458433 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:51:27.752449 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:51:27.752413 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:51:44.763119 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:51:44.763079 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:51:53.264065 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:51:53.264030 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:52:01.359370 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:52:01.359337 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:52:09.652988 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:52:09.652907 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:52:27.603289 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:52:27.603252 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:52:35.657603 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:52:35.657562 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:52:44.758903 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:52:44.758864 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:52:52.959061 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:52:52.959024 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:52:53.129189 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:52:53.129145 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/2.log"
Apr 16 19:52:53.134741 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:52:53.134717 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/2.log"
Apr 16 19:53:02.563439 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:53:02.563397 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:53:09.656738 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:53:09.656699 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:53:18.960481 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:53:18.960441 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:53:32.353181 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:53:32.353134 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:53:41.756010 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:53:41.755926 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:53:52.781632 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:53:52.781589 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:54:00.358461 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:54:00.358428 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:54:06.450432 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:54:06.450398 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:54:11.650789 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:54:11.650749 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:54:19.860449 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:54:19.860414 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:54:28.054423 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:54:28.054384 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:54:44.160557 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:54:44.160518 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:54:52.659143 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:54:52.659105 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:55:02.263060 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:55:02.263023 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:55:08.555337 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:55:08.555257 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:55:34.052930 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:55:34.052892 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:55:46.956448 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:55:46.956415 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7hhd5"]
Apr 16 19:55:51.942634 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:55:51.942600 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-66b64c949f-tr8wk_f8f2f20f-2ab8-4d48-bd0b-cf681c274c68/manager/0.log"
Apr 16 19:55:53.609447 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:55:53.609419 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-f2g22_2ede8d40-96b1-4398-a6ce-a0bc9a42317f/kuadrant-console-plugin/0.log"
Apr 16 19:55:53.938302 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:55:53.938228 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-7hhd5_a6a2335a-6958-4fa7-baac-35076649a576/limitador/0.log"
Apr 16 19:55:54.701803 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:55:54.701763 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-d894ddccb-g2dzt_41d447d7-da03-4891-b7de-4f79a67dd23e/kube-auth-proxy/0.log"
Apr 16 19:55:54.928351 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:55:54.928281 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5f4947fcd8-gffmg_260a217a-9aa3-43e3-9715-9255e451adff/router/0.log"
Apr 16 19:56:00.047123 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.047092 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g4x49/must-gather-rm5p8"]
Apr 16 19:56:00.047513 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.047480 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a" containerName="cleanup"
Apr 16 19:56:00.047513 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.047492 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a" containerName="cleanup"
Apr 16 19:56:00.047513 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.047507 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a" containerName="cleanup"
Apr 16 19:56:00.047513 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.047514 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a" containerName="cleanup"
Apr 16 19:56:00.047643 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.047577 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a" containerName="cleanup"
Apr 16 19:56:00.047643 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.047586 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a" containerName="cleanup"
Apr 16 19:56:00.047643 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.047594 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a" containerName="cleanup"
Apr 16 19:56:00.047730 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.047659 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a" containerName="cleanup"
Apr 16 19:56:00.047730 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.047665 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="09eed056-ccaf-4ae0-ab41-6b24ffd6fc7a" containerName="cleanup"
Apr 16 19:56:00.049774 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.049759 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g4x49/must-gather-rm5p8"
Apr 16 19:56:00.052228 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.052205 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g4x49\"/\"kube-root-ca.crt\""
Apr 16 19:56:00.052965 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.052947 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g4x49\"/\"openshift-service-ca.crt\""
Apr 16 19:56:00.053070 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.053053 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-g4x49\"/\"default-dockercfg-42kbq\""
Apr 16 19:56:00.065996 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.065975 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g4x49/must-gather-rm5p8"]
Apr 16 19:56:00.103381 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.103345 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b9e21be1-9c58-4f23-b315-ef777f76d4b4-must-gather-output\") pod \"must-gather-rm5p8\" (UID: \"b9e21be1-9c58-4f23-b315-ef777f76d4b4\") " pod="openshift-must-gather-g4x49/must-gather-rm5p8"
Apr 16 19:56:00.103541 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.103404 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl29r\" (UniqueName: \"kubernetes.io/projected/b9e21be1-9c58-4f23-b315-ef777f76d4b4-kube-api-access-zl29r\") pod \"must-gather-rm5p8\" (UID: \"b9e21be1-9c58-4f23-b315-ef777f76d4b4\") " pod="openshift-must-gather-g4x49/must-gather-rm5p8"
Apr 16 19:56:00.204489 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.204437 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b9e21be1-9c58-4f23-b315-ef777f76d4b4-must-gather-output\") pod \"must-gather-rm5p8\" (UID: \"b9e21be1-9c58-4f23-b315-ef777f76d4b4\") " pod="openshift-must-gather-g4x49/must-gather-rm5p8"
Apr 16 19:56:00.204685 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.204544 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zl29r\" (UniqueName: \"kubernetes.io/projected/b9e21be1-9c58-4f23-b315-ef777f76d4b4-kube-api-access-zl29r\") pod \"must-gather-rm5p8\" (UID: \"b9e21be1-9c58-4f23-b315-ef777f76d4b4\") " pod="openshift-must-gather-g4x49/must-gather-rm5p8"
Apr 16 19:56:00.204932 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.204900 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b9e21be1-9c58-4f23-b315-ef777f76d4b4-must-gather-output\") pod \"must-gather-rm5p8\" (UID: \"b9e21be1-9c58-4f23-b315-ef777f76d4b4\") " pod="openshift-must-gather-g4x49/must-gather-rm5p8"
Apr 16 19:56:00.216244 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.216216 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl29r\" (UniqueName: \"kubernetes.io/projected/b9e21be1-9c58-4f23-b315-ef777f76d4b4-kube-api-access-zl29r\") pod \"must-gather-rm5p8\" (UID: \"b9e21be1-9c58-4f23-b315-ef777f76d4b4\") " pod="openshift-must-gather-g4x49/must-gather-rm5p8"
Apr 16 19:56:00.358559 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.358471 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g4x49/must-gather-rm5p8"
Apr 16 19:56:00.478366 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.478342 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g4x49/must-gather-rm5p8"]
Apr 16 19:56:00.480590 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:56:00.480552 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9e21be1_9c58_4f23_b315_ef777f76d4b4.slice/crio-d0cd918592d3f5955a10e51a254245b13808af4b198756a365c929616db2d013 WatchSource:0}: Error finding container d0cd918592d3f5955a10e51a254245b13808af4b198756a365c929616db2d013: Status 404 returned error can't find the container with id d0cd918592d3f5955a10e51a254245b13808af4b198756a365c929616db2d013
Apr 16 19:56:00.482410 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.482390 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:56:00.975433 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:00.975402 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4x49/must-gather-rm5p8" event={"ID":"b9e21be1-9c58-4f23-b315-ef777f76d4b4","Type":"ContainerStarted","Data":"d0cd918592d3f5955a10e51a254245b13808af4b198756a365c929616db2d013"}
Apr 16 19:56:01.984996 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:01.984949 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4x49/must-gather-rm5p8" event={"ID":"b9e21be1-9c58-4f23-b315-ef777f76d4b4","Type":"ContainerStarted","Data":"cdc338d170c5f3f331d4729c778e510af2a38ef925244f795732c2ef16b6d324"}
Apr 16 19:56:01.984996 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:01.985001 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4x49/must-gather-rm5p8" event={"ID":"b9e21be1-9c58-4f23-b315-ef777f76d4b4","Type":"ContainerStarted","Data":"dde061a39457e6d211bfad372ec329bfb21132fcd42279a14853c6a8bb02cc23"}
Apr 16 19:56:02.001389 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:02.001322 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g4x49/must-gather-rm5p8" podStartSLOduration=1.2326255640000001 podStartE2EDuration="2.001299814s" podCreationTimestamp="2026-04-16 19:56:00 +0000 UTC" firstStartedPulling="2026-04-16 19:56:00.482526348 +0000 UTC m=+2287.977116943" lastFinishedPulling="2026-04-16 19:56:01.251200594 +0000 UTC m=+2288.745791193" observedRunningTime="2026-04-16 19:56:01.999563457 +0000 UTC m=+2289.494154076" watchObservedRunningTime="2026-04-16 19:56:02.001299814 +0000 UTC m=+2289.495890433"
Apr 16 19:56:02.890540 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:02.890507 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-fg84m_27412d9f-8c9a-4ed3-92cb-4002bafb01fa/global-pull-secret-syncer/0.log"
Apr 16 19:56:03.054945 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:03.054911 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-lfd4r_d0b93718-99a8-48ec-8713-62b20201de35/konnectivity-agent/0.log"
Apr 16 19:56:03.078612 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:03.078581 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-123.ec2.internal_4602511561889ed4b3b1e98e97d43dc5/haproxy/0.log"
19:56:07.569656 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-f2g22_2ede8d40-96b1-4398-a6ce-a0bc9a42317f/kuadrant-console-plugin/0.log" Apr 16 19:56:07.697938 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:07.697858 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-7hhd5_a6a2335a-6958-4fa7-baac-35076649a576/limitador/0.log" Apr 16 19:56:09.151043 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.151005 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1bf3a4d-389a-44ac-9ef1-cef2367546aa/alertmanager/0.log" Apr 16 19:56:09.178908 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.178876 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1bf3a4d-389a-44ac-9ef1-cef2367546aa/config-reloader/0.log" Apr 16 19:56:09.201020 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.200932 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1bf3a4d-389a-44ac-9ef1-cef2367546aa/kube-rbac-proxy-web/0.log" Apr 16 19:56:09.222454 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.222400 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1bf3a4d-389a-44ac-9ef1-cef2367546aa/kube-rbac-proxy/0.log" Apr 16 19:56:09.243493 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.243429 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1bf3a4d-389a-44ac-9ef1-cef2367546aa/kube-rbac-proxy-metric/0.log" Apr 16 19:56:09.266304 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.266274 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1bf3a4d-389a-44ac-9ef1-cef2367546aa/prom-label-proxy/0.log" Apr 16 19:56:09.289710 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.289679 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_b1bf3a4d-389a-44ac-9ef1-cef2367546aa/init-config-reloader/0.log" Apr 16 19:56:09.336441 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.336405 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-9sbws_448c73ab-a4f5-4a5c-8143-1deb13253eec/cluster-monitoring-operator/0.log" Apr 16 19:56:09.441799 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.441767 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-797486cb67-lw5s7_12ea81e7-90ab-476f-805c-836751743647/metrics-server/0.log" Apr 16 19:56:09.651352 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.651297 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xf965_5e5dbd1b-6936-4ebc-83c5-9d234738556b/node-exporter/0.log" Apr 16 19:56:09.672750 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.672718 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xf965_5e5dbd1b-6936-4ebc-83c5-9d234738556b/kube-rbac-proxy/0.log" Apr 16 19:56:09.703523 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.703447 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xf965_5e5dbd1b-6936-4ebc-83c5-9d234738556b/init-textfile/0.log" Apr 16 19:56:09.726269 ip-10-0-128-123 kubenswrapper[2582]: I0416 
19:56:09.726239 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-86pzj_fd3c9bfd-8e4a-498b-9c73-93f8b57377f5/kube-rbac-proxy-main/0.log" Apr 16 19:56:09.749653 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.749622 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-86pzj_fd3c9bfd-8e4a-498b-9c73-93f8b57377f5/kube-rbac-proxy-self/0.log" Apr 16 19:56:09.770561 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.770533 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-86pzj_fd3c9bfd-8e4a-498b-9c73-93f8b57377f5/openshift-state-metrics/0.log" Apr 16 19:56:09.807335 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.807304 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_247ec332-5c00-47e4-b12b-08a0fef6a5fe/prometheus/0.log" Apr 16 19:56:09.828312 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.828274 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_247ec332-5c00-47e4-b12b-08a0fef6a5fe/config-reloader/0.log" Apr 16 19:56:09.853097 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.853048 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_247ec332-5c00-47e4-b12b-08a0fef6a5fe/thanos-sidecar/0.log" Apr 16 19:56:09.873571 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.873545 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_247ec332-5c00-47e4-b12b-08a0fef6a5fe/kube-rbac-proxy-web/0.log" Apr 16 19:56:09.902174 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.902112 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_247ec332-5c00-47e4-b12b-08a0fef6a5fe/kube-rbac-proxy/0.log" Apr 16 19:56:09.925645 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.925616 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_247ec332-5c00-47e4-b12b-08a0fef6a5fe/kube-rbac-proxy-thanos/0.log" Apr 16 19:56:09.946012 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:09.945983 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_247ec332-5c00-47e4-b12b-08a0fef6a5fe/init-config-reloader/0.log" Apr 16 19:56:10.060073 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:10.060038 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-9d45d7bd4-hgrc5_ba226b86-8449-4c16-881e-753110579cfe/telemeter-client/0.log" Apr 16 19:56:10.084412 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:10.084384 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-9d45d7bd4-hgrc5_ba226b86-8449-4c16-881e-753110579cfe/reload/0.log" Apr 16 19:56:10.115125 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:10.115093 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-9d45d7bd4-hgrc5_ba226b86-8449-4c16-881e-753110579cfe/kube-rbac-proxy/0.log" Apr 16 19:56:11.283036 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.283009 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-pl62h_04310661-51ad-4a3b-86cf-b9a2a0d1dda1/networking-console-plugin/0.log" Apr 16 19:56:11.442060 
ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.442025 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2"] Apr 16 19:56:11.447030 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.447006 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" Apr 16 19:56:11.455621 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.455367 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2"] Apr 16 19:56:11.528340 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.528257 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fd9dca7d-bf3f-4967-bcd3-e411b4afcb93-podres\") pod \"perf-node-gather-daemonset-s6pt2\" (UID: \"fd9dca7d-bf3f-4967-bcd3-e411b4afcb93\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" Apr 16 19:56:11.528529 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.528408 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlz9b\" (UniqueName: \"kubernetes.io/projected/fd9dca7d-bf3f-4967-bcd3-e411b4afcb93-kube-api-access-xlz9b\") pod \"perf-node-gather-daemonset-s6pt2\" (UID: \"fd9dca7d-bf3f-4967-bcd3-e411b4afcb93\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" Apr 16 19:56:11.528529 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.528493 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fd9dca7d-bf3f-4967-bcd3-e411b4afcb93-proc\") pod \"perf-node-gather-daemonset-s6pt2\" (UID: \"fd9dca7d-bf3f-4967-bcd3-e411b4afcb93\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" Apr 16 19:56:11.528645 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.528534 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd9dca7d-bf3f-4967-bcd3-e411b4afcb93-sys\") pod \"perf-node-gather-daemonset-s6pt2\" (UID: \"fd9dca7d-bf3f-4967-bcd3-e411b4afcb93\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" Apr 16 19:56:11.528645 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.528578 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd9dca7d-bf3f-4967-bcd3-e411b4afcb93-lib-modules\") pod \"perf-node-gather-daemonset-s6pt2\" (UID: \"fd9dca7d-bf3f-4967-bcd3-e411b4afcb93\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" Apr 16 19:56:11.629815 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.629736 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlz9b\" (UniqueName: \"kubernetes.io/projected/fd9dca7d-bf3f-4967-bcd3-e411b4afcb93-kube-api-access-xlz9b\") pod \"perf-node-gather-daemonset-s6pt2\" (UID: \"fd9dca7d-bf3f-4967-bcd3-e411b4afcb93\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" Apr 16 19:56:11.629815 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.629794 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fd9dca7d-bf3f-4967-bcd3-e411b4afcb93-proc\") pod 
\"perf-node-gather-daemonset-s6pt2\" (UID: \"fd9dca7d-bf3f-4967-bcd3-e411b4afcb93\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" Apr 16 19:56:11.630012 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.629823 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd9dca7d-bf3f-4967-bcd3-e411b4afcb93-sys\") pod \"perf-node-gather-daemonset-s6pt2\" (UID: \"fd9dca7d-bf3f-4967-bcd3-e411b4afcb93\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" Apr 16 19:56:11.630012 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.629855 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd9dca7d-bf3f-4967-bcd3-e411b4afcb93-lib-modules\") pod \"perf-node-gather-daemonset-s6pt2\" (UID: \"fd9dca7d-bf3f-4967-bcd3-e411b4afcb93\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" Apr 16 19:56:11.630012 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.629889 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fd9dca7d-bf3f-4967-bcd3-e411b4afcb93-podres\") pod \"perf-node-gather-daemonset-s6pt2\" (UID: \"fd9dca7d-bf3f-4967-bcd3-e411b4afcb93\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" Apr 16 19:56:11.630012 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.629932 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fd9dca7d-bf3f-4967-bcd3-e411b4afcb93-proc\") pod \"perf-node-gather-daemonset-s6pt2\" (UID: \"fd9dca7d-bf3f-4967-bcd3-e411b4afcb93\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" Apr 16 19:56:11.630012 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.629951 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd9dca7d-bf3f-4967-bcd3-e411b4afcb93-sys\") pod \"perf-node-gather-daemonset-s6pt2\" (UID: \"fd9dca7d-bf3f-4967-bcd3-e411b4afcb93\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" Apr 16 19:56:11.630012 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.630013 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fd9dca7d-bf3f-4967-bcd3-e411b4afcb93-podres\") pod \"perf-node-gather-daemonset-s6pt2\" (UID: \"fd9dca7d-bf3f-4967-bcd3-e411b4afcb93\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" Apr 16 19:56:11.630257 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.630018 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd9dca7d-bf3f-4967-bcd3-e411b4afcb93-lib-modules\") pod \"perf-node-gather-daemonset-s6pt2\" (UID: \"fd9dca7d-bf3f-4967-bcd3-e411b4afcb93\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" Apr 16 19:56:11.638406 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.638365 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlz9b\" (UniqueName: \"kubernetes.io/projected/fd9dca7d-bf3f-4967-bcd3-e411b4afcb93-kube-api-access-xlz9b\") pod \"perf-node-gather-daemonset-s6pt2\" (UID: \"fd9dca7d-bf3f-4967-bcd3-e411b4afcb93\") " pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" Apr 16 19:56:11.760519 ip-10-0-128-123 kubenswrapper[2582]: 
I0416 19:56:11.760481 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" Apr 16 19:56:11.885628 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.885473 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/2.log" Apr 16 19:56:11.891323 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.891260 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kqmw8_82d1300a-6831-4de2-a99c-90a2b28f9a33/console-operator/3.log" Apr 16 19:56:11.912440 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:11.912230 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2"] Apr 16 19:56:11.917123 ip-10-0-128-123 kubenswrapper[2582]: W0416 19:56:11.916503 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfd9dca7d_bf3f_4967_bcd3_e411b4afcb93.slice/crio-029502f5d05260a9a1330df7c01149fd274ebf4a835ec3bddd2401317f3abc56 WatchSource:0}: Error finding container 029502f5d05260a9a1330df7c01149fd274ebf4a835ec3bddd2401317f3abc56: Status 404 returned error can't find the container with id 029502f5d05260a9a1330df7c01149fd274ebf4a835ec3bddd2401317f3abc56 Apr 16 19:56:12.035451 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:12.035421 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" event={"ID":"fd9dca7d-bf3f-4967-bcd3-e411b4afcb93","Type":"ContainerStarted","Data":"029502f5d05260a9a1330df7c01149fd274ebf4a835ec3bddd2401317f3abc56"} Apr 16 19:56:12.906036 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:12.906000 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-f8xvx_c4c4f849-8b71-4e5a-a7d6-079b83a72af1/volume-data-source-validator/0.log" Apr 16 19:56:13.040209 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:13.040175 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" event={"ID":"fd9dca7d-bf3f-4967-bcd3-e411b4afcb93","Type":"ContainerStarted","Data":"261dead9a19696b966a00121d24fa2008427d7950bedab7369bd1992f29f4196"} Apr 16 19:56:13.040403 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:13.040263 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" Apr 16 19:56:13.057951 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:13.057888 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" podStartSLOduration=2.057865543 podStartE2EDuration="2.057865543s" podCreationTimestamp="2026-04-16 19:56:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:56:13.054645411 +0000 UTC m=+2300.549236025" watchObservedRunningTime="2026-04-16 19:56:13.057865543 +0000 UTC m=+2300.552456162" Apr 16 19:56:13.785270 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:13.785241 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8xgc6_513c0bb7-f253-4f0d-bc14-1d473d560c39/dns/0.log" Apr 16 19:56:13.805756 ip-10-0-128-123 
kubenswrapper[2582]: I0416 19:56:13.805727 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8xgc6_513c0bb7-f253-4f0d-bc14-1d473d560c39/kube-rbac-proxy/0.log" Apr 16 19:56:13.850904 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:13.850875 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bmngp_97d2de57-ec6a-4f59-985c-24aea83be3fd/dns-node-resolver/0.log" Apr 16 19:56:14.351137 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:14.351109 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mgz4l_7caf9f3a-4884-4e15-b154-262d7a60b314/node-ca/0.log" Apr 16 19:56:15.345964 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:15.345931 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-d894ddccb-g2dzt_41d447d7-da03-4891-b7de-4f79a67dd23e/kube-auth-proxy/0.log" Apr 16 19:56:15.427061 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:15.427033 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5f4947fcd8-gffmg_260a217a-9aa3-43e3-9715-9255e451adff/router/0.log" Apr 16 19:56:15.937421 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:15.937382 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dp4cw_d82ed6e1-d7aa-4d47-bcb6-f4539431d578/serve-healthcheck-canary/0.log" Apr 16 19:56:16.405290 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:16.405265 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-ksczz_907cddc0-db0e-4159-aa65-8778fb6d6a30/insights-operator/0.log" Apr 16 19:56:16.407425 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:16.407403 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-ksczz_907cddc0-db0e-4159-aa65-8778fb6d6a30/insights-operator/1.log" Apr 16 19:56:16.493142 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:16.493108 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fhd58_ca94893d-31a8-4ecf-9f0d-fc52580b40f4/kube-rbac-proxy/0.log" Apr 16 19:56:16.513174 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:16.513118 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fhd58_ca94893d-31a8-4ecf-9f0d-fc52580b40f4/exporter/0.log" Apr 16 19:56:16.534992 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:16.534968 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fhd58_ca94893d-31a8-4ecf-9f0d-fc52580b40f4/extractor/0.log" Apr 16 19:56:18.690307 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:18.690259 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-66b64c949f-tr8wk_f8f2f20f-2ab8-4d48-bd0b-cf681c274c68/manager/0.log" Apr 16 19:56:19.052974 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:19.052949 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-g4x49/perf-node-gather-daemonset-s6pt2" Apr 16 19:56:19.986845 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:19.986809 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-66b4cb6588-bnz8x_85547f04-5633-4ab8-b014-a1e326a9ed35/manager/0.log" Apr 16 19:56:24.901502 ip-10-0-128-123 kubenswrapper[2582]: I0416 
19:56:24.901464 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-dtbpz_c83523cc-27c2-4924-9113-67ff5b311e42/kube-storage-version-migrator-operator/1.log" Apr 16 19:56:24.902875 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:24.902843 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-dtbpz_c83523cc-27c2-4924-9113-67ff5b311e42/kube-storage-version-migrator-operator/0.log" Apr 16 19:56:26.039018 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:26.038990 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j5vpx_fb965cc4-1192-4694-81d4-b4802f0b6e56/kube-multus-additional-cni-plugins/0.log" Apr 16 19:56:26.058728 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:26.058701 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j5vpx_fb965cc4-1192-4694-81d4-b4802f0b6e56/egress-router-binary-copy/0.log" Apr 16 19:56:26.078768 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:26.078741 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j5vpx_fb965cc4-1192-4694-81d4-b4802f0b6e56/cni-plugins/0.log" Apr 16 19:56:26.102929 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:26.102899 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j5vpx_fb965cc4-1192-4694-81d4-b4802f0b6e56/bond-cni-plugin/0.log" Apr 16 19:56:26.125959 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:26.125931 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j5vpx_fb965cc4-1192-4694-81d4-b4802f0b6e56/routeoverride-cni/0.log" Apr 16 19:56:26.149102 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:26.149075 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j5vpx_fb965cc4-1192-4694-81d4-b4802f0b6e56/whereabouts-cni-bincopy/0.log" Apr 16 19:56:26.168979 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:26.168951 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-j5vpx_fb965cc4-1192-4694-81d4-b4802f0b6e56/whereabouts-cni/0.log" Apr 16 19:56:26.416561 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:26.416463 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-s29hv_ff091581-6d2a-4584-b8a2-9f02cd7c342d/kube-multus/0.log" Apr 16 19:56:26.482318 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:26.482288 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lvp6d_0fa55098-1c0e-4cf5-963c-602d47a411cc/network-metrics-daemon/0.log" Apr 16 19:56:26.501796 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:26.501763 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lvp6d_0fa55098-1c0e-4cf5-963c-602d47a411cc/kube-rbac-proxy/0.log" Apr 16 19:56:27.357213 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:27.357185 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xbfq_03e41e43-a8fe-424e-85ea-c86ea5b657e4/ovn-controller/0.log" Apr 16 19:56:27.386493 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:27.386461 2582 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xbfq_03e41e43-a8fe-424e-85ea-c86ea5b657e4/ovn-acl-logging/0.log" Apr 16 19:56:27.403816 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:27.403790 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xbfq_03e41e43-a8fe-424e-85ea-c86ea5b657e4/kube-rbac-proxy-node/0.log" Apr 16 19:56:27.425143 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:27.425122 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xbfq_03e41e43-a8fe-424e-85ea-c86ea5b657e4/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 19:56:27.449667 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:27.449640 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xbfq_03e41e43-a8fe-424e-85ea-c86ea5b657e4/northd/0.log" Apr 16 19:56:27.473031 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:27.473006 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xbfq_03e41e43-a8fe-424e-85ea-c86ea5b657e4/nbdb/0.log" Apr 16 19:56:27.493560 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:27.493537 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xbfq_03e41e43-a8fe-424e-85ea-c86ea5b657e4/sbdb/0.log" Apr 16 19:56:27.602251 ip-10-0-128-123 kubenswrapper[2582]: I0416 19:56:27.602221 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4xbfq_03e41e43-a8fe-424e-85ea-c86ea5b657e4/ovnkube-controller/0.log"