Apr 20 14:24:26.312148 ip-10-0-142-166 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 14:24:26.312159 ip-10-0-142-166 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 14:24:26.312166 ip-10-0-142-166 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 14:24:26.312429 ip-10-0-142-166 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 14:24:36.533657 ip-10-0-142-166 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 14:24:36.533675 ip-10-0-142-166 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 23cfd4b839754bd8b18873fb3ff41e48 --
Apr 20 14:26:53.133856 ip-10-0-142-166 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 14:26:53.672683 ip-10-0-142-166 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:26:53.672683 ip-10-0-142-166 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 14:26:53.672683 ip-10-0-142-166 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:26:53.672683 ip-10-0-142-166 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 14:26:53.672683 ip-10-0-142-166 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:26:53.674677 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.674580 2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 14:26:53.677047 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677031 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:26:53.677047 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677047 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677051 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677055 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677061 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677073 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677077 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677081 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677084 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677088 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677092 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677095 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677098 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677101 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677104 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677106 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677109 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677111 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677114 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677117 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677119 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:26:53.677123 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677122 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677124 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677129 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677132 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677135 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677138 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677141 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677143 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677146 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677148 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677150 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677153 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677155 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677158 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677161 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677163 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677165 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677168 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677170 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677174 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:26:53.677621 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677177 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:26:53.678109 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677179 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:26:53.678109 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677181 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:26:53.678109 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677184 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:26:53.678109 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677187 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:26:53.678109 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677189 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:26:53.678109 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677191 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:26:53.678109 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677194 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:26:53.678109 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677197 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:26:53.678109 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677199 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:26:53.678109 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677201 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:26:53.678109 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677204 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:26:53.678109 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677206 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:26:53.678109 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677209 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:26:53.678109 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677213 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:26:53.678109 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677215 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:26:53.678109 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677218 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:26:53.678109 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677221 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:26:53.678109 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677223 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:26:53.678109 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677226 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677229 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677231 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677234 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677237 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677239 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677242 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677244 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677247 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677249 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677253 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677255 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677258 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677261 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677264 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677267 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677270 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677272 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677275 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677277 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:26:53.678640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677280 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677282 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677285 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677287 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677290 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677292 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677716 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677724 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677726 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677729 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677733 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677736 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677738 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677741 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677744 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677746 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677749 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677751 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677754 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677756 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:26:53.679140 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677759 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:26:53.679640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677762 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:26:53.679640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677764 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:26:53.679640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677767 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:26:53.679640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677770 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:26:53.679640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677772 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:26:53.679640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677776 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:26:53.679640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677779 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:26:53.679640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677781 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:26:53.679640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677784 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:26:53.679640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677787 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:26:53.679640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677790 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:26:53.679640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677792 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:26:53.679640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677795 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:26:53.679640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677799 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:26:53.679640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677803 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:26:53.679640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677806 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:26:53.679640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677809 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:26:53.679640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677812 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:26:53.679640 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677815 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677818 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677821 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677823 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677826 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677828 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677831 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677833 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677836 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677839 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677841 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677844 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677846 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677849 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677851 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677854 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677856 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677859 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677862 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677864 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:26:53.680134 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677868 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:26:53.680655 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677871 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:26:53.680655 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677873 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:26:53.680655 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677876 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:26:53.680655 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677879 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:26:53.680655 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677881 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:26:53.680655 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677884 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:26:53.680655 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677886 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:26:53.680655 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677888 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:26:53.680655 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677891 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:26:53.680655 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677894 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:26:53.680655 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677896 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:26:53.680655 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677899 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:26:53.680655 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677902 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:26:53.680655 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677904 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:26:53.680655 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677906 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:26:53.680655 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677909 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:26:53.680655 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677912 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:26:53.680655 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677915 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:26:53.680655 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677918 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677922 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677926 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677928 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677931 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677934 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677936 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677939 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677941 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677944 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677946 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677949 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677951 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.677954 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678035 2580 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678053 2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678066 2580 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678071 2580 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678076 2580 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678080 2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678084 2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 14:26:53.681135 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678089 2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678092 2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678095 2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678100 2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678103 2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678107 2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678110 2580 flags.go:64] FLAG: --cgroup-root=""
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678113 2580 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678116 2580 flags.go:64] FLAG: --client-ca-file=""
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678119 2580 flags.go:64] FLAG: --cloud-config=""
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678122 2580 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678129 2580 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678134 2580 flags.go:64] FLAG: --cluster-domain=""
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678137 2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678140 2580 flags.go:64] FLAG: --config-dir=""
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678143 2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678147 2580 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678152 2580 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678155 2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678158 2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678161 2580 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678164 2580 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678167 2580 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678170 2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678174 2580 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 14:26:53.681665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678176 2580 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678182 2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678185 2580 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678188 2580 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678191 2580 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678194 2580 flags.go:64] FLAG: --enable-server="true"
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678197 2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678201 2580 flags.go:64] FLAG: --event-burst="100"
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678204 2580 flags.go:64] FLAG: --event-qps="50"
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678207 2580 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678211 2580 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678214 2580 flags.go:64] FLAG: --eviction-hard=""
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678218 2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678221 2580 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678224 2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678227 2580 flags.go:64] FLAG: --eviction-soft=""
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678230 2580 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678232 2580 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678238 2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678241 2580 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678243 2580 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678246 2580 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678249 2580 flags.go:64] FLAG: --feature-gates=""
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678253 2580 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678256 2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 14:26:53.682267 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678260 2580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678263 2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678266 2580 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678270 2580 flags.go:64] FLAG: --help="false"
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678272 2580 flags.go:64] FLAG: --hostname-override="ip-10-0-142-166.ec2.internal"
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678276 2580 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678279 2580 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678282 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678286 2580 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678289 2580 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678292 2580 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678295 2580 flags.go:64] FLAG: --image-service-endpoint=""
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678298 2580 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678301 2580 flags.go:64] FLAG: --kube-api-burst="100"
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678304 2580 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678307 2580 flags.go:64] FLAG: --kube-api-qps="50"
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678310 2580 flags.go:64] FLAG: --kube-reserved=""
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678313 2580 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678316 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678320 2580 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678323 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678325 2580 flags.go:64] FLAG: --lock-file=""
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678328 2580 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678331 2580 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 20 14:26:53.682894 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678334 2580 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678341 2580 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678344 2580 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678347 2580 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678350 2580 flags.go:64] FLAG: --logging-format="text"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678353 2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678356 2580 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678359 2580 flags.go:64] FLAG: --manifest-url=""
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678362 2580 flags.go:64] FLAG: --manifest-url-header=""
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678367 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678370 2580 flags.go:64] FLAG: --max-open-files="1000000"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678374 2580 flags.go:64] FLAG: --max-pods="110"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678377 2580 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678380 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678383 2580 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678386 2580 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678390 2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678393 2580 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678396 2580 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678404 2580 flags.go:64] FLAG: --node-status-max-images="50"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678408 2580 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678411 2580 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678414 2580 flags.go:64] FLAG: --pod-cidr=""
Apr 20 14:26:53.683540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678417 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678423 2580 flags.go:64] FLAG: --pod-manifest-path=""
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678426 2580 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678430 2580 flags.go:64] FLAG: --pods-per-core="0"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678433 2580 flags.go:64] FLAG: --port="10250"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678436 2580 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678439 2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0736ddd18c7305246"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678442 2580 flags.go:64] FLAG: --qos-reserved=""
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678445 2580 flags.go:64] FLAG: --read-only-port="10255"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678448 2580 flags.go:64] FLAG: --register-node="true"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678452 2580 flags.go:64] FLAG: --register-schedulable="true"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678455 2580 flags.go:64] FLAG: --register-with-taints=""
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678459 2580 flags.go:64] FLAG: --registry-burst="10"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678462 2580 flags.go:64] FLAG: --registry-qps="5"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678464 2580 flags.go:64] FLAG: --reserved-cpus=""
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678467 2580 flags.go:64] FLAG: --reserved-memory=""
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678471 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678474 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678477 2580 flags.go:64] FLAG: --rotate-certificates="false"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678480 2580 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678483 2580 flags.go:64] FLAG: --runonce="false"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678486 2580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678490 2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678493 2580 flags.go:64] FLAG: --seccomp-default="false"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678508 2580 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678511 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 20 14:26:53.684138 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678515 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678518 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678522 2580 flags.go:64] FLAG: --storage-driver-password="root"
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678525 2580 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678528 2580 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678532 2580 flags.go:64] FLAG: --storage-driver-user="root"
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678535 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678538 2580 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678541 2580 flags.go:64] FLAG: --system-cgroups=""
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678544 2580 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678549 2580 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678552 2580 flags.go:64] FLAG: --tls-cert-file=""
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678555 2580 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678560 2580 flags.go:64] FLAG: --tls-min-version=""
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678563 2580 flags.go:64] FLAG: --tls-private-key-file=""
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678566 2580 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678574 2580 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678578 2580 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678581 2580 flags.go:64] FLAG: --v="2"
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678585 2580 flags.go:64] FLAG: --version="false"
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678589 2580 flags.go:64] FLAG: --vmodule=""
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678594 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678597 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678693 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:26:53.684785 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678697 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678700 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678703 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678706 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678708 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678711 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678713 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678716 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678723 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678725 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678728 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678732 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678735 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678737 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678740 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678743 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678746 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678748 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678751 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678753 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:26:53.685463 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678756 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678758 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678761 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678764 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678768 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678770 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678773 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678776 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678778 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678781 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678784 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678786 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678789 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678792 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678794 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678796 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678799 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678801 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678804 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678807 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:26:53.686005 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678811 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:26:53.686516 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678813 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:26:53.686516 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678818 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:26:53.686516 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678821 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:26:53.686516 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678825 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:26:53.686516 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678828 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:26:53.686516 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678831 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:26:53.686516 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678834 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:26:53.686516 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678837 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:26:53.686516 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678840 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:26:53.686516 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678842 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:26:53.686516 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678845 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:26:53.686516 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678847 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:26:53.686516 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678850 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:26:53.686516 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678853 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:26:53.686516 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678855 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:26:53.686516 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678859 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:26:53.686516 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678862 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:26:53.686516 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678864 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:26:53.686516 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678867 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678869 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678872 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678875 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678877 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678880 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678882 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678885 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678887 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678890 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678893 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678896 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678898 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678902 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678905 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678908 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678911 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678914 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678917 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678919 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:26:53.687034 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678922 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:26:53.687579 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678924 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:26:53.687579 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678927 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:26:53.687579 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678931 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:26:53.687579 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678934 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:26:53.687579 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.678937 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:26:53.687579 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.678946 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 14:26:53.687579 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.685430 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 14:26:53.687579 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.685447 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 14:26:53.687579 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685508 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:26:53.687579 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685514 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:26:53.687579 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685517 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:26:53.687579 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685521 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:26:53.687579 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685524 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:26:53.687579 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685526 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:26:53.687579 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685529 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:26:53.687579 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685532 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:26:53.687981 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685535 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:26:53.687981 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685537 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:26:53.687981 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685540 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:26:53.687981 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685543 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:26:53.687981 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685545 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:26:53.687981 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685548 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:26:53.687981 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685551 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:26:53.687981 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685554 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:26:53.687981 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685556 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:26:53.687981 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685559 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:26:53.687981 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685561 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:26:53.687981 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685564 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:26:53.687981 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685567 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:26:53.687981 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685570 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:26:53.687981 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685572 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:26:53.687981 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685574 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:26:53.687981 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685577 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:26:53.687981 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685580 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:26:53.687981 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685584 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:26:53.688450 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685588 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:26:53.688450 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685592 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:26:53.688450 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685594 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:26:53.688450 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685597 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:26:53.688450 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685603 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:26:53.688450 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685607 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:26:53.688450 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685609 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:26:53.688450 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685612 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:26:53.688450 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685615 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:26:53.688450 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685618 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:26:53.688450 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685620 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:26:53.688450 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685623 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:26:53.688450 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685626 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:26:53.688450 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685628 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:26:53.688450 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685631 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:26:53.688450 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685633 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:26:53.688450 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685636 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:26:53.688450 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685638 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:26:53.688450 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685641 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:26:53.688947 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685644 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:26:53.688947 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685647 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:26:53.688947 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685649 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:26:53.688947 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685652 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:26:53.688947 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685654 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:26:53.688947 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685657 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:26:53.688947 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685659 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:26:53.688947 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685662 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:26:53.688947 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685664 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:26:53.688947 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685667 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:26:53.688947 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685670 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:26:53.688947 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685672 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:26:53.688947 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685675 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:26:53.688947 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685678 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:26:53.688947 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685680 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:26:53.688947 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685683 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:26:53.688947 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685685 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:26:53.688947 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685688 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:26:53.688947 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685692 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685694 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685697 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685700 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685702 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685705 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685707 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685710 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685713 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685716 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685718 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685721 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685723 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685726 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685728 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685731 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685733 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685736 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685739 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685741 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:26:53.689427 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685744 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:26:53.689935 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685746 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:26:53.689935 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.685751 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 14:26:53.689935 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685854 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:26:53.689935 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685860 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:26:53.689935 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685863 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:26:53.689935 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685866 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:26:53.689935 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685869 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:26:53.689935 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685873 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:26:53.689935 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685877 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:26:53.689935 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685880 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:26:53.689935 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685883 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:26:53.689935 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685887 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:26:53.689935 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685890 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:26:53.689935 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685893 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:26:53.689935 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685896 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:26:53.690309 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685899 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:26:53.690309 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685902 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:26:53.690309 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685904 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:26:53.690309 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685907 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:26:53.690309 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685909 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:26:53.690309 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685913 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:26:53.690309 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685916 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:26:53.690309 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685920 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:26:53.690309 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685922 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:26:53.690309 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685925 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:26:53.690309 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685927 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:26:53.690309 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685930 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:26:53.690309 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685932 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:26:53.690309 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685935 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:26:53.690309 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685937 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:26:53.690309 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685940 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:26:53.690309 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685943 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:26:53.690309 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685946 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:26:53.690309 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685948 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685951 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685954 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685957 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685959 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685962 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685965 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685967 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685970 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685973 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685975 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685978 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685981 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685984 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685986 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685989 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685991 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685994 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685996 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.685999 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:26:53.690847 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686001 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686004 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686006 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686009 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686012 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686015 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686018 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686021 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686023 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686026 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686029 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686032 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686034 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686037 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686039 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686042 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686044 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686047 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686049 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686052 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:26:53.691343 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686055 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:26:53.691906 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686057 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:26:53.691906 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686060 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:26:53.691906 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686062 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:26:53.691906 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686065 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:26:53.691906 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686068 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:26:53.691906 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686070 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:26:53.691906 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686073 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:26:53.691906 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686075 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:26:53.691906 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686078 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:26:53.691906 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686081 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:26:53.691906 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686083 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:26:53.691906 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686086 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:26:53.691906 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686088 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:26:53.691906 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:53.686091 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:26:53.691906 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.686096 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 14:26:53.691906 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.687171 2580 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 14:26:53.692318 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.690447 2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 14:26:53.692318 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.691486 2580 server.go:1019] "Starting client certificate rotation"
Apr 20 14:26:53.692318 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.691596 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 14:26:53.692318 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.691629 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 14:26:53.719354 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.719331 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 14:26:53.726117 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.726093 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 14:26:53.745111 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.745082 2580 log.go:25] "Validated CRI v1 runtime API"
Apr 20 14:26:53.751784 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.751759 2580 log.go:25] "Validated CRI v1 image API"
Apr 20 14:26:53.751904 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.751776 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 14:26:53.753118 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.753100 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 14:26:53.758373 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.758343 2580 fs.go:135] Filesystem UUIDs: map[03ea84d1-1df3-47f7-94c9-1d1a3bfe1068:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 c7e9ceee-011c-4f88-ab02-02524354b351:/dev/nvme0n1p3]
Apr 20 14:26:53.758465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.758372 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 14:26:53.764441 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.764318 2580 manager.go:217] Machine: {Timestamp:2026-04-20 14:26:53.762212444 +0000 UTC m=+0.482513867 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099748 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d8f613a7f4636edfdfaa9f6f61a27 SystemUUID:ec2d8f61-3a7f-4636-edfd-faa9f6f61a27 BootID:23cfd4b8-3975-4bd8-b188-73fb3ff41e48 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:2c:1c:e4:37:01 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:2c:1c:e4:37:01 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:26:43:9c:50:40:d1 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 14:26:53.764441 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.764435 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 14:26:53.764575 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.764537 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 14:26:53.765769 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.765741 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 14:26:53.765924 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.765771 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-166.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 14:26:53.765971 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.765933 2580 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 14:26:53.765971 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.765942 2580 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 14:26:53.765971 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.765960 2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 14:26:53.766055 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.765976 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 14:26:53.767951 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.767939 2580 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 14:26:53.768064 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.768055 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 14:26:53.771291 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.771278 2580 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 14:26:53.771337 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.771296 2580 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 14:26:53.771337 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.771308 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 14:26:53.771337 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.771319 2580 kubelet.go:397] "Adding apiserver pod source"
Apr 20 14:26:53.771337 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.771328 2580 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 14:26:53.772533 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.772519 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 14:26:53.772583 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.772542 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 14:26:53.776074 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.776053 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 14:26:53.777818 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.777805 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 14:26:53.780463 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.780449 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 14:26:53.780546 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.780467 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 14:26:53.780546 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.780474 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 14:26:53.780546 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.780480 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 14:26:53.780546 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.780488 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 14:26:53.780546 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.780508 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 14:26:53.780546 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.780519 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 14:26:53.780546 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.780527 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 14:26:53.780546 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.780536 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 14:26:53.780546 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.780542 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 14:26:53.780775 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.780560 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 14:26:53.780775 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.780570 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 14:26:53.781461 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.781447 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 14:26:53.781461 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.781459 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 14:26:53.784669 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:53.784617 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-166.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 14:26:53.784770 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:53.784671 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 14:26:53.784958 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.784938 2580 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-166.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 14:26:53.786408 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.786389 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 14:26:53.786517 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.786439 2580 server.go:1295] "Started kubelet"
Apr 20 14:26:53.786576 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.786485 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 14:26:53.786648 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.786600 2580 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 14:26:53.786697 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.786668 2580 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 14:26:53.787426 ip-10-0-142-166 systemd[1]: Started Kubernetes Kubelet.
Apr 20 14:26:53.788772 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.788758 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 14:26:53.789973 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.789957 2580 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 14:26:53.792902 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:53.791655 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-166.ec2.internal.18a816e233e017dc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-166.ec2.internal,UID:ip-10-0-142-166.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-166.ec2.internal,},FirstTimestamp:2026-04-20 14:26:53.786404828 +0000 UTC m=+0.506706250,LastTimestamp:2026-04-20 14:26:53.786404828 +0000 UTC m=+0.506706250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-166.ec2.internal,}"
Apr 20 14:26:53.796519 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.796484 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-88pb9"
Apr 20 14:26:53.796659 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.796643 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 14:26:53.798433 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.798089 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 14:26:53.798433 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:53.798145 2580 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 14:26:53.799173 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.799155 2580 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 14:26:53.799173 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.799175 2580 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 14:26:53.799288 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.799278 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 14:26:53.799395 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.799384 2580 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 14:26:53.799395 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.799394 2580 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 14:26:53.799514 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:53.799441 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found"
Apr 20 14:26:53.799854 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.799835 2580 factory.go:55] Registering systemd factory
Apr 20 14:26:53.799949 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.799861 2580 factory.go:223] Registration of the systemd container factory successfully
Apr 20 14:26:53.800086 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.800072 2580 factory.go:153] Registering CRI-O factory
Apr 20 14:26:53.800149 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.800091 2580 factory.go:223] Registration of the crio container factory successfully
Apr 20 14:26:53.800149 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.800145 2580 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 14:26:53.800237 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.800183 2580 factory.go:103] Registering Raw factory
Apr 20 14:26:53.800237 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.800200 2580 manager.go:1196] Started watching for new ooms in manager
Apr 20 14:26:53.800563 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.800552 2580 manager.go:319] Starting recovery of all containers
Apr 20 14:26:53.804243 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.804222 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-88pb9"
Apr 20 14:26:53.810393 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.810227 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 14:26:53.812015 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.811998 2580 manager.go:324] Recovery completed
Apr 20 14:26:53.813165 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:53.813144 2580 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-142-166.ec2.internal\" not found" node="ip-10-0-142-166.ec2.internal"
Apr 20 14:26:53.816255 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.816243 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:26:53.819037 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.819023 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:26:53.819096 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.819052 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:26:53.819096 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.819063 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:26:53.819605 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.819589 2580 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 14:26:53.819605 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.819603 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 14:26:53.819714 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.819619 2580 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 14:26:53.822698 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.822687 2580 policy_none.go:49] "None policy: Start"
Apr 20 14:26:53.822744 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.822703 2580 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 14:26:53.822744 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.822712 2580 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 14:26:53.874347 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.874325 2580 manager.go:341] "Starting Device Plugin manager"
Apr 20 14:26:53.879257 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:53.874366 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 14:26:53.879257 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.874379 2580 server.go:85] "Starting device plugin registration server"
Apr 20 14:26:53.879257 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.874691 2580 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 14:26:53.879257 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.874705 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 14:26:53.879257 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.874807 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 14:26:53.879257 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.874918 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 14:26:53.879257 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.874927 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 14:26:53.879257 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:53.875481 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 14:26:53.879257 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:53.875542 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-166.ec2.internal\" not found"
Apr 20 14:26:53.925822 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.925737 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 14:26:53.927012 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.926982 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 14:26:53.927125 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.927016 2580 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 14:26:53.927125 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.927040 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 14:26:53.927125 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.927049 2580 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 14:26:53.927125 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:53.927091 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 14:26:53.929530 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.929493 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 14:26:53.975246 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.975206 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:26:53.976239 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.976220 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:26:53.976347 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.976252 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:26:53.976347 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.976263 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:26:53.976347 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.976288 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-166.ec2.internal"
Apr 20 14:26:53.984474 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:53.984458 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-166.ec2.internal"
Apr 20 14:26:53.984537 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:53.984483 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-166.ec2.internal\": node \"ip-10-0-142-166.ec2.internal\" not found"
Apr 20 14:26:54.013807 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:54.013781 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found"
Apr 20 14:26:54.027994 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.027936 2580 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal"]
Apr 20 14:26:54.028066 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.028046 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:26:54.029768 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.029751 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:26:54.029857 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.029780 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:26:54.029857 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.029791 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:26:54.031115 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.031104 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:26:54.031269 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.031255 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal"
Apr 20 14:26:54.031318 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.031283 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:26:54.031895 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.031877 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:26:54.031895 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.031889 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:26:54.032043 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.031906 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:26:54.032043 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.031906 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:26:54.032043 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.031921 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:26:54.032043 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.031929 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:26:54.033595 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.033580 2580 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal" Apr 20 14:26:54.033681 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.033603 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 14:26:54.034262 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.034243 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientMemory" Apr 20 14:26:54.034347 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.034278 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 14:26:54.034347 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.034293 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeHasSufficientPID" Apr 20 14:26:54.056264 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:54.056243 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-166.ec2.internal\" not found" node="ip-10-0-142-166.ec2.internal" Apr 20 14:26:54.060790 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:54.060774 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-166.ec2.internal\" not found" node="ip-10-0-142-166.ec2.internal" Apr 20 14:26:54.100848 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.100819 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b77815b0bb4108444a9fb656311ac214-config\") pod \"kube-apiserver-proxy-ip-10-0-142-166.ec2.internal\" (UID: \"b77815b0bb4108444a9fb656311ac214\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal" Apr 20 14:26:54.100848 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.100849 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/018517f7eeb11abdb65728aa3a7d830b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal\" (UID: \"018517f7eeb11abdb65728aa3a7d830b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" Apr 20 14:26:54.101007 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.100868 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/018517f7eeb11abdb65728aa3a7d830b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal\" (UID: \"018517f7eeb11abdb65728aa3a7d830b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" Apr 20 14:26:54.114018 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:54.113994 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found" Apr 20 14:26:54.201912 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.201847 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/018517f7eeb11abdb65728aa3a7d830b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal\" (UID: \"018517f7eeb11abdb65728aa3a7d830b\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" Apr 20 14:26:54.201912 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.201883 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b77815b0bb4108444a9fb656311ac214-config\") pod \"kube-apiserver-proxy-ip-10-0-142-166.ec2.internal\" (UID: \"b77815b0bb4108444a9fb656311ac214\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal" Apr 20 14:26:54.201912 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.201909 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/018517f7eeb11abdb65728aa3a7d830b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal\" (UID: \"018517f7eeb11abdb65728aa3a7d830b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" Apr 20 14:26:54.202055 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.201934 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/018517f7eeb11abdb65728aa3a7d830b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal\" (UID: \"018517f7eeb11abdb65728aa3a7d830b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" Apr 20 14:26:54.202055 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.201935 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/018517f7eeb11abdb65728aa3a7d830b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal\" (UID: \"018517f7eeb11abdb65728aa3a7d830b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" Apr 20 14:26:54.202055 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.201953 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b77815b0bb4108444a9fb656311ac214-config\") pod \"kube-apiserver-proxy-ip-10-0-142-166.ec2.internal\" (UID: \"b77815b0bb4108444a9fb656311ac214\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal" Apr 20 14:26:54.214964 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:54.214917 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found" Apr 20 14:26:54.315715 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:54.315680 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found" Apr 20 14:26:54.358888 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.358863 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" Apr 20 14:26:54.363543 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.363516 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal" Apr 20 14:26:54.415827 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:54.415790 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found" Apr 20 14:26:54.516352 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:54.516271 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found" Apr 20 14:26:54.616814 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:54.616759 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found" Apr 20 14:26:54.691098 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.691059 2580 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 14:26:54.691759 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.691222 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 14:26:54.691759 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.691277 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 14:26:54.717540 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:54.717505 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found" Apr 20 14:26:54.796841 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.796771 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 14:26:54.806206 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.806179 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 14:26:54.807584 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.807552 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 14:21:53 +0000 UTC" deadline="2027-11-03 04:46:52.042357707 +0000 UTC" Apr 20 14:26:54.807584 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.807584 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13478h19m57.234777318s" Apr 20 14:26:54.817875 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:54.817843 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found" Apr 20 14:26:54.828882 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.828861 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-m2bcq" Apr 20 14:26:54.835813 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.835790 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-m2bcq" Apr 20 14:26:54.840089 ip-10-0-142-166 kubenswrapper[2580]: W0420 
14:26:54.840052 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod018517f7eeb11abdb65728aa3a7d830b.slice/crio-3b40d4b8cc718e6155b4f4c8d58eae04716f29bd6a9001e84dd5ce587bde19b2 WatchSource:0}: Error finding container 3b40d4b8cc718e6155b4f4c8d58eae04716f29bd6a9001e84dd5ce587bde19b2: Status 404 returned error can't find the container with id 3b40d4b8cc718e6155b4f4c8d58eae04716f29bd6a9001e84dd5ce587bde19b2 Apr 20 14:26:54.844837 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.844821 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 14:26:54.880097 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:54.880057 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb77815b0bb4108444a9fb656311ac214.slice/crio-c2b943240218425d51da5ca5421071f6a214bb7e276e5a12d8221563866af8b1 WatchSource:0}: Error finding container c2b943240218425d51da5ca5421071f6a214bb7e276e5a12d8221563866af8b1: Status 404 returned error can't find the container with id c2b943240218425d51da5ca5421071f6a214bb7e276e5a12d8221563866af8b1 Apr 20 14:26:54.918109 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:54.918054 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-166.ec2.internal\" not found" Apr 20 14:26:54.930050 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.930004 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal" event={"ID":"b77815b0bb4108444a9fb656311ac214","Type":"ContainerStarted","Data":"c2b943240218425d51da5ca5421071f6a214bb7e276e5a12d8221563866af8b1"} Apr 20 14:26:54.930958 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.930934 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" event={"ID":"018517f7eeb11abdb65728aa3a7d830b","Type":"ContainerStarted","Data":"3b40d4b8cc718e6155b4f4c8d58eae04716f29bd6a9001e84dd5ce587bde19b2"} Apr 20 14:26:54.933019 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:54.933005 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:26:55.000256 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.000221 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" Apr 20 14:26:55.014236 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.014210 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 14:26:55.015966 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.015951 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal" Apr 20 14:26:55.023777 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.023757 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 14:26:55.192588 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.192489 2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:26:55.591127 
ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.591041 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:26:55.772926 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.772891 2580 apiserver.go:52] "Watching apiserver" Apr 20 14:26:55.778281 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.778211 2580 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 14:26:55.778754 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.778725 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-xhmdf","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw","openshift-cluster-node-tuning-operator/tuned-5fdfv","openshift-dns/node-resolver-mw9b9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal","openshift-multus/multus-ct824","openshift-multus/network-metrics-daemon-qsqks","openshift-ovn-kubernetes/ovnkube-node-s4vzx","kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal","openshift-image-registry/node-ca-2rnmj","openshift-multus/multus-additional-cni-plugins-dmfdx","openshift-network-diagnostics/network-check-target-fgfxx","openshift-network-operator/iptables-alerter-2rbh6"] Apr 20 14:26:55.781321 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.781295 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:26:55.781437 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:55.781390 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qsqks" podUID="db79a290-5377-45f9-bb87-89588231d8a7" Apr 20 14:26:55.782657 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.782630 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw" Apr 20 14:26:55.784017 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.783996 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5fdfv" Apr 20 14:26:55.786433 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.784942 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 14:26:55.786433 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.784948 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 14:26:55.786433 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.784946 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 14:26:55.786433 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.784947 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rfgjt\"" Apr 20 14:26:55.786433 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.785356 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-mw9b9" Apr 20 14:26:55.786433 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.785963 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 14:26:55.786433 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.786011 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:26:55.786433 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.786286 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-25znw\"" Apr 20 14:26:55.787372 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.787351 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 14:26:55.787470 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.787434 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 14:26:55.787750 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.787659 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qhcng\"" Apr 20 14:26:55.788359 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.788225 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ct824" Apr 20 14:26:55.788359 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.788354 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xhmdf" Apr 20 14:26:55.790244 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.790220 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 14:26:55.790465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.790444 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mp542\"" Apr 20 14:26:55.790729 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.790712 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-bwrw8\"" Apr 20 14:26:55.790805 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.790758 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 14:26:55.790805 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.790767 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 14:26:55.790937 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.790920 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 14:26:55.790997 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.790967 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 14:26:55.790997 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.790985 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 14:26:55.797519 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.797482 2580 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" Apr 20 14:26:55.797631 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.797482 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2rnmj" Apr 20 14:26:55.799178 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.799155 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dmfdx" Apr 20 14:26:55.800285 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.800266 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 14:26:55.800383 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.800365 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 14:26:55.800643 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.800509 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-lslm2\"" Apr 20 14:26:55.800643 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.800548 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 14:26:55.800643 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.800553 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 14:26:55.801388 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.800837 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 14:26:55.801388 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.800907 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 14:26:55.801388 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.801016 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 14:26:55.801388 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.801071 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 14:26:55.801388 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.801245 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-mgnbv\"" Apr 20 14:26:55.802564 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.801848 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 14:26:55.802564 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.802052 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 14:26:55.802564 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.802165 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 14:26:55.802564 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.802234 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mmqxz\"" Apr 20 14:26:55.803942 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.803105 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2rbh6" Apr 20 14:26:55.803942 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.803454 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:26:55.803942 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:55.803534 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fgfxx" podUID="35f35192-c0aa-4a62-b812-0f0be61d0f8e" Apr 20 14:26:55.805266 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.804985 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:26:55.805604 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.805466 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8skvl\"" Apr 20 14:26:55.805731 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.805712 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 14:26:55.805976 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.805712 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 14:26:55.809368 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809302 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cch5v\" (UniqueName: \"kubernetes.io/projected/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-kube-api-access-cch5v\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824" Apr 20 14:26:55.809368 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809337 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c30eab0-ce99-4717-9b34-99ab3a10543c-system-cni-dir\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx" Apr 20 14:26:55.809528 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809365 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbs6n\" (UniqueName: \"kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n\") pod \"network-check-target-fgfxx\" (UID: \"35f35192-c0aa-4a62-b812-0f0be61d0f8e\") " pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:26:55.809528 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809397 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-ovnkube-config\") pod \"ovnkube-node-s4vzx\" (UID: 
\"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" Apr 20 14:26:55.809528 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809423 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-os-release\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824" Apr 20 14:26:55.809528 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809447 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-multus-socket-dir-parent\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824" Apr 20 14:26:55.809528 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809472 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/236542ea-fda7-4b96-ae9e-dd685e15e5ef-hosts-file\") pod \"node-resolver-mw9b9\" (UID: \"236542ea-fda7-4b96-ae9e-dd685e15e5ef\") " pod="openshift-dns/node-resolver-mw9b9" Apr 20 14:26:55.809528 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809512 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-host-run-k8s-cni-cncf-io\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824" Apr 20 14:26:55.809804 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809538 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-host-var-lib-cni-multus\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824" Apr 20 14:26:55.809804 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809562 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c2fe8871-85d4-447d-a6ca-09895bb1faae-konnectivity-ca\") pod \"konnectivity-agent-xhmdf\" (UID: \"c2fe8871-85d4-447d-a6ca-09895bb1faae\") " pod="kube-system/konnectivity-agent-xhmdf" Apr 20 14:26:55.809804 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809615 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-cni-bin\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" Apr 20 14:26:55.809804 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809661 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rm2v\" (UniqueName: \"kubernetes.io/projected/175a3efc-4aa3-4f7d-ac63-bb40b7cd457b-kube-api-access-9rm2v\") pod \"node-ca-2rnmj\" (UID: \"175a3efc-4aa3-4f7d-ac63-bb40b7cd457b\") " pod="openshift-image-registry/node-ca-2rnmj" Apr 20 14:26:55.809804 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809689 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-systemd\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv" Apr 20 14:26:55.809804 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809712 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-tmp\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv" Apr 20 14:26:55.809804 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809734 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-host-run-multus-certs\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824" Apr 20 14:26:55.809804 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809760 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/236542ea-fda7-4b96-ae9e-dd685e15e5ef-tmp-dir\") pod \"node-resolver-mw9b9\" (UID: \"236542ea-fda7-4b96-ae9e-dd685e15e5ef\") " pod="openshift-dns/node-resolver-mw9b9" Apr 20 14:26:55.810183 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809804 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-ovnkube-script-lib\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" Apr 20 14:26:55.810183 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809836 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/175a3efc-4aa3-4f7d-ac63-bb40b7cd457b-host\") pod \"node-ca-2rnmj\" (UID: \"175a3efc-4aa3-4f7d-ac63-bb40b7cd457b\") " pod="openshift-image-registry/node-ca-2rnmj" Apr 20 14:26:55.810183 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809865 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cznq\" (UniqueName: \"kubernetes.io/projected/3c30eab0-ce99-4717-9b34-99ab3a10543c-kube-api-access-6cznq\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx" Apr 20 14:26:55.810183 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809891 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-run-ovn\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" Apr 20 14:26:55.810183 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809918 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-env-overrides\") pod \"ovnkube-node-s4vzx\" (UID: 
\"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" Apr 20 14:26:55.810183 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809944 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzn2l\" (UniqueName: \"kubernetes.io/projected/db79a290-5377-45f9-bb87-89588231d8a7-kube-api-access-wzn2l\") pod \"network-metrics-daemon-qsqks\" (UID: \"db79a290-5377-45f9-bb87-89588231d8a7\") " pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:26:55.810183 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.809976 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-hostroot\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824" Apr 20 14:26:55.810183 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810000 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c30eab0-ce99-4717-9b34-99ab3a10543c-os-release\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx" Apr 20 14:26:55.810183 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810027 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c30eab0-ce99-4717-9b34-99ab3a10543c-cni-binary-copy\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx" Apr 20 14:26:55.810183 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810049 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-modprobe-d\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv" Apr 20 14:26:55.810183 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810071 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjlfd\" (UniqueName: \"kubernetes.io/projected/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-kube-api-access-tjlfd\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv" Apr 20 14:26:55.810183 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810093 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-multus-cni-dir\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824" Apr 20 14:26:55.810183 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810116 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-multus-conf-dir\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824" Apr 20 14:26:55.810183 ip-10-0-142-166 kubenswrapper[2580]: I0420 
14:26:55.810152 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-etc-openvswitch\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" Apr 20 14:26:55.810838 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810191 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-host-var-lib-kubelet\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824" Apr 20 14:26:55.810838 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810232 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-cni-netd\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" Apr 20 14:26:55.810838 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810281 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-sys\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv" Apr 20 14:26:55.810838 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810309 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-system-cni-dir\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824" Apr 20 14:26:55.810838 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810344 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twxvc\" (UniqueName: \"kubernetes.io/projected/236542ea-fda7-4b96-ae9e-dd685e15e5ef-kube-api-access-twxvc\") pod \"node-resolver-mw9b9\" (UID: \"236542ea-fda7-4b96-ae9e-dd685e15e5ef\") " pod="openshift-dns/node-resolver-mw9b9" Apr 20 14:26:55.810838 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810374 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-kubelet\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" Apr 20 14:26:55.810838 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810400 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-systemd-units\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" Apr 20 14:26:55.810838 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810423 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-run-openvswitch\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" Apr 20 14:26:55.810838 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810448 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-ovn-node-metrics-cert\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" Apr 20 14:26:55.810838 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810477 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-run\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv" Apr 20 14:26:55.810838 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810529 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-host\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv" Apr 20 14:26:55.810838 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810563 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-etc-kubernetes\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824" Apr 20 14:26:55.810838 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810587 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-node-log\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" Apr 20 14:26:55.810838 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810612 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/308d3703-6cc2-4cc6-a2e4-187390a83535-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw" Apr 20 14:26:55.810838 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810635 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-sysconfig\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv" Apr 20 14:26:55.810838 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810657 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-host-var-lib-cni-bin\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824" Apr 20 
14:26:55.810838 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810700 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/175a3efc-4aa3-4f7d-ac63-bb40b7cd457b-serviceca\") pod \"node-ca-2rnmj\" (UID: \"175a3efc-4aa3-4f7d-ac63-bb40b7cd457b\") " pod="openshift-image-registry/node-ca-2rnmj"
Apr 20 14:26:55.811514 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810723 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/308d3703-6cc2-4cc6-a2e4-187390a83535-device-dir\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.811514 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810746 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/308d3703-6cc2-4cc6-a2e4-187390a83535-sys-fs\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.811514 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810769 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-sysctl-d\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.811514 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810791 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-tuned\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.811514 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810814 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-cni-binary-copy\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.811514 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810837 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-multus-daemon-config\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.811514 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810863 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c30eab0-ce99-4717-9b34-99ab3a10543c-cnibin\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.811514 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810898 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-slash\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.811514 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810921 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-var-lib-openvswitch\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.811514 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.810943 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-log-socket\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.811514 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811006 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c2fe8871-85d4-447d-a6ca-09895bb1faae-agent-certs\") pod \"konnectivity-agent-xhmdf\" (UID: \"c2fe8871-85d4-447d-a6ca-09895bb1faae\") " pod="kube-system/konnectivity-agent-xhmdf"
Apr 20 14:26:55.811514 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811040 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-kubernetes\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.811514 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811065 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-var-lib-kubelet\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.811514 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811101 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c30eab0-ce99-4717-9b34-99ab3a10543c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.811514 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811132 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3c30eab0-ce99-4717-9b34-99ab3a10543c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.811514 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811176 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-run-netns\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.812094 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811202 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-run-ovn-kubernetes\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.812094 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811227 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/308d3703-6cc2-4cc6-a2e4-187390a83535-registration-dir\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.812094 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811254 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/308d3703-6cc2-4cc6-a2e4-187390a83535-etc-selinux\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.812094 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811275 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-sysctl-conf\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.812094 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811297 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-host-run-netns\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.812094 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811353 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3c30eab0-ce99-4717-9b34-99ab3a10543c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.812094 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811387 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-lib-modules\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.812094 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811410 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-cnibin\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.812094 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811433 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs\") pod \"network-metrics-daemon-qsqks\" (UID: \"db79a290-5377-45f9-bb87-89588231d8a7\") " pod="openshift-multus/network-metrics-daemon-qsqks"
Apr 20 14:26:55.812094 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811455 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-run-systemd\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.812094 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811480 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.812094 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811522 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl8vk\" (UniqueName: \"kubernetes.io/projected/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-kube-api-access-rl8vk\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.812094 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811547 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/308d3703-6cc2-4cc6-a2e4-187390a83535-socket-dir\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.812094 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.811572 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl8f4\" (UniqueName: \"kubernetes.io/projected/308d3703-6cc2-4cc6-a2e4-187390a83535-kube-api-access-jl8f4\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.836591 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.836531 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:21:54 +0000 UTC" deadline="2027-10-31 15:33:48.637863792 +0000 UTC"
Apr 20 14:26:55.836591 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.836590 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13417h6m52.801278411s"
Apr 20 14:26:55.900884 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.900834 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 14:26:55.912617 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.912579 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbs6n\" (UniqueName: \"kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n\") pod \"network-check-target-fgfxx\" (UID: \"35f35192-c0aa-4a62-b812-0f0be61d0f8e\") " pod="openshift-network-diagnostics/network-check-target-fgfxx"
Apr 20 14:26:55.912804 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.912632 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-ovnkube-config\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.912804 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.912660 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-os-release\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.912804 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.912681 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-multus-socket-dir-parent\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.912804 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.912703 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/236542ea-fda7-4b96-ae9e-dd685e15e5ef-hosts-file\") pod \"node-resolver-mw9b9\" (UID: \"236542ea-fda7-4b96-ae9e-dd685e15e5ef\") " pod="openshift-dns/node-resolver-mw9b9"
Apr 20 14:26:55.912804 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.912783 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-host-run-k8s-cni-cncf-io\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.913040 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.912825 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/236542ea-fda7-4b96-ae9e-dd685e15e5ef-hosts-file\") pod \"node-resolver-mw9b9\" (UID: \"236542ea-fda7-4b96-ae9e-dd685e15e5ef\") " pod="openshift-dns/node-resolver-mw9b9"
Apr 20 14:26:55.913040 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.912826 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-host-run-k8s-cni-cncf-io\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.913040 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.912785 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-os-release\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.913040 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.912827 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-host-var-lib-cni-multus\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.913040 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.912792 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-multus-socket-dir-parent\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.913040 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.912873 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c2fe8871-85d4-447d-a6ca-09895bb1faae-konnectivity-ca\") pod \"konnectivity-agent-xhmdf\" (UID: \"c2fe8871-85d4-447d-a6ca-09895bb1faae\") " pod="kube-system/konnectivity-agent-xhmdf"
Apr 20 14:26:55.913040 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.912871 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-host-var-lib-cni-multus\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.913040 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.912899 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-cni-bin\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.913040 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.912924 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rm2v\" (UniqueName: \"kubernetes.io/projected/175a3efc-4aa3-4f7d-ac63-bb40b7cd457b-kube-api-access-9rm2v\") pod \"node-ca-2rnmj\" (UID: \"175a3efc-4aa3-4f7d-ac63-bb40b7cd457b\") " pod="openshift-image-registry/node-ca-2rnmj"
Apr 20 14:26:55.913040 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.912949 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-systemd\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.913040 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.912969 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-cni-bin\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.913040 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.912977 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-tmp\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.913040 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913037 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-systemd\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.913664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913088 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-host-run-multus-certs\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.913664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913114 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/236542ea-fda7-4b96-ae9e-dd685e15e5ef-tmp-dir\") pod \"node-resolver-mw9b9\" (UID: \"236542ea-fda7-4b96-ae9e-dd685e15e5ef\") " pod="openshift-dns/node-resolver-mw9b9"
Apr 20 14:26:55.913664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913138 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-ovnkube-script-lib\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.913664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913160 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/175a3efc-4aa3-4f7d-ac63-bb40b7cd457b-host\") pod \"node-ca-2rnmj\" (UID: \"175a3efc-4aa3-4f7d-ac63-bb40b7cd457b\") " pod="openshift-image-registry/node-ca-2rnmj"
Apr 20 14:26:55.913664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913186 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cznq\" (UniqueName: \"kubernetes.io/projected/3c30eab0-ce99-4717-9b34-99ab3a10543c-kube-api-access-6cznq\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.913664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913213 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-run-ovn\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.913664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913238 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-env-overrides\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.913664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913261 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzn2l\" (UniqueName: \"kubernetes.io/projected/db79a290-5377-45f9-bb87-89588231d8a7-kube-api-access-wzn2l\") pod \"network-metrics-daemon-qsqks\" (UID: \"db79a290-5377-45f9-bb87-89588231d8a7\") " pod="openshift-multus/network-metrics-daemon-qsqks"
Apr 20 14:26:55.913664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913284 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-hostroot\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.913664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913312 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c30eab0-ce99-4717-9b34-99ab3a10543c-os-release\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.913664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913291 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 20 14:26:55.913664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913337 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c30eab0-ce99-4717-9b34-99ab3a10543c-cni-binary-copy\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.913664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913361 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-modprobe-d\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.913664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913385 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjlfd\" (UniqueName: \"kubernetes.io/projected/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-kube-api-access-tjlfd\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.913664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913406 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-ovnkube-config\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.913664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913450 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/236542ea-fda7-4b96-ae9e-dd685e15e5ef-tmp-dir\") pod \"node-resolver-mw9b9\" (UID: \"236542ea-fda7-4b96-ae9e-dd685e15e5ef\") " pod="openshift-dns/node-resolver-mw9b9"
Apr 20 14:26:55.913664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913408 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-multus-cni-dir\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.913664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913524 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-multus-conf-dir\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.914465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913526 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/175a3efc-4aa3-4f7d-ac63-bb40b7cd457b-host\") pod \"node-ca-2rnmj\" (UID: \"175a3efc-4aa3-4f7d-ac63-bb40b7cd457b\") " pod="openshift-image-registry/node-ca-2rnmj"
Apr 20 14:26:55.914465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913553 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-etc-openvswitch\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.914465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913552 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c2fe8871-85d4-447d-a6ca-09895bb1faae-konnectivity-ca\") pod \"konnectivity-agent-xhmdf\" (UID: \"c2fe8871-85d4-447d-a6ca-09895bb1faae\") " pod="kube-system/konnectivity-agent-xhmdf"
Apr 20 14:26:55.914465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913585 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c65ed945-ecba-4055-a1e4-c1ab80fb47d8-iptables-alerter-script\") pod \"iptables-alerter-2rbh6\" (UID: \"c65ed945-ecba-4055-a1e4-c1ab80fb47d8\") " pod="openshift-network-operator/iptables-alerter-2rbh6"
Apr 20 14:26:55.914465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913588 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-run-ovn\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.914465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913466 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-multus-cni-dir\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.914465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913643 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-hostroot\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.914465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913654 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-multus-conf-dir\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.914465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913538 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-host-run-multus-certs\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.914465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913698 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-host-var-lib-kubelet\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.914465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913716 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-ovnkube-script-lib\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.914465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913723 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-cni-netd\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.914465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913752 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c65ed945-ecba-4055-a1e4-c1ab80fb47d8-host-slash\") pod \"iptables-alerter-2rbh6\" (UID: \"c65ed945-ecba-4055-a1e4-c1ab80fb47d8\") " pod="openshift-network-operator/iptables-alerter-2rbh6"
Apr 20 14:26:55.914465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913761 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-host-var-lib-kubelet\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.914465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913779 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-sys\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.914465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913783 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-modprobe-d\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.914465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913820 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c30eab0-ce99-4717-9b34-99ab3a10543c-os-release\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.914465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913822 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-cni-netd\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.915308 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913829 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-sys\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.915308 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913854 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-system-cni-dir\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.915308 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913862 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-etc-openvswitch\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.915308 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913876 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twxvc\" (UniqueName: \"kubernetes.io/projected/236542ea-fda7-4b96-ae9e-dd685e15e5ef-kube-api-access-twxvc\") pod \"node-resolver-mw9b9\" (UID: \"236542ea-fda7-4b96-ae9e-dd685e15e5ef\") " pod="openshift-dns/node-resolver-mw9b9"
Apr 20 14:26:55.915308 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913893 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-kubelet\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.915308 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913921 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-system-cni-dir\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.915308 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913935 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-systemd-units\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.915308 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913948 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-kubelet\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.915308 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913955 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-env-overrides\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.915308 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913960 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-run-openvswitch\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.915308 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913989 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-systemd-units\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.915308 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.913991 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-run-openvswitch\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.915308 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914014 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-ovn-node-metrics-cert\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.915308 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914046 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-run\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.915308 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914071 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-host\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.915308 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914083 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c30eab0-ce99-4717-9b34-99ab3a10543c-cni-binary-copy\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.915308 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914096 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-etc-kubernetes\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.915308 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914103 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-run\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.916112 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914135 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-etc-kubernetes\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.916112 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914135 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-node-log\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.916112 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914146 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-host\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.916112 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914168 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-node-log\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.916112 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914170 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f5hw\" (UniqueName: \"kubernetes.io/projected/c65ed945-ecba-4055-a1e4-c1ab80fb47d8-kube-api-access-6f5hw\") pod \"iptables-alerter-2rbh6\" (UID: \"c65ed945-ecba-4055-a1e4-c1ab80fb47d8\") " pod="openshift-network-operator/iptables-alerter-2rbh6"
Apr 20 14:26:55.916112 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914206 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/308d3703-6cc2-4cc6-a2e4-187390a83535-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.916112 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914248 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-sysconfig\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.916112 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914250 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/308d3703-6cc2-4cc6-a2e4-187390a83535-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.916112 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914283 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-host-var-lib-cni-bin\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.916112 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914336 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-sysconfig\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.916112 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914344 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/175a3efc-4aa3-4f7d-ac63-bb40b7cd457b-serviceca\") pod \"node-ca-2rnmj\" (UID: \"175a3efc-4aa3-4f7d-ac63-bb40b7cd457b\") " pod="openshift-image-registry/node-ca-2rnmj"
Apr 20 14:26:55.916112 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914366 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-host-var-lib-cni-bin\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.916112 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914413 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/308d3703-6cc2-4cc6-a2e4-187390a83535-device-dir\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.916112 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914440 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/308d3703-6cc2-4cc6-a2e4-187390a83535-sys-fs\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.916112 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914451 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/308d3703-6cc2-4cc6-a2e4-187390a83535-device-dir\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.916112 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914466 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-sysctl-d\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.916112 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914489 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-tuned\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.916852 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914521 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/308d3703-6cc2-4cc6-a2e4-187390a83535-sys-fs\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.916852 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914530 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-cni-binary-copy\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.916852 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914556 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-multus-daemon-config\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.916852 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914581 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c30eab0-ce99-4717-9b34-99ab3a10543c-cnibin\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.916852 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914606 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-slash\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.916852 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914631 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-var-lib-openvswitch\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.916852 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914655 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-log-socket\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.916852 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914680 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c2fe8871-85d4-447d-a6ca-09895bb1faae-agent-certs\") pod \"konnectivity-agent-xhmdf\" (UID: \"c2fe8871-85d4-447d-a6ca-09895bb1faae\") " pod="kube-system/konnectivity-agent-xhmdf"
Apr 20 14:26:55.916852 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914704 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-kubernetes\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.916852 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914729 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-var-lib-kubelet\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.916852 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914754 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c30eab0-ce99-4717-9b34-99ab3a10543c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.916852 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914783 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3c30eab0-ce99-4717-9b34-99ab3a10543c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.916852 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914808 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-run-netns\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.916852 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914832 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-run-ovn-kubernetes\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.916852 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914860 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/308d3703-6cc2-4cc6-a2e4-187390a83535-registration-dir\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.916852 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914899 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/308d3703-6cc2-4cc6-a2e4-187390a83535-etc-selinux\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.916852 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914925 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-sysctl-conf\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.917682 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914952 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-host-run-netns\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.917682 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914982 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3c30eab0-ce99-4717-9b34-99ab3a10543c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.917682 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915010 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-lib-modules\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.917682 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915041 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-cnibin\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.917682 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915067 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs\") pod \"network-metrics-daemon-qsqks\" (UID: \"db79a290-5377-45f9-bb87-89588231d8a7\") " pod="openshift-multus/network-metrics-daemon-qsqks"
Apr 20 14:26:55.917682 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915094 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-run-systemd\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.917682 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915122 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.917682 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915153 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rl8vk\" (UniqueName: \"kubernetes.io/projected/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-kube-api-access-rl8vk\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.917682 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915179 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/308d3703-6cc2-4cc6-a2e4-187390a83535-socket-dir\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.917682 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915205 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jl8f4\" (UniqueName: \"kubernetes.io/projected/308d3703-6cc2-4cc6-a2e4-187390a83535-kube-api-access-jl8f4\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.917682 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915232 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cch5v\" (UniqueName: \"kubernetes.io/projected/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-kube-api-access-cch5v\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.917682 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915259 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c30eab0-ce99-4717-9b34-99ab3a10543c-system-cni-dir\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.917682 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915290 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-cni-binary-copy\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.917682 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915340 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c30eab0-ce99-4717-9b34-99ab3a10543c-system-cni-dir\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.917682 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915354 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/308d3703-6cc2-4cc6-a2e4-187390a83535-registration-dir\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.917682 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915392 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c30eab0-ce99-4717-9b34-99ab3a10543c-cnibin\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.917682 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915422 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-slash\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.918445 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.914605 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-sysctl-d\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.918445 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915434 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/308d3703-6cc2-4cc6-a2e4-187390a83535-etc-selinux\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.918445 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915463 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-var-lib-openvswitch\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.918445 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915542 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-log-socket\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.918445 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915566 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-sysctl-conf\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.918445 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915693 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/175a3efc-4aa3-4f7d-ac63-bb40b7cd457b-serviceca\") pod \"node-ca-2rnmj\" (UID: \"175a3efc-4aa3-4f7d-ac63-bb40b7cd457b\") " pod="openshift-image-registry/node-ca-2rnmj"
Apr 20 14:26:55.918445 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915702 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-multus-daemon-config\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.918445 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915723 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c30eab0-ce99-4717-9b34-99ab3a10543c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.918445 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915771 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-kubernetes\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.918445 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915770 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-host-run-netns\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.918445 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915772 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-var-lib-kubelet\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.918445 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915803 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-run-systemd\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.918445 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915836 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-cnibin\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:55.918445 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915876 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-run-netns\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.918445 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:55.915891 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:26:55.918445 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915900 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-lib-modules\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.918445 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.915919 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-run-ovn-kubernetes\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.919213 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:55.916010 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs podName:db79a290-5377-45f9-bb87-89588231d8a7 nodeName:}" failed. No retries permitted until 2026-04-20 14:26:56.415971876 +0000 UTC m=+3.136273287 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs") pod "network-metrics-daemon-qsqks" (UID: "db79a290-5377-45f9-bb87-89588231d8a7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:26:55.919213 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.916023 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/308d3703-6cc2-4cc6-a2e4-187390a83535-socket-dir\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.919213 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.916024 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.919213 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.916319 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3c30eab0-ce99-4717-9b34-99ab3a10543c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.919213 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.916352 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3c30eab0-ce99-4717-9b34-99ab3a10543c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.919213 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.916989 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-tmp\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.919213 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.917862 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-ovn-node-metrics-cert\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.919213 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.918419 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-etc-tuned\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.919213 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:55.918975 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 14:26:55.919213 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:55.918997 2580
Apr 20 14:26:55.919213 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:55.919011 2580 projected.go:194] Error preparing data for projected volume kube-api-access-jbs6n for pod openshift-network-diagnostics/network-check-target-fgfxx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:26:55.919213 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:55.919136 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n podName:35f35192-c0aa-4a62-b812-0f0be61d0f8e nodeName:}" failed. No retries permitted until 2026-04-20 14:26:56.419117849 +0000 UTC m=+3.139419273 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jbs6n" (UniqueName: "kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n") pod "network-check-target-fgfxx" (UID: "35f35192-c0aa-4a62-b812-0f0be61d0f8e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:26:55.919742 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.919433 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c2fe8871-85d4-447d-a6ca-09895bb1faae-agent-certs\") pod \"konnectivity-agent-xhmdf\" (UID: \"c2fe8871-85d4-447d-a6ca-09895bb1faae\") " pod="kube-system/konnectivity-agent-xhmdf"
Apr 20 14:26:55.925100 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.925071 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rm2v\" (UniqueName: \"kubernetes.io/projected/175a3efc-4aa3-4f7d-ac63-bb40b7cd457b-kube-api-access-9rm2v\") pod \"node-ca-2rnmj\" (UID: \"175a3efc-4aa3-4f7d-ac63-bb40b7cd457b\") " pod="openshift-image-registry/node-ca-2rnmj"
Apr 20 14:26:55.927068 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.927042 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjlfd\" (UniqueName: \"kubernetes.io/projected/d554fe2d-d4e2-4aa6-920b-8f246b30eaf5-kube-api-access-tjlfd\") pod \"tuned-5fdfv\" (UID: \"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5\") " pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:55.927997 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.927960 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twxvc\" (UniqueName: \"kubernetes.io/projected/236542ea-fda7-4b96-ae9e-dd685e15e5ef-kube-api-access-twxvc\") pod \"node-resolver-mw9b9\" (UID: \"236542ea-fda7-4b96-ae9e-dd685e15e5ef\") " pod="openshift-dns/node-resolver-mw9b9"
Apr 20 14:26:55.928820 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.928568 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl8f4\" (UniqueName: \"kubernetes.io/projected/308d3703-6cc2-4cc6-a2e4-187390a83535-kube-api-access-jl8f4\") pod \"aws-ebs-csi-driver-node-2vmqw\" (UID: \"308d3703-6cc2-4cc6-a2e4-187390a83535\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:55.928820 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.928575 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cznq\" (UniqueName: \"kubernetes.io/projected/3c30eab0-ce99-4717-9b34-99ab3a10543c-kube-api-access-6cznq\") pod \"multus-additional-cni-plugins-dmfdx\" (UID: \"3c30eab0-ce99-4717-9b34-99ab3a10543c\") " pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:55.928820 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.928768 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzn2l\" (UniqueName: \"kubernetes.io/projected/db79a290-5377-45f9-bb87-89588231d8a7-kube-api-access-wzn2l\") pod \"network-metrics-daemon-qsqks\" (UID: \"db79a290-5377-45f9-bb87-89588231d8a7\") " pod="openshift-multus/network-metrics-daemon-qsqks"
Apr 20 14:26:55.929357 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.929323 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl8vk\" (UniqueName: \"kubernetes.io/projected/b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad-kube-api-access-rl8vk\") pod \"ovnkube-node-s4vzx\" (UID: \"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:55.929833 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:55.929802 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cch5v\" (UniqueName: \"kubernetes.io/projected/5ab1d81b-18bf-4cdd-80f4-67fa99ae9490-kube-api-access-cch5v\") pod \"multus-ct824\" (UID: \"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490\") " pod="openshift-multus/multus-ct824"
Apr 20 14:26:56.016095 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.016051 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c65ed945-ecba-4055-a1e4-c1ab80fb47d8-iptables-alerter-script\") pod \"iptables-alerter-2rbh6\" (UID: \"c65ed945-ecba-4055-a1e4-c1ab80fb47d8\") " pod="openshift-network-operator/iptables-alerter-2rbh6"
Apr 20 14:26:56.016282 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.016109 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c65ed945-ecba-4055-a1e4-c1ab80fb47d8-host-slash\") pod \"iptables-alerter-2rbh6\" (UID: \"c65ed945-ecba-4055-a1e4-c1ab80fb47d8\") " pod="openshift-network-operator/iptables-alerter-2rbh6"
Apr 20 14:26:56.016282 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.016144 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6f5hw\" (UniqueName: \"kubernetes.io/projected/c65ed945-ecba-4055-a1e4-c1ab80fb47d8-kube-api-access-6f5hw\") pod \"iptables-alerter-2rbh6\" (UID: \"c65ed945-ecba-4055-a1e4-c1ab80fb47d8\") " pod="openshift-network-operator/iptables-alerter-2rbh6"
Apr 20 14:26:56.016411 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.016380 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c65ed945-ecba-4055-a1e4-c1ab80fb47d8-host-slash\") pod \"iptables-alerter-2rbh6\" (UID: \"c65ed945-ecba-4055-a1e4-c1ab80fb47d8\") " pod="openshift-network-operator/iptables-alerter-2rbh6"
Apr 20 14:26:56.016711 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.016684 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c65ed945-ecba-4055-a1e4-c1ab80fb47d8-iptables-alerter-script\") pod \"iptables-alerter-2rbh6\" (UID: \"c65ed945-ecba-4055-a1e4-c1ab80fb47d8\") " pod="openshift-network-operator/iptables-alerter-2rbh6"
Apr 20 14:26:56.025386 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.025346 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f5hw\" (UniqueName: \"kubernetes.io/projected/c65ed945-ecba-4055-a1e4-c1ab80fb47d8-kube-api-access-6f5hw\") pod \"iptables-alerter-2rbh6\" (UID: \"c65ed945-ecba-4055-a1e4-c1ab80fb47d8\") " pod="openshift-network-operator/iptables-alerter-2rbh6"
Apr 20 14:26:56.099438 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.099344 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw"
Apr 20 14:26:56.107443 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.107412 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5fdfv"
Apr 20 14:26:56.117199 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.117164 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mw9b9"
Apr 20 14:26:56.123990 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.123962 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ct824"
Apr 20 14:26:56.129685 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.129663 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xhmdf"
Apr 20 14:26:56.137221 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.137199 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx"
Apr 20 14:26:56.143880 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.143850 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2rnmj"
Apr 20 14:26:56.152455 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.152430 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dmfdx"
Apr 20 14:26:56.160084 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.160057 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2rbh6"
Need to start a new one" pod="openshift-network-operator/iptables-alerter-2rbh6" Apr 20 14:26:56.179756 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.179727 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:26:56.419271 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.419176 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs\") pod \"network-metrics-daemon-qsqks\" (UID: \"db79a290-5377-45f9-bb87-89588231d8a7\") " pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:26:56.419271 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.419227 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbs6n\" (UniqueName: \"kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n\") pod \"network-check-target-fgfxx\" (UID: \"35f35192-c0aa-4a62-b812-0f0be61d0f8e\") " pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:26:56.419516 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:56.419328 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:26:56.419516 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:56.419351 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:26:56.419516 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:56.419369 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:26:56.419516 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:56.419382 2580 projected.go:194] Error preparing data for projected volume kube-api-access-jbs6n for pod openshift-network-diagnostics/network-check-target-fgfxx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:26:56.419516 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:56.419413 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs podName:db79a290-5377-45f9-bb87-89588231d8a7 nodeName:}" failed. No retries permitted until 2026-04-20 14:26:57.419395614 +0000 UTC m=+4.139697036 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs") pod "network-metrics-daemon-qsqks" (UID: "db79a290-5377-45f9-bb87-89588231d8a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:26:56.419516 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:56.419431 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n podName:35f35192-c0aa-4a62-b812-0f0be61d0f8e nodeName:}" failed. No retries permitted until 2026-04-20 14:26:57.41942422 +0000 UTC m=+4.139725630 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jbs6n" (UniqueName: "kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n") pod "network-check-target-fgfxx" (UID: "35f35192-c0aa-4a62-b812-0f0be61d0f8e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:26:56.581510 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:56.581467 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd554fe2d_d4e2_4aa6_920b_8f246b30eaf5.slice/crio-83fa4ac14e7695c77d0548b282a54653d051f60aee255756c07e097f234d3e1a WatchSource:0}: Error finding container 83fa4ac14e7695c77d0548b282a54653d051f60aee255756c07e097f234d3e1a: Status 404 returned error can't find the container with id 83fa4ac14e7695c77d0548b282a54653d051f60aee255756c07e097f234d3e1a Apr 20 14:26:56.583088 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:56.583061 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c30eab0_ce99_4717_9b34_99ab3a10543c.slice/crio-fb5193544ee905575c74e80a7d6cb57492bada9c8576d4d3b7a310b2dab819b6 WatchSource:0}: Error finding container fb5193544ee905575c74e80a7d6cb57492bada9c8576d4d3b7a310b2dab819b6: Status 404 returned error can't find the container with id fb5193544ee905575c74e80a7d6cb57492bada9c8576d4d3b7a310b2dab819b6 Apr 20 14:26:56.584240 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:56.584215 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb206eb53_cb8b_4d6e_801e_c9b9fe10e3ad.slice/crio-4a1cc989f702956684225ecb0255ddd4dc0e206f9341bc58cb8a2839907c8b0a WatchSource:0}: Error finding container 4a1cc989f702956684225ecb0255ddd4dc0e206f9341bc58cb8a2839907c8b0a: Status 404 returned error can't find the container with id 4a1cc989f702956684225ecb0255ddd4dc0e206f9341bc58cb8a2839907c8b0a Apr 20 14:26:56.585709 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:56.585678 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod308d3703_6cc2_4cc6_a2e4_187390a83535.slice/crio-287d56e9610f27d9b1b1bd6336f8a1b95bcfd24089c030ba00dfe05e6ec44194 WatchSource:0}: Error finding container 287d56e9610f27d9b1b1bd6336f8a1b95bcfd24089c030ba00dfe05e6ec44194: Status 404 returned error can't find the container with id 287d56e9610f27d9b1b1bd6336f8a1b95bcfd24089c030ba00dfe05e6ec44194 Apr 20 14:26:56.588050 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:26:56.587821 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc65ed945_ecba_4055_a1e4_c1ab80fb47d8.slice/crio-6d7c417702b39ee872a6ea429f2993e22a6b59b9b016a5e5e2c3f27ccb211cbd WatchSource:0}: Error finding container 6d7c417702b39ee872a6ea429f2993e22a6b59b9b016a5e5e2c3f27ccb211cbd: Status 404 returned error can't find the container with id 6d7c417702b39ee872a6ea429f2993e22a6b59b9b016a5e5e2c3f27ccb211cbd Apr 20 14:26:56.653561 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.653370 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-plgss"] Apr 20 14:26:56.656561 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.656537 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:26:56.656681 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:56.656617 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-plgss" podUID="fd645412-a5e8-4419-9535-27ac65a5ee65" Apr 20 14:26:56.721538 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.721438 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fd645412-a5e8-4419-9535-27ac65a5ee65-dbus\") pod \"global-pull-secret-syncer-plgss\" (UID: \"fd645412-a5e8-4419-9535-27ac65a5ee65\") " pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:26:56.721538 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.721477 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fd645412-a5e8-4419-9535-27ac65a5ee65-kubelet-config\") pod \"global-pull-secret-syncer-plgss\" (UID: \"fd645412-a5e8-4419-9535-27ac65a5ee65\") " pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:26:56.721734 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.721557 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret\") pod \"global-pull-secret-syncer-plgss\" (UID: \"fd645412-a5e8-4419-9535-27ac65a5ee65\") " pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:26:56.821876 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.821836 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fd645412-a5e8-4419-9535-27ac65a5ee65-dbus\") pod \"global-pull-secret-syncer-plgss\" (UID: \"fd645412-a5e8-4419-9535-27ac65a5ee65\") " pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:26:56.822328 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.821896 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/fd645412-a5e8-4419-9535-27ac65a5ee65-kubelet-config\") pod \"global-pull-secret-syncer-plgss\" (UID: \"fd645412-a5e8-4419-9535-27ac65a5ee65\") " pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:26:56.822328 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.821924 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret\") pod \"global-pull-secret-syncer-plgss\" (UID: \"fd645412-a5e8-4419-9535-27ac65a5ee65\") " pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:26:56.822328 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.822034 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/fd645412-a5e8-4419-9535-27ac65a5ee65-dbus\") pod \"global-pull-secret-syncer-plgss\" (UID: \"fd645412-a5e8-4419-9535-27ac65a5ee65\") " pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:26:56.822328 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.822036 
Apr 20 14:26:56.822328 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:56.822122 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 14:26:56.822328 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:56.822198 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret podName:fd645412-a5e8-4419-9535-27ac65a5ee65 nodeName:}" failed. No retries permitted until 2026-04-20 14:26:57.322178286 +0000 UTC m=+4.042479696 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret") pod "global-pull-secret-syncer-plgss" (UID: "fd645412-a5e8-4419-9535-27ac65a5ee65") : object "kube-system"/"original-pull-secret" not registered
Apr 20 14:26:56.837691 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.837645 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:21:54 +0000 UTC" deadline="2027-12-08 16:23:19.75644799 +0000 UTC"
Apr 20 14:26:56.837691 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.837684 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14329h56m22.918766525s"
Apr 20 14:26:56.927326 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.927297 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fgfxx"
Apr 20 14:26:56.927458 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:56.927405 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fgfxx" podUID="35f35192-c0aa-4a62-b812-0f0be61d0f8e"
Apr 20 14:26:56.936899 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.936851 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mw9b9" event={"ID":"236542ea-fda7-4b96-ae9e-dd685e15e5ef","Type":"ContainerStarted","Data":"61b27f386df0aa5a281fb3906f4b9b0b2dfaaf982b3f1b6ad4d3062235c61431"}
Apr 20 14:26:56.937905 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.937880 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2rnmj" event={"ID":"175a3efc-4aa3-4f7d-ac63-bb40b7cd457b","Type":"ContainerStarted","Data":"385520e428561e8d1231d1138fd053ac503b44d5b76bd7bf46a4058ac1388748"}
Apr 20 14:26:56.941031 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.941005 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2rbh6" event={"ID":"c65ed945-ecba-4055-a1e4-c1ab80fb47d8","Type":"ContainerStarted","Data":"6d7c417702b39ee872a6ea429f2993e22a6b59b9b016a5e5e2c3f27ccb211cbd"}
Apr 20 14:26:56.943666 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.943638 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw" event={"ID":"308d3703-6cc2-4cc6-a2e4-187390a83535","Type":"ContainerStarted","Data":"287d56e9610f27d9b1b1bd6336f8a1b95bcfd24089c030ba00dfe05e6ec44194"}
Apr 20 14:26:56.945098 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.945074 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" event={"ID":"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad","Type":"ContainerStarted","Data":"4a1cc989f702956684225ecb0255ddd4dc0e206f9341bc58cb8a2839907c8b0a"}
Apr 20 14:26:56.946273 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.946249 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmfdx" event={"ID":"3c30eab0-ce99-4717-9b34-99ab3a10543c","Type":"ContainerStarted","Data":"fb5193544ee905575c74e80a7d6cb57492bada9c8576d4d3b7a310b2dab819b6"}
Apr 20 14:26:56.947347 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.947314 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xhmdf" event={"ID":"c2fe8871-85d4-447d-a6ca-09895bb1faae","Type":"ContainerStarted","Data":"1b9e78c9231b838097e7a98896bc26242c0e4d4a36ed71dfb674c205f4d3e7b6"}
Apr 20 14:26:56.948354 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.948330 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ct824" event={"ID":"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490","Type":"ContainerStarted","Data":"26896ab44e631af004ace5ec0d074dc2880442e9f5b2f680eecc4aea306eeddb"}
Apr 20 14:26:56.949703 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.949658 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5fdfv" event={"ID":"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5","Type":"ContainerStarted","Data":"83fa4ac14e7695c77d0548b282a54653d051f60aee255756c07e097f234d3e1a"}
Apr 20 14:26:56.953684 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:56.952644 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal" event={"ID":"b77815b0bb4108444a9fb656311ac214","Type":"ContainerStarted","Data":"ef0e048295c74a1cb0e692cf1bf448ab81df796bfb849b19af8e6134bc3fcdab"}
Apr 20 14:26:57.327618 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:57.327573 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret\") pod \"global-pull-secret-syncer-plgss\" (UID: \"fd645412-a5e8-4419-9535-27ac65a5ee65\") " pod="kube-system/global-pull-secret-syncer-plgss"
Apr 20 14:26:57.327817 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:57.327743 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 14:26:57.327817 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:57.327808 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret podName:fd645412-a5e8-4419-9535-27ac65a5ee65 nodeName:}" failed. No retries permitted until 2026-04-20 14:26:58.327788048 +0000 UTC m=+5.048089480 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret") pod "global-pull-secret-syncer-plgss" (UID: "fd645412-a5e8-4419-9535-27ac65a5ee65") : object "kube-system"/"original-pull-secret" not registered
Apr 20 14:26:57.428140 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:57.428101 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs\") pod \"network-metrics-daemon-qsqks\" (UID: \"db79a290-5377-45f9-bb87-89588231d8a7\") " pod="openshift-multus/network-metrics-daemon-qsqks"
Apr 20 14:26:57.428318 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:57.428163 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbs6n\" (UniqueName: \"kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n\") pod \"network-check-target-fgfxx\" (UID: \"35f35192-c0aa-4a62-b812-0f0be61d0f8e\") " pod="openshift-network-diagnostics/network-check-target-fgfxx"
Apr 20 14:26:57.428318 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:57.428311 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 14:26:57.428429 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:57.428330 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 14:26:57.428429 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:57.428344 2580 projected.go:194] Error preparing data for projected volume kube-api-access-jbs6n for pod openshift-network-diagnostics/network-check-target-fgfxx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:26:57.428429 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:57.428406 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n podName:35f35192-c0aa-4a62-b812-0f0be61d0f8e nodeName:}" failed. No retries permitted until 2026-04-20 14:26:59.428386473 +0000 UTC m=+6.148687900 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jbs6n" (UniqueName: "kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n") pod "network-check-target-fgfxx" (UID: "35f35192-c0aa-4a62-b812-0f0be61d0f8e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Error: MountVolume.SetUp failed for volume "kube-api-access-jbs6n" (UniqueName: "kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n") pod "network-check-target-fgfxx" (UID: "35f35192-c0aa-4a62-b812-0f0be61d0f8e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:26:57.428844 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:57.428824 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:26:57.428935 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:57.428881 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs podName:db79a290-5377-45f9-bb87-89588231d8a7 nodeName:}" failed. No retries permitted until 2026-04-20 14:26:59.42886459 +0000 UTC m=+6.149166016 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs") pod "network-metrics-daemon-qsqks" (UID: "db79a290-5377-45f9-bb87-89588231d8a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:26:57.930243 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:57.930165 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:26:57.930696 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:57.930305 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qsqks" podUID="db79a290-5377-45f9-bb87-89588231d8a7" Apr 20 14:26:57.976766 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:57.976727 2580 generic.go:358] "Generic (PLEG): container finished" podID="018517f7eeb11abdb65728aa3a7d830b" containerID="6a2d16436c30a3e2694e4ea483c8285e96f997edea9c65a28f97c74c600bae0d" exitCode=0 Apr 20 14:26:57.978101 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:57.977258 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" event={"ID":"018517f7eeb11abdb65728aa3a7d830b","Type":"ContainerDied","Data":"6a2d16436c30a3e2694e4ea483c8285e96f997edea9c65a28f97c74c600bae0d"} Apr 20 14:26:57.994916 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:57.994854 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-166.ec2.internal" podStartSLOduration=2.9948353709999997 podStartE2EDuration="2.994835371s" podCreationTimestamp="2026-04-20 14:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:26:56.967906966 +0000 UTC m=+3.688208428" watchObservedRunningTime="2026-04-20 14:26:57.994835371 +0000 UTC m=+4.715136804" Apr 20 14:26:58.340451 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:58.340115 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret\") pod \"global-pull-secret-syncer-plgss\" (UID: \"fd645412-a5e8-4419-9535-27ac65a5ee65\") " pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:26:58.340451 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:58.340281 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:26:58.340451 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:58.340340 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret podName:fd645412-a5e8-4419-9535-27ac65a5ee65 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:00.340321476 +0000 UTC m=+7.060622889 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret") pod "global-pull-secret-syncer-plgss" (UID: "fd645412-a5e8-4419-9535-27ac65a5ee65") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:26:58.927477 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:58.927440 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:26:58.927679 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:58.927594 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fgfxx" podUID="35f35192-c0aa-4a62-b812-0f0be61d0f8e" Apr 20 14:26:58.927744 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:58.927729 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:26:58.927850 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:58.927816 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-plgss" podUID="fd645412-a5e8-4419-9535-27ac65a5ee65" Apr 20 14:26:58.990940 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:58.990902 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" event={"ID":"018517f7eeb11abdb65728aa3a7d830b","Type":"ContainerStarted","Data":"9c388a01807c179dd9d4345f433bdcde04378d4f65d398d49c212101233dd58f"} Apr 20 14:26:59.450384 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:59.449549 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs\") pod \"network-metrics-daemon-qsqks\" (UID: \"db79a290-5377-45f9-bb87-89588231d8a7\") " pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:26:59.450384 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:59.449601 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbs6n\" (UniqueName: \"kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n\") pod \"network-check-target-fgfxx\" (UID: \"35f35192-c0aa-4a62-b812-0f0be61d0f8e\") " pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:26:59.450384 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:59.449756 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:26:59.450384 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:59.449778 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:26:59.450384 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:59.449790 2580 projected.go:194] Error preparing data for projected volume kube-api-access-jbs6n for pod openshift-network-diagnostics/network-check-target-fgfxx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:26:59.450384 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:59.449849 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n podName:35f35192-c0aa-4a62-b812-0f0be61d0f8e nodeName:}" failed. No retries permitted until 2026-04-20 14:27:03.449830471 +0000 UTC m=+10.170131896 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jbs6n" (UniqueName: "kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n") pod "network-check-target-fgfxx" (UID: "35f35192-c0aa-4a62-b812-0f0be61d0f8e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:26:59.450384 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:59.450276 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:26:59.450384 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:59.450329 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs podName:db79a290-5377-45f9-bb87-89588231d8a7 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:03.450312849 +0000 UTC m=+10.170614273 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs") pod "network-metrics-daemon-qsqks" (UID: "db79a290-5377-45f9-bb87-89588231d8a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:26:59.930621 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:26:59.930537 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:26:59.930793 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:26:59.930680 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qsqks" podUID="db79a290-5377-45f9-bb87-89588231d8a7" Apr 20 14:27:00.358287 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:00.357633 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret\") pod \"global-pull-secret-syncer-plgss\" (UID: \"fd645412-a5e8-4419-9535-27ac65a5ee65\") " pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:27:00.358287 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:00.357834 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:27:00.358287 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:00.357899 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret podName:fd645412-a5e8-4419-9535-27ac65a5ee65 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:04.357881609 +0000 UTC m=+11.078183024 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret") pod "global-pull-secret-syncer-plgss" (UID: "fd645412-a5e8-4419-9535-27ac65a5ee65") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:27:00.927904 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:00.927861 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:27:00.928098 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:00.927860 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:27:00.928098 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:00.928039 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-plgss" podUID="fd645412-a5e8-4419-9535-27ac65a5ee65" Apr 20 14:27:00.928230 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:00.928103 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fgfxx" podUID="35f35192-c0aa-4a62-b812-0f0be61d0f8e" Apr 20 14:27:01.927858 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:01.927307 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:27:01.927858 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:01.927459 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qsqks" podUID="db79a290-5377-45f9-bb87-89588231d8a7" Apr 20 14:27:02.928058 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:02.928021 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:27:02.928562 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:02.928079 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:27:02.928562 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:02.928165 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-plgss" podUID="fd645412-a5e8-4419-9535-27ac65a5ee65" Apr 20 14:27:02.928562 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:02.928283 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fgfxx" podUID="35f35192-c0aa-4a62-b812-0f0be61d0f8e" Apr 20 14:27:03.486017 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:03.485981 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs\") pod \"network-metrics-daemon-qsqks\" (UID: \"db79a290-5377-45f9-bb87-89588231d8a7\") " pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:27:03.486207 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:03.486030 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbs6n\" (UniqueName: \"kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n\") pod \"network-check-target-fgfxx\" (UID: \"35f35192-c0aa-4a62-b812-0f0be61d0f8e\") " pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:27:03.486207 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:03.486137 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:03.486207 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:03.486149 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:27:03.486207 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:03.486164 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:27:03.486207 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:03.486176 2580 projected.go:194] Error preparing data for projected volume kube-api-access-jbs6n for pod openshift-network-diagnostics/network-check-target-fgfxx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:03.486207 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:03.486197 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs podName:db79a290-5377-45f9-bb87-89588231d8a7 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:11.486178557 +0000 UTC m=+18.206479985 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs") pod "network-metrics-daemon-qsqks" (UID: "db79a290-5377-45f9-bb87-89588231d8a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:03.486575 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:03.486217 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n podName:35f35192-c0aa-4a62-b812-0f0be61d0f8e nodeName:}" failed. No retries permitted until 2026-04-20 14:27:11.48620611 +0000 UTC m=+18.206507519 (durationBeforeRetry 8s). 
Apr 20 14:27:03.929705 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:03.928766 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qsqks"
Apr 20 14:27:03.929705 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:03.928908 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qsqks" podUID="db79a290-5377-45f9-bb87-89588231d8a7"
Apr 20 14:27:04.393394 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:04.393352 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret\") pod \"global-pull-secret-syncer-plgss\" (UID: \"fd645412-a5e8-4419-9535-27ac65a5ee65\") " pod="kube-system/global-pull-secret-syncer-plgss"
Apr 20 14:27:04.393630 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:04.393535 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 14:27:04.393630 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:04.393619 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret podName:fd645412-a5e8-4419-9535-27ac65a5ee65 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:12.393598436 +0000 UTC m=+19.113899848 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret") pod "global-pull-secret-syncer-plgss" (UID: "fd645412-a5e8-4419-9535-27ac65a5ee65") : object "kube-system"/"original-pull-secret" not registered
Apr 20 14:27:04.927584 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:04.927547 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fgfxx"
Apr 20 14:27:04.927770 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:04.927547 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-plgss"
Apr 20 14:27:04.927770 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:04.927691 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fgfxx" podUID="35f35192-c0aa-4a62-b812-0f0be61d0f8e"
Apr 20 14:27:04.927770 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:04.927746 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-plgss" podUID="fd645412-a5e8-4419-9535-27ac65a5ee65"
Apr 20 14:27:05.927980 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:05.927904 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qsqks"
Apr 20 14:27:05.928409 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:05.928055 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qsqks" podUID="db79a290-5377-45f9-bb87-89588231d8a7"
Apr 20 14:27:06.927331 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:06.927291 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fgfxx"
Apr 20 14:27:06.927544 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:06.927291 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-plgss"
Apr 20 14:27:06.927544 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:06.927408 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fgfxx" podUID="35f35192-c0aa-4a62-b812-0f0be61d0f8e"
Apr 20 14:27:06.927544 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:06.927514 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-plgss" podUID="fd645412-a5e8-4419-9535-27ac65a5ee65"
Apr 20 14:27:07.927607 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:07.927559 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qsqks"
Apr 20 14:27:07.928096 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:07.927707 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qsqks" podUID="db79a290-5377-45f9-bb87-89588231d8a7"
Apr 20 14:27:08.927257 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:08.927218 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fgfxx"
Apr 20 14:27:08.927459 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:08.927222 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-plgss"
Apr 20 14:27:08.927459 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:08.927342 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fgfxx" podUID="35f35192-c0aa-4a62-b812-0f0be61d0f8e"
Apr 20 14:27:08.927459 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:08.927426 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-plgss" podUID="fd645412-a5e8-4419-9535-27ac65a5ee65"
Apr 20 14:27:09.928207 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:09.928170 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qsqks"
Apr 20 14:27:09.928666 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:09.928362 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qsqks" podUID="db79a290-5377-45f9-bb87-89588231d8a7"
Apr 20 14:27:10.927546 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:10.927511 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-plgss"
Apr 20 14:27:10.927748 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:10.927512 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fgfxx"
Apr 20 14:27:10.927748 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:10.927646 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-plgss" podUID="fd645412-a5e8-4419-9535-27ac65a5ee65"
Apr 20 14:27:10.927748 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:10.927718 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-fgfxx" podUID="35f35192-c0aa-4a62-b812-0f0be61d0f8e" Apr 20 14:27:11.545402 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:11.545358 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs\") pod \"network-metrics-daemon-qsqks\" (UID: \"db79a290-5377-45f9-bb87-89588231d8a7\") " pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:27:11.545402 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:11.545406 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbs6n\" (UniqueName: \"kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n\") pod \"network-check-target-fgfxx\" (UID: \"35f35192-c0aa-4a62-b812-0f0be61d0f8e\") " pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:27:11.545942 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:11.545583 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:11.545942 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:11.545615 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:27:11.545942 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:11.545635 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:27:11.545942 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:11.545646 2580 projected.go:194] Error preparing data for projected volume kube-api-access-jbs6n for pod openshift-network-diagnostics/network-check-target-fgfxx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:11.545942 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:11.545664 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs podName:db79a290-5377-45f9-bb87-89588231d8a7 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:27.545639317 +0000 UTC m=+34.265940738 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs") pod "network-metrics-daemon-qsqks" (UID: "db79a290-5377-45f9-bb87-89588231d8a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:11.545942 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:11.545701 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n podName:35f35192-c0aa-4a62-b812-0f0be61d0f8e nodeName:}" failed. No retries permitted until 2026-04-20 14:27:27.545688083 +0000 UTC m=+34.265989492 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jbs6n" (UniqueName: "kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n") pod "network-check-target-fgfxx" (UID: "35f35192-c0aa-4a62-b812-0f0be61d0f8e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:11.927556 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:11.927515 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:27:11.927731 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:11.927691 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qsqks" podUID="db79a290-5377-45f9-bb87-89588231d8a7" Apr 20 14:27:12.452548 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:12.452512 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret\") pod \"global-pull-secret-syncer-plgss\" (UID: \"fd645412-a5e8-4419-9535-27ac65a5ee65\") " pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:27:12.452715 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:12.452632 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:27:12.452715 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:12.452691 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret podName:fd645412-a5e8-4419-9535-27ac65a5ee65 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:28.452676604 +0000 UTC m=+35.172978014 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret") pod "global-pull-secret-syncer-plgss" (UID: "fd645412-a5e8-4419-9535-27ac65a5ee65") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:27:12.928042 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:12.928008 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:27:12.928459 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:12.928008 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:27:12.928459 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:12.928146 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
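Note: the mount failures above are retried with a growing delay: durationBeforeRetry is 8s when the operations first fail at 14:27:03 and 16s when the same operations fail again at 14:27:11, consistent with exponential backoff on repeated MountVolume failures. A minimal sketch of that doubling policy follows; the initial delay, cap, and type names are assumptions for illustration, not the kubelet's nestedpendingoperations implementation.

    // backoff.go: sketch of a doubling retry delay like the one visible in
    // the durationBeforeRetry values above (8s, then 16s).
    package main

    import (
        "fmt"
        "time"
    )

    type expBackoff struct {
        delay time.Duration // wait before the next retry
        max   time.Duration // upper bound on the delay
    }

    // next returns the current delay and doubles it for the next failure.
    func (b *expBackoff) next() time.Duration {
        d := b.delay
        if b.delay *= 2; b.delay > b.max {
            b.delay = b.max
        }
        return d
    }

    func main() {
        // assumed initial delay and cap, chosen to match the log above
        b := expBackoff{delay: 8 * time.Second, max: 2 * time.Minute}
        for i := 1; i <= 4; i++ {
            fmt.Printf("retry %d after %v\n", i, b.next()) // 8s, 16s, 32s, 1m4s
        }
    }

Under this assumed policy the retry after the 14:27:11 failure is scheduled 16s out, which matches the "No retries permitted until ... 14:27:27" timestamps above.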
pod="kube-system/global-pull-secret-syncer-plgss" podUID="fd645412-a5e8-4419-9535-27ac65a5ee65" Apr 20 14:27:12.928459 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:12.928219 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fgfxx" podUID="35f35192-c0aa-4a62-b812-0f0be61d0f8e" Apr 20 14:27:13.928929 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:13.928709 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:27:13.929339 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:13.929063 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qsqks" podUID="db79a290-5377-45f9-bb87-89588231d8a7" Apr 20 14:27:14.015979 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:14.015941 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mw9b9" event={"ID":"236542ea-fda7-4b96-ae9e-dd685e15e5ef","Type":"ContainerStarted","Data":"0fa0df931dfc42a96da7fcb6c4c24b1acad499a3c8a03e5d9103d058d3da455e"} Apr 20 14:27:14.017366 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:14.017337 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2rnmj" event={"ID":"175a3efc-4aa3-4f7d-ac63-bb40b7cd457b","Type":"ContainerStarted","Data":"8aa023aa0d762b08217a47b77a2fd12bf4be5bba2144e45e3e253609d6ebd443"} Apr 20 14:27:14.018638 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:14.018609 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw" event={"ID":"308d3703-6cc2-4cc6-a2e4-187390a83535","Type":"ContainerStarted","Data":"6014c7010daa2d8df0991af4a63d67d836340408ab3db71ed0a9431379724871"} Apr 20 14:27:14.019902 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:14.019878 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" event={"ID":"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad","Type":"ContainerStarted","Data":"6cb9444df7d2675b0a4c9b91da0640d07da277c70eac31b1ed8c4b4f4b825dea"} Apr 20 14:27:14.021144 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:14.021117 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmfdx" event={"ID":"3c30eab0-ce99-4717-9b34-99ab3a10543c","Type":"ContainerStarted","Data":"11fe1292f9532bb5a045a532962c4e6bc56a21b6e5c21d98a1403bab69f47672"} Apr 20 14:27:14.022343 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:14.022325 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xhmdf" event={"ID":"c2fe8871-85d4-447d-a6ca-09895bb1faae","Type":"ContainerStarted","Data":"6c9b2e86b88849a413632f8e37e1402377cebf8eae44bdf5caddd1dbb4604ec8"} Apr 20 14:27:14.025092 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:14.025052 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ct824" 
event={"ID":"5ab1d81b-18bf-4cdd-80f4-67fa99ae9490","Type":"ContainerStarted","Data":"05bbfda69085c1e75c2b7e25f9d7d9dd03753ad70425a4ac57dec5fdfd3efaeb"} Apr 20 14:27:14.026244 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:14.026224 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5fdfv" event={"ID":"d554fe2d-d4e2-4aa6-920b-8f246b30eaf5","Type":"ContainerStarted","Data":"7cb58871af50ce765e4e83bfc2edbcecf362b056a9250e6cd17e8264a98f745e"} Apr 20 14:27:14.030740 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:14.030706 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-166.ec2.internal" podStartSLOduration=19.030672983 podStartE2EDuration="19.030672983s" podCreationTimestamp="2026-04-20 14:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:26:59.007388072 +0000 UTC m=+5.727689507" watchObservedRunningTime="2026-04-20 14:27:14.030672983 +0000 UTC m=+20.750974398" Apr 20 14:27:14.031405 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:14.031376 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mw9b9" podStartSLOduration=3.010422059 podStartE2EDuration="20.03136703s" podCreationTimestamp="2026-04-20 14:26:54 +0000 UTC" firstStartedPulling="2026-04-20 14:26:56.595076233 +0000 UTC m=+3.315377651" lastFinishedPulling="2026-04-20 14:27:13.616021201 +0000 UTC m=+20.336322622" observedRunningTime="2026-04-20 14:27:14.030819468 +0000 UTC m=+20.751120925" watchObservedRunningTime="2026-04-20 14:27:14.03136703 +0000 UTC m=+20.751668461" Apr 20 14:27:14.087222 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:14.087111 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-5fdfv" podStartSLOduration=3.298910995 podStartE2EDuration="20.087091807s" podCreationTimestamp="2026-04-20 14:26:54 +0000 UTC" firstStartedPulling="2026-04-20 14:26:56.583136706 +0000 UTC m=+3.303438116" lastFinishedPulling="2026-04-20 14:27:13.371317518 +0000 UTC m=+20.091618928" observedRunningTime="2026-04-20 14:27:14.056899332 +0000 UTC m=+20.777200763" watchObservedRunningTime="2026-04-20 14:27:14.087091807 +0000 UTC m=+20.807393239" Apr 20 14:27:14.087369 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:14.087325 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ct824" podStartSLOduration=3.049295282 podStartE2EDuration="20.087317827s" podCreationTimestamp="2026-04-20 14:26:54 +0000 UTC" firstStartedPulling="2026-04-20 14:26:56.593138827 +0000 UTC m=+3.313440240" lastFinishedPulling="2026-04-20 14:27:13.631161373 +0000 UTC m=+20.351462785" observedRunningTime="2026-04-20 14:27:14.086762006 +0000 UTC m=+20.807063439" watchObservedRunningTime="2026-04-20 14:27:14.087317827 +0000 UTC m=+20.807619260" Apr 20 14:27:14.146648 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:14.146534 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2rnmj" podStartSLOduration=3.3673862 podStartE2EDuration="20.146516026s" podCreationTimestamp="2026-04-20 14:26:54 +0000 UTC" firstStartedPulling="2026-04-20 14:26:56.592174029 +0000 UTC m=+3.312475454" lastFinishedPulling="2026-04-20 14:27:13.371303851 +0000 UTC m=+20.091605280" observedRunningTime="2026-04-20 
14:27:14.121378395 +0000 UTC m=+20.841679828" watchObservedRunningTime="2026-04-20 14:27:14.146516026 +0000 UTC m=+20.866817458" Apr 20 14:27:14.146788 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:14.146764 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xhmdf" podStartSLOduration=3.128466919 podStartE2EDuration="20.146757143s" podCreationTimestamp="2026-04-20 14:26:54 +0000 UTC" firstStartedPulling="2026-04-20 14:26:56.595449545 +0000 UTC m=+3.315750956" lastFinishedPulling="2026-04-20 14:27:13.613739765 +0000 UTC m=+20.334041180" observedRunningTime="2026-04-20 14:27:14.146191013 +0000 UTC m=+20.866492472" watchObservedRunningTime="2026-04-20 14:27:14.146757143 +0000 UTC m=+20.867058574" Apr 20 14:27:14.927818 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:14.927559 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:27:14.927957 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:14.927559 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:27:14.927957 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:14.927838 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fgfxx" podUID="35f35192-c0aa-4a62-b812-0f0be61d0f8e" Apr 20 14:27:14.927957 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:14.927919 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
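Note: in the pod_startup_latency_tracker entries above, podStartSLOduration appears to equal podStartE2EDuration minus the image-pull window (lastFinishedPulling - firstStartedPulling); this relationship is inferred from the numbers in this log, not from kubelet source. A small Go check against the node-resolver-mw9b9 entry:

    // sloduration.go: verifies podStartSLOduration ~ podStartE2EDuration -
    // (lastFinishedPulling - firstStartedPulling) using timestamps copied
    // from the node-resolver-mw9b9 entry above.
    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        first, _ := time.Parse(layout, "2026-04-20 14:26:56.595076233 +0000 UTC")
        last, _ := time.Parse(layout, "2026-04-20 14:27:13.616021201 +0000 UTC")
        e2e := 20031367030 * time.Nanosecond // podStartE2EDuration="20.03136703s"
        fmt.Println(e2e - last.Sub(first))   // 3.010422062s, matching podStartSLOduration=3.010422059 up to rounding
    }

The kube-rbac-proxy-crio entry is the degenerate case: its pull timestamps are the zero value (0001-01-01), so nothing is subtracted and the SLO and E2E durations coincide at 19.030672983s.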
pod="kube-system/global-pull-secret-syncer-plgss" podUID="fd645412-a5e8-4419-9535-27ac65a5ee65" Apr 20 14:27:15.029543 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:15.029493 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2rbh6" event={"ID":"c65ed945-ecba-4055-a1e4-c1ab80fb47d8","Type":"ContainerStarted","Data":"384feec9766091acf7195bac8a355edc2d4cc174260f59f5eb4feea8eefe2eb3"} Apr 20 14:27:15.031789 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:15.031764 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s4vzx_b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad/ovn-acl-logging/0.log" Apr 20 14:27:15.032056 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:15.032035 2580 generic.go:358] "Generic (PLEG): container finished" podID="b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad" containerID="0728c60c5e4742a908dc9c2e777ffaf26fc0bd0c2c68cf983930ddf93ad0594e" exitCode=1 Apr 20 14:27:15.032106 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:15.032094 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" event={"ID":"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad","Type":"ContainerDied","Data":"0728c60c5e4742a908dc9c2e777ffaf26fc0bd0c2c68cf983930ddf93ad0594e"} Apr 20 14:27:15.032150 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:15.032114 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" event={"ID":"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad","Type":"ContainerStarted","Data":"477feea7494e63c9d3c1a82db7c015f426c236281a0172be922227a6cb82ece9"} Apr 20 14:27:15.032150 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:15.032123 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" event={"ID":"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad","Type":"ContainerStarted","Data":"81d051d35e77e8c19bdde04e89a413e6ae842d55438c73cb5662ab9bb434d985"} Apr 20 14:27:15.032150 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:15.032131 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" event={"ID":"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad","Type":"ContainerStarted","Data":"4e5e66b5496278d4450f5b336f80ac0f9f010e67f0f2b3460ee78502f5ec2635"} Apr 20 14:27:15.032150 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:15.032142 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" event={"ID":"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad","Type":"ContainerStarted","Data":"251bc2f5e26e4ac2529bb25b68a3516167c2793a539bda1d3b52b2fc2f3404ad"} Apr 20 14:27:15.033247 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:15.033221 2580 generic.go:358] "Generic (PLEG): container finished" podID="3c30eab0-ce99-4717-9b34-99ab3a10543c" containerID="11fe1292f9532bb5a045a532962c4e6bc56a21b6e5c21d98a1403bab69f47672" exitCode=0 Apr 20 14:27:15.033334 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:15.033312 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmfdx" event={"ID":"3c30eab0-ce99-4717-9b34-99ab3a10543c","Type":"ContainerDied","Data":"11fe1292f9532bb5a045a532962c4e6bc56a21b6e5c21d98a1403bab69f47672"} Apr 20 14:27:15.043205 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:15.043167 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2rbh6" podStartSLOduration=4.019923199 
podStartE2EDuration="21.043155208s" podCreationTimestamp="2026-04-20 14:26:54 +0000 UTC" firstStartedPulling="2026-04-20 14:26:56.590548276 +0000 UTC m=+3.310849690" lastFinishedPulling="2026-04-20 14:27:13.613780285 +0000 UTC m=+20.334081699" observedRunningTime="2026-04-20 14:27:15.042620602 +0000 UTC m=+21.762922034" watchObservedRunningTime="2026-04-20 14:27:15.043155208 +0000 UTC m=+21.763456678" Apr 20 14:27:15.429205 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:15.429177 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 14:27:15.886836 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:15.886722 2580 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T14:27:15.429198619Z","UUID":"337c48d5-d651-4b7f-b2c6-5685d35e3df4","Handler":null,"Name":"","Endpoint":""} Apr 20 14:27:15.890868 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:15.890844 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 14:27:15.890868 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:15.890877 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 14:27:15.927771 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:15.927738 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:27:15.927944 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:15.927873 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qsqks" podUID="db79a290-5377-45f9-bb87-89588231d8a7" Apr 20 14:27:16.036689 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:16.036646 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw" event={"ID":"308d3703-6cc2-4cc6-a2e4-187390a83535","Type":"ContainerStarted","Data":"13db8507df5e1460d97f00bf29aaf50984f49f5deafdfb0a615efa744a9873b2"} Apr 20 14:27:16.927921 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:16.927887 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:27:16.928099 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:16.927887 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:27:16.928099 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:16.928009 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-plgss" podUID="fd645412-a5e8-4419-9535-27ac65a5ee65" Apr 20 14:27:16.928099 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:16.928065 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fgfxx" podUID="35f35192-c0aa-4a62-b812-0f0be61d0f8e" Apr 20 14:27:17.041587 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:17.041291 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw" event={"ID":"308d3703-6cc2-4cc6-a2e4-187390a83535","Type":"ContainerStarted","Data":"5fc6f845724abb53e65e6a40d2f4190de4829a4cc15283ec89b8287e838dd2ce"} Apr 20 14:27:17.044322 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:17.044297 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s4vzx_b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad/ovn-acl-logging/0.log" Apr 20 14:27:17.044763 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:17.044697 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" event={"ID":"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad","Type":"ContainerStarted","Data":"45783cf493098b0b69bd4b5473e8da9059d184d415a9a0ab35648f65394ef0a6"} Apr 20 14:27:17.927570 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:17.927544 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:27:17.927749 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:17.927638 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qsqks" podUID="db79a290-5377-45f9-bb87-89588231d8a7" Apr 20 14:27:18.201673 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:18.201594 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xhmdf" Apr 20 14:27:18.202301 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:18.202283 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xhmdf" Apr 20 14:27:18.218097 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:18.218055 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2vmqw" podStartSLOduration=4.131861425 podStartE2EDuration="24.218039925s" podCreationTimestamp="2026-04-20 14:26:54 +0000 UTC" firstStartedPulling="2026-04-20 14:26:56.589029721 +0000 UTC m=+3.309331145" lastFinishedPulling="2026-04-20 14:27:16.675208221 +0000 UTC m=+23.395509645" observedRunningTime="2026-04-20 14:27:17.079145106 +0000 UTC m=+23.799446540" watchObservedRunningTime="2026-04-20 14:27:18.218039925 +0000 UTC m=+24.938341356" Apr 20 14:27:18.928205 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:18.928177 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:27:18.928390 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:18.928249 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:27:18.928390 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:18.928361 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fgfxx" podUID="35f35192-c0aa-4a62-b812-0f0be61d0f8e" Apr 20 14:27:18.928515 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:18.928459 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-plgss" podUID="fd645412-a5e8-4419-9535-27ac65a5ee65" Apr 20 14:27:19.048772 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:19.048742 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xhmdf" Apr 20 14:27:19.049320 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:19.049301 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xhmdf" Apr 20 14:27:19.927354 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:19.927317 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:27:19.927919 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:19.927463 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qsqks" podUID="db79a290-5377-45f9-bb87-89588231d8a7" Apr 20 14:27:20.927399 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:20.927215 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:27:20.927980 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:20.927221 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:27:20.927980 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:20.927489 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-plgss" podUID="fd645412-a5e8-4419-9535-27ac65a5ee65" Apr 20 14:27:20.927980 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:20.927568 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fgfxx" podUID="35f35192-c0aa-4a62-b812-0f0be61d0f8e" Apr 20 14:27:21.054772 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:21.054746 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s4vzx_b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad/ovn-acl-logging/0.log" Apr 20 14:27:21.055097 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:21.055064 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" event={"ID":"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad","Type":"ContainerStarted","Data":"1b7441ab9f187edf9d7814c66a77277237ce474ba37798ba9f39d932e4e4fc1b"} Apr 20 14:27:21.055353 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:21.055330 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" Apr 20 14:27:21.055612 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:21.055594 2580 scope.go:117] "RemoveContainer" containerID="0728c60c5e4742a908dc9c2e777ffaf26fc0bd0c2c68cf983930ddf93ad0594e" Apr 20 14:27:21.056898 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:21.056872 2580 generic.go:358] "Generic (PLEG): container finished" podID="3c30eab0-ce99-4717-9b34-99ab3a10543c" containerID="0069fcca3c2adc32d5aea3f0480b6fc86362b0c238028b29882fc15ed9dd8d3c" exitCode=0 Apr 20 14:27:21.057001 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:21.056953 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmfdx" event={"ID":"3c30eab0-ce99-4717-9b34-99ab3a10543c","Type":"ContainerDied","Data":"0069fcca3c2adc32d5aea3f0480b6fc86362b0c238028b29882fc15ed9dd8d3c"} Apr 20 14:27:21.070868 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:21.070853 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" Apr 20 14:27:21.928276 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:21.928243 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:27:21.928700 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:21.928399 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qsqks" podUID="db79a290-5377-45f9-bb87-89588231d8a7" Apr 20 14:27:22.061845 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:22.061770 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s4vzx_b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad/ovn-acl-logging/0.log" Apr 20 14:27:22.062154 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:22.062128 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" event={"ID":"b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad","Type":"ContainerStarted","Data":"b7976b130382ec1b1060679ecb864933a404f8e0a7baab5aae237fb36ca2eb2f"} Apr 20 14:27:22.062516 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:22.062482 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" Apr 20 14:27:22.062597 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:22.062529 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" Apr 20 14:27:22.064023 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:22.063995 2580 generic.go:358] "Generic (PLEG): container finished" podID="3c30eab0-ce99-4717-9b34-99ab3a10543c" containerID="1997b3de9d3543d538989fd18a659936ddbfd743b52997d66404776c29496195" exitCode=0 Apr 20 14:27:22.064117 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:22.064064 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmfdx" event={"ID":"3c30eab0-ce99-4717-9b34-99ab3a10543c","Type":"ContainerDied","Data":"1997b3de9d3543d538989fd18a659936ddbfd743b52997d66404776c29496195"} Apr 20 14:27:22.076615 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:22.076594 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" Apr 20 14:27:22.113310 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:22.113273 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" podStartSLOduration=11.03881858 podStartE2EDuration="28.113262046s" podCreationTimestamp="2026-04-20 14:26:54 +0000 UTC" firstStartedPulling="2026-04-20 14:26:56.587410799 +0000 UTC m=+3.307712227" lastFinishedPulling="2026-04-20 14:27:13.661854268 +0000 UTC m=+20.382155693" observedRunningTime="2026-04-20 14:27:22.089523335 +0000 UTC m=+28.809824768" watchObservedRunningTime="2026-04-20 14:27:22.113262046 +0000 UTC m=+28.833563478" Apr 20 14:27:22.147711 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:22.147685 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-plgss"] Apr 20 14:27:22.147816 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:22.147804 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:27:22.147942 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:22.147910 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-plgss" podUID="fd645412-a5e8-4419-9535-27ac65a5ee65" Apr 20 14:27:22.150624 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:22.150601 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qsqks"] Apr 20 14:27:22.150719 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:22.150684 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:27:22.150809 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:22.150790 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qsqks" podUID="db79a290-5377-45f9-bb87-89588231d8a7" Apr 20 14:27:22.161475 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:22.161454 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fgfxx"] Apr 20 14:27:22.161612 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:22.161572 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:27:22.161717 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:22.161698 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fgfxx" podUID="35f35192-c0aa-4a62-b812-0f0be61d0f8e" Apr 20 14:27:23.069920 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:23.069893 2580 generic.go:358] "Generic (PLEG): container finished" podID="3c30eab0-ce99-4717-9b34-99ab3a10543c" containerID="1f1df9529441f5103485eefdaf6ab9ce983f5010f10d22d7901980867106c0d4" exitCode=0 Apr 20 14:27:23.070337 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:23.069972 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmfdx" event={"ID":"3c30eab0-ce99-4717-9b34-99ab3a10543c","Type":"ContainerDied","Data":"1f1df9529441f5103485eefdaf6ab9ce983f5010f10d22d7901980867106c0d4"} Apr 20 14:27:23.928601 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:23.928343 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:27:23.928774 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:23.928403 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:27:23.928774 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:23.928696 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-plgss" podUID="fd645412-a5e8-4419-9535-27ac65a5ee65" Apr 20 14:27:23.928875 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:23.928796 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fgfxx" podUID="35f35192-c0aa-4a62-b812-0f0be61d0f8e" Apr 20 14:27:23.928875 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:23.928424 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:27:23.928972 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:23.928910 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qsqks" podUID="db79a290-5377-45f9-bb87-89588231d8a7" Apr 20 14:27:25.927912 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:25.927883 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:27:25.928462 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:25.928006 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-plgss" podUID="fd645412-a5e8-4419-9535-27ac65a5ee65" Apr 20 14:27:25.928462 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:25.928014 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:27:25.928462 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:25.928107 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qsqks" podUID="db79a290-5377-45f9-bb87-89588231d8a7" Apr 20 14:27:25.928462 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:25.928128 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:27:25.928462 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:25.928190 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fgfxx" podUID="35f35192-c0aa-4a62-b812-0f0be61d0f8e" Apr 20 14:27:26.538626 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.538596 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-166.ec2.internal" event="NodeReady" Apr 20 14:27:26.538890 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.538750 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 14:27:26.574341 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.574309 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-556bcd8745-tx4hg"] Apr 20 14:27:26.590117 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.590082 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vlqz9"] Apr 20 14:27:26.590300 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.590273 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.593027 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.592800 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 14:27:26.593027 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.592832 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 14:27:26.593027 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.592946 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-bsqq7\"" Apr 20 14:27:26.593271 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.593065 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 14:27:26.599877 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.599514 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 14:27:26.600619 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.600570 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-556bcd8745-tx4hg"] Apr 20 14:27:26.600619 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.600599 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8g5vx"] Apr 20 14:27:26.600785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.600734 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vlqz9" Apr 20 14:27:26.603012 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.602705 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 14:27:26.603012 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.602735 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5k2mg\"" Apr 20 14:27:26.603012 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.602898 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 14:27:26.610640 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.610620 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vlqz9"] Apr 20 14:27:26.610745 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.610732 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8g5vx" Apr 20 14:27:26.612632 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.612615 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8g5vx"] Apr 20 14:27:26.613043 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.612935 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 14:27:26.613300 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.612946 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 14:27:26.613300 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.612999 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-79jcr\"" Apr 20 14:27:26.613559 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.613544 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 14:27:26.760293 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.760199 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz6l9\" (UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-kube-api-access-zz6l9\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.760458 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.760322 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.760458 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.760367 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert\") pod \"ingress-canary-8g5vx\" (UID: \"d6c411bc-80c8-4d9c-993c-cb6aeb232750\") " pod="openshift-ingress-canary/ingress-canary-8g5vx" Apr 20 14:27:26.760458 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.760431 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26882095-42a8-4889-9983-45d2dc2d0fc6-config-volume\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:27:26.760624 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.760465 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7z4v\" (UniqueName: \"kubernetes.io/projected/d6c411bc-80c8-4d9c-993c-cb6aeb232750-kube-api-access-c7z4v\") pod \"ingress-canary-8g5vx\" (UID: \"d6c411bc-80c8-4d9c-993c-cb6aeb232750\") " pod="openshift-ingress-canary/ingress-canary-8g5vx" Apr 20 14:27:26.760624 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.760546 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-certificates\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.760624 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.760582 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-bound-sa-token\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.760753 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.760625 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/26882095-42a8-4889-9983-45d2dc2d0fc6-tmp-dir\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:27:26.760753 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.760651 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/150936cc-56bb-4f1a-9468-4a8527e8cec7-ca-trust-extracted\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.760753 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.760681 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/150936cc-56bb-4f1a-9468-4a8527e8cec7-trusted-ca\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.760753 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.760730 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/150936cc-56bb-4f1a-9468-4a8527e8cec7-image-registry-private-configuration\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.760897 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.760795 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:27:26.760897 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.760818 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t79l9\" (UniqueName: \"kubernetes.io/projected/26882095-42a8-4889-9983-45d2dc2d0fc6-kube-api-access-t79l9\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:27:26.760897 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.760841 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/150936cc-56bb-4f1a-9468-4a8527e8cec7-installation-pull-secrets\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.862077 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.862040 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.862077 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.862078 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert\") pod \"ingress-canary-8g5vx\" (UID: \"d6c411bc-80c8-4d9c-993c-cb6aeb232750\") " pod="openshift-ingress-canary/ingress-canary-8g5vx" Apr 20 14:27:26.862330 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.862105 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26882095-42a8-4889-9983-45d2dc2d0fc6-config-volume\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:27:26.862330 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.862130 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7z4v\" (UniqueName: \"kubernetes.io/projected/d6c411bc-80c8-4d9c-993c-cb6aeb232750-kube-api-access-c7z4v\") pod \"ingress-canary-8g5vx\" (UID: \"d6c411bc-80c8-4d9c-993c-cb6aeb232750\") " pod="openshift-ingress-canary/ingress-canary-8g5vx" Apr 20 14:27:26.862330 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.862158 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-certificates\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.862330 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.862181 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-bound-sa-token\") pod \"image-registry-556bcd8745-tx4hg\" 
(UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.862330 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:26.862202 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:27:26.862330 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.862218 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/26882095-42a8-4889-9983-45d2dc2d0fc6-tmp-dir\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:27:26.862330 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:26.862225 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-556bcd8745-tx4hg: secret "image-registry-tls" not found Apr 20 14:27:26.862330 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.862244 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/150936cc-56bb-4f1a-9468-4a8527e8cec7-ca-trust-extracted\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.862330 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:26.862264 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:27:26.862330 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:26.862291 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls podName:150936cc-56bb-4f1a-9468-4a8527e8cec7 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:27.362271159 +0000 UTC m=+34.082572574 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls") pod "image-registry-556bcd8745-tx4hg" (UID: "150936cc-56bb-4f1a-9468-4a8527e8cec7") : secret "image-registry-tls" not found Apr 20 14:27:26.862844 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:26.862331 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert podName:d6c411bc-80c8-4d9c-993c-cb6aeb232750 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:27.362311773 +0000 UTC m=+34.082613196 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert") pod "ingress-canary-8g5vx" (UID: "d6c411bc-80c8-4d9c-993c-cb6aeb232750") : secret "canary-serving-cert" not found Apr 20 14:27:26.862844 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.862272 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/150936cc-56bb-4f1a-9468-4a8527e8cec7-trusted-ca\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.862844 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.862407 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/150936cc-56bb-4f1a-9468-4a8527e8cec7-image-registry-private-configuration\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.862844 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.862477 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:27:26.862844 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.862527 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t79l9\" (UniqueName: \"kubernetes.io/projected/26882095-42a8-4889-9983-45d2dc2d0fc6-kube-api-access-t79l9\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:27:26.862844 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.862553 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/150936cc-56bb-4f1a-9468-4a8527e8cec7-installation-pull-secrets\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.862844 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.862600 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zz6l9\" (UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-kube-api-access-zz6l9\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.862844 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.862689 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/26882095-42a8-4889-9983-45d2dc2d0fc6-tmp-dir\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:27:26.862844 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:26.862734 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:27:26.862844 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.862814 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/26882095-42a8-4889-9983-45d2dc2d0fc6-config-volume\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:27:26.863275 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.862912 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/150936cc-56bb-4f1a-9468-4a8527e8cec7-ca-trust-extracted\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.863275 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.862950 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-certificates\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.863275 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:26.862999 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls podName:26882095-42a8-4889-9983-45d2dc2d0fc6 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:27.36298477 +0000 UTC m=+34.083286187 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls") pod "dns-default-vlqz9" (UID: "26882095-42a8-4889-9983-45d2dc2d0fc6") : secret "dns-default-metrics-tls" not found Apr 20 14:27:26.864819 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.864795 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/150936cc-56bb-4f1a-9468-4a8527e8cec7-trusted-ca\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.867179 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.867156 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/150936cc-56bb-4f1a-9468-4a8527e8cec7-installation-pull-secrets\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.867294 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.867156 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/150936cc-56bb-4f1a-9468-4a8527e8cec7-image-registry-private-configuration\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.871614 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.871579 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-bound-sa-token\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:26.871821 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.871658 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t79l9\" (UniqueName: \"kubernetes.io/projected/26882095-42a8-4889-9983-45d2dc2d0fc6-kube-api-access-t79l9\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:27:26.871821 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.871736 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7z4v\" (UniqueName: \"kubernetes.io/projected/d6c411bc-80c8-4d9c-993c-cb6aeb232750-kube-api-access-c7z4v\") pod \"ingress-canary-8g5vx\" (UID: \"d6c411bc-80c8-4d9c-993c-cb6aeb232750\") " pod="openshift-ingress-canary/ingress-canary-8g5vx" Apr 20 14:27:26.871821 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:26.871785 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz6l9\" (UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-kube-api-access-zz6l9\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:27.367295 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:27.367250 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:27:27.368084 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:27.367357 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:27.368084 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:27.367393 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert\") pod \"ingress-canary-8g5vx\" (UID: \"d6c411bc-80c8-4d9c-993c-cb6aeb232750\") " pod="openshift-ingress-canary/ingress-canary-8g5vx" Apr 20 14:27:27.368084 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:27.367419 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:27:27.368084 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:27.367510 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls podName:26882095-42a8-4889-9983-45d2dc2d0fc6 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:28.367474461 +0000 UTC m=+35.087775886 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls") pod "dns-default-vlqz9" (UID: "26882095-42a8-4889-9983-45d2dc2d0fc6") : secret "dns-default-metrics-tls" not found Apr 20 14:27:27.368084 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:27.367531 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:27:27.368084 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:27.367550 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-556bcd8745-tx4hg: secret "image-registry-tls" not found Apr 20 14:27:27.368084 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:27.367555 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:27:27.368084 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:27.367604 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls podName:150936cc-56bb-4f1a-9468-4a8527e8cec7 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:28.367586537 +0000 UTC m=+35.087887946 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls") pod "image-registry-556bcd8745-tx4hg" (UID: "150936cc-56bb-4f1a-9468-4a8527e8cec7") : secret "image-registry-tls" not found Apr 20 14:27:27.368084 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:27.367620 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert podName:d6c411bc-80c8-4d9c-993c-cb6aeb232750 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:28.367612328 +0000 UTC m=+35.087913739 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert") pod "ingress-canary-8g5vx" (UID: "d6c411bc-80c8-4d9c-993c-cb6aeb232750") : secret "canary-serving-cert" not found Apr 20 14:27:27.568884 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:27.568849 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs\") pod \"network-metrics-daemon-qsqks\" (UID: \"db79a290-5377-45f9-bb87-89588231d8a7\") " pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:27:27.569100 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:27.568894 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbs6n\" (UniqueName: \"kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n\") pod \"network-check-target-fgfxx\" (UID: \"35f35192-c0aa-4a62-b812-0f0be61d0f8e\") " pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:27:27.569100 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:27.569016 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:27.569100 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:27.569037 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:27:27.569100 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:27.569064 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:27:27.569100 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:27.569077 2580 projected.go:194] Error preparing data for projected volume kube-api-access-jbs6n for pod openshift-network-diagnostics/network-check-target-fgfxx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:27.569100 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:27.569098 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs podName:db79a290-5377-45f9-bb87-89588231d8a7 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:59.569080222 +0000 UTC m=+66.289381635 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs") pod "network-metrics-daemon-qsqks" (UID: "db79a290-5377-45f9-bb87-89588231d8a7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:27:27.569326 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:27.569128 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n podName:35f35192-c0aa-4a62-b812-0f0be61d0f8e nodeName:}" failed. No retries permitted until 2026-04-20 14:27:59.569112127 +0000 UTC m=+66.289413539 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jbs6n" (UniqueName: "kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n") pod "network-check-target-fgfxx" (UID: "35f35192-c0aa-4a62-b812-0f0be61d0f8e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:27:27.928072 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:27.928036 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:27:27.928251 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:27.928036 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:27:27.928251 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:27.928045 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:27:27.931525 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:27.931435 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 14:27:27.932340 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:27.931986 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 14:27:27.932340 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:27.932043 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 14:27:27.932340 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:27.932050 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xmk8q\"" Apr 20 14:27:27.932340 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:27.932184 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xgs6p\"" Apr 20 14:27:27.932340 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:27.932313 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 14:27:28.375417 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:28.375376 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:27:28.375941 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:28.375443 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:28.375941 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:28.375469 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert\") pod \"ingress-canary-8g5vx\" (UID: \"d6c411bc-80c8-4d9c-993c-cb6aeb232750\") " pod="openshift-ingress-canary/ingress-canary-8g5vx" Apr 20 14:27:28.375941 
ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:28.375561 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:27:28.375941 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:28.375603 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:27:28.375941 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:28.375653 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert podName:d6c411bc-80c8-4d9c-993c-cb6aeb232750 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:30.375632359 +0000 UTC m=+37.095933770 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert") pod "ingress-canary-8g5vx" (UID: "d6c411bc-80c8-4d9c-993c-cb6aeb232750") : secret "canary-serving-cert" not found Apr 20 14:27:28.375941 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:28.375565 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:27:28.375941 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:28.375685 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-556bcd8745-tx4hg: secret "image-registry-tls" not found Apr 20 14:27:28.375941 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:28.375669 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls podName:26882095-42a8-4889-9983-45d2dc2d0fc6 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:30.375661485 +0000 UTC m=+37.095962895 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls") pod "dns-default-vlqz9" (UID: "26882095-42a8-4889-9983-45d2dc2d0fc6") : secret "dns-default-metrics-tls" not found Apr 20 14:27:28.375941 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:28.375738 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls podName:150936cc-56bb-4f1a-9468-4a8527e8cec7 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:30.375726337 +0000 UTC m=+37.096027747 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls") pod "image-registry-556bcd8745-tx4hg" (UID: "150936cc-56bb-4f1a-9468-4a8527e8cec7") : secret "image-registry-tls" not found Apr 20 14:27:28.476127 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:28.476079 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret\") pod \"global-pull-secret-syncer-plgss\" (UID: \"fd645412-a5e8-4419-9535-27ac65a5ee65\") " pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:27:28.478971 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:28.478947 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/fd645412-a5e8-4419-9535-27ac65a5ee65-original-pull-secret\") pod \"global-pull-secret-syncer-plgss\" (UID: \"fd645412-a5e8-4419-9535-27ac65a5ee65\") " pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:27:28.540689 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:28.540653 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-plgss" Apr 20 14:27:28.722709 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:28.722676 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-plgss"] Apr 20 14:27:28.771052 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:27:28.771012 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd645412_a5e8_4419_9535_27ac65a5ee65.slice/crio-ada1f17e745d2130fa4b1f47b3e82ed92e11b97f42a29025d481e67d1694d9f5 WatchSource:0}: Error finding container ada1f17e745d2130fa4b1f47b3e82ed92e11b97f42a29025d481e67d1694d9f5: Status 404 returned error can't find the container with id ada1f17e745d2130fa4b1f47b3e82ed92e11b97f42a29025d481e67d1694d9f5 Apr 20 14:27:29.085788 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:29.085476 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmfdx" event={"ID":"3c30eab0-ce99-4717-9b34-99ab3a10543c","Type":"ContainerStarted","Data":"999674e0e954382fb4fc7001217d56e719cb43526eae905ce81fea731e513908"} Apr 20 14:27:29.086581 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:29.086558 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-plgss" event={"ID":"fd645412-a5e8-4419-9535-27ac65a5ee65","Type":"ContainerStarted","Data":"ada1f17e745d2130fa4b1f47b3e82ed92e11b97f42a29025d481e67d1694d9f5"} Apr 20 14:27:30.091087 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:30.091054 2580 generic.go:358] "Generic (PLEG): container finished" podID="3c30eab0-ce99-4717-9b34-99ab3a10543c" containerID="999674e0e954382fb4fc7001217d56e719cb43526eae905ce81fea731e513908" exitCode=0 Apr 20 14:27:30.091662 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:30.091103 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmfdx" event={"ID":"3c30eab0-ce99-4717-9b34-99ab3a10543c","Type":"ContainerDied","Data":"999674e0e954382fb4fc7001217d56e719cb43526eae905ce81fea731e513908"} Apr 20 14:27:30.392654 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:30.392577 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:30.392654 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:30.392619 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert\") pod \"ingress-canary-8g5vx\" (UID: \"d6c411bc-80c8-4d9c-993c-cb6aeb232750\") " pod="openshift-ingress-canary/ingress-canary-8g5vx" Apr 20 14:27:30.392831 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:30.392679 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:27:30.392831 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:30.392728 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:27:30.392831 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:30.392747 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-556bcd8745-tx4hg: secret "image-registry-tls" not found Apr 20 14:27:30.392831 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:30.392763 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:27:30.392831 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:30.392785 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:27:30.392831 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:30.392809 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls podName:150936cc-56bb-4f1a-9468-4a8527e8cec7 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:34.392794462 +0000 UTC m=+41.113095878 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls") pod "image-registry-556bcd8745-tx4hg" (UID: "150936cc-56bb-4f1a-9468-4a8527e8cec7") : secret "image-registry-tls" not found Apr 20 14:27:30.392831 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:30.392825 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls podName:26882095-42a8-4889-9983-45d2dc2d0fc6 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:34.392819037 +0000 UTC m=+41.113120447 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls") pod "dns-default-vlqz9" (UID: "26882095-42a8-4889-9983-45d2dc2d0fc6") : secret "dns-default-metrics-tls" not found Apr 20 14:27:30.392831 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:30.392836 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert podName:d6c411bc-80c8-4d9c-993c-cb6aeb232750 nodeName:}" failed. 
No retries permitted until 2026-04-20 14:27:34.392831092 +0000 UTC m=+41.113132502 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert") pod "ingress-canary-8g5vx" (UID: "d6c411bc-80c8-4d9c-993c-cb6aeb232750") : secret "canary-serving-cert" not found Apr 20 14:27:31.096042 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:31.096009 2580 generic.go:358] "Generic (PLEG): container finished" podID="3c30eab0-ce99-4717-9b34-99ab3a10543c" containerID="4e2bf1ae636af5277e0befe1863efcf264462163ecdca619d9eb85a80589480d" exitCode=0 Apr 20 14:27:31.096394 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:31.096056 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmfdx" event={"ID":"3c30eab0-ce99-4717-9b34-99ab3a10543c","Type":"ContainerDied","Data":"4e2bf1ae636af5277e0befe1863efcf264462163ecdca619d9eb85a80589480d"} Apr 20 14:27:32.100973 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:32.100791 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmfdx" event={"ID":"3c30eab0-ce99-4717-9b34-99ab3a10543c","Type":"ContainerStarted","Data":"c1953376747f3183888f1734dd6cf09391240601f8e29cb3ae6d31b3034546b1"} Apr 20 14:27:32.122150 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:32.122106 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dmfdx" podStartSLOduration=5.890464581 podStartE2EDuration="38.122090033s" podCreationTimestamp="2026-04-20 14:26:54 +0000 UTC" firstStartedPulling="2026-04-20 14:26:56.585903931 +0000 UTC m=+3.306205341" lastFinishedPulling="2026-04-20 14:27:28.817529379 +0000 UTC m=+35.537830793" observedRunningTime="2026-04-20 14:27:32.121975288 +0000 UTC m=+38.842276719" watchObservedRunningTime="2026-04-20 14:27:32.122090033 +0000 UTC m=+38.842391465" Apr 20 14:27:34.425055 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:34.425010 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:34.425055 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:34.425058 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert\") pod \"ingress-canary-8g5vx\" (UID: \"d6c411bc-80c8-4d9c-993c-cb6aeb232750\") " pod="openshift-ingress-canary/ingress-canary-8g5vx" Apr 20 14:27:34.425623 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:34.425100 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:27:34.425623 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:34.425182 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:27:34.425623 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:34.425181 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret 
"image-registry-tls" not found Apr 20 14:27:34.425623 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:34.425209 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:27:34.425623 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:34.425236 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls podName:26882095-42a8-4889-9983-45d2dc2d0fc6 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:42.425221878 +0000 UTC m=+49.145523288 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls") pod "dns-default-vlqz9" (UID: "26882095-42a8-4889-9983-45d2dc2d0fc6") : secret "dns-default-metrics-tls" not found Apr 20 14:27:34.425623 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:34.425259 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert podName:d6c411bc-80c8-4d9c-993c-cb6aeb232750 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:42.425245911 +0000 UTC m=+49.145547321 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert") pod "ingress-canary-8g5vx" (UID: "d6c411bc-80c8-4d9c-993c-cb6aeb232750") : secret "canary-serving-cert" not found Apr 20 14:27:34.425623 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:34.425210 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-556bcd8745-tx4hg: secret "image-registry-tls" not found Apr 20 14:27:34.425623 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:34.425285 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls podName:150936cc-56bb-4f1a-9468-4a8527e8cec7 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:42.425280065 +0000 UTC m=+49.145581474 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls") pod "image-registry-556bcd8745-tx4hg" (UID: "150936cc-56bb-4f1a-9468-4a8527e8cec7") : secret "image-registry-tls" not found Apr 20 14:27:35.108709 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:35.108671 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-plgss" event={"ID":"fd645412-a5e8-4419-9535-27ac65a5ee65","Type":"ContainerStarted","Data":"f127a9e23a6ed419ea638afd196ae47c46e4146a62fe08f8c96b415816be4e08"} Apr 20 14:27:35.123166 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:35.123106 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-plgss" podStartSLOduration=33.863514108 podStartE2EDuration="39.123088462s" podCreationTimestamp="2026-04-20 14:26:56 +0000 UTC" firstStartedPulling="2026-04-20 14:27:28.795510828 +0000 UTC m=+35.515812251" lastFinishedPulling="2026-04-20 14:27:34.055085192 +0000 UTC m=+40.775386605" observedRunningTime="2026-04-20 14:27:35.122738103 +0000 UTC m=+41.843039535" watchObservedRunningTime="2026-04-20 14:27:35.123088462 +0000 UTC m=+41.843389895" Apr 20 14:27:42.481202 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:42.481164 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:42.481202 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:42.481204 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert\") pod \"ingress-canary-8g5vx\" (UID: \"d6c411bc-80c8-4d9c-993c-cb6aeb232750\") " pod="openshift-ingress-canary/ingress-canary-8g5vx" Apr 20 14:27:42.481744 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:42.481307 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:27:42.481744 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:42.481318 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:27:42.481744 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:42.481336 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-556bcd8745-tx4hg: secret "image-registry-tls" not found Apr 20 14:27:42.481744 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:42.481353 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:27:42.481744 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:42.481375 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert podName:d6c411bc-80c8-4d9c-993c-cb6aeb232750 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:58.481356367 +0000 UTC m=+65.201657850 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert") pod "ingress-canary-8g5vx" (UID: "d6c411bc-80c8-4d9c-993c-cb6aeb232750") : secret "canary-serving-cert" not found Apr 20 14:27:42.481744 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:42.481395 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls podName:150936cc-56bb-4f1a-9468-4a8527e8cec7 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:58.481387191 +0000 UTC m=+65.201688607 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls") pod "image-registry-556bcd8745-tx4hg" (UID: "150936cc-56bb-4f1a-9468-4a8527e8cec7") : secret "image-registry-tls" not found Apr 20 14:27:42.481744 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:42.481442 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:27:42.481744 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:42.481520 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls podName:26882095-42a8-4889-9983-45d2dc2d0fc6 nodeName:}" failed. No retries permitted until 2026-04-20 14:27:58.48147951 +0000 UTC m=+65.201780925 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls") pod "dns-default-vlqz9" (UID: "26882095-42a8-4889-9983-45d2dc2d0fc6") : secret "dns-default-metrics-tls" not found Apr 20 14:27:54.085484 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:54.085455 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-s4vzx" Apr 20 14:27:58.491310 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:58.491257 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:27:58.491788 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:58.491358 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:27:58.491788 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:58.491386 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert\") pod \"ingress-canary-8g5vx\" (UID: \"d6c411bc-80c8-4d9c-993c-cb6aeb232750\") " pod="openshift-ingress-canary/ingress-canary-8g5vx" Apr 20 14:27:58.491788 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:58.491417 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:27:58.491788 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:58.491492 2580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls podName:26882095-42a8-4889-9983-45d2dc2d0fc6 nodeName:}" failed. No retries permitted until 2026-04-20 14:28:30.491470801 +0000 UTC m=+97.211772223 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls") pod "dns-default-vlqz9" (UID: "26882095-42a8-4889-9983-45d2dc2d0fc6") : secret "dns-default-metrics-tls" not found Apr 20 14:27:58.491788 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:58.491513 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:27:58.491788 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:58.491515 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:27:58.491788 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:58.491536 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-556bcd8745-tx4hg: secret "image-registry-tls" not found Apr 20 14:27:58.491788 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:58.491560 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert podName:d6c411bc-80c8-4d9c-993c-cb6aeb232750 nodeName:}" failed. No retries permitted until 2026-04-20 14:28:30.491547326 +0000 UTC m=+97.211848737 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert") pod "ingress-canary-8g5vx" (UID: "d6c411bc-80c8-4d9c-993c-cb6aeb232750") : secret "canary-serving-cert" not found Apr 20 14:27:58.491788 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:58.491594 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls podName:150936cc-56bb-4f1a-9468-4a8527e8cec7 nodeName:}" failed. No retries permitted until 2026-04-20 14:28:30.491580258 +0000 UTC m=+97.211881680 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls") pod "image-registry-556bcd8745-tx4hg" (UID: "150936cc-56bb-4f1a-9468-4a8527e8cec7") : secret "image-registry-tls" not found Apr 20 14:27:59.600008 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:59.599952 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs\") pod \"network-metrics-daemon-qsqks\" (UID: \"db79a290-5377-45f9-bb87-89588231d8a7\") " pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:27:59.600008 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:59.600012 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbs6n\" (UniqueName: \"kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n\") pod \"network-check-target-fgfxx\" (UID: \"35f35192-c0aa-4a62-b812-0f0be61d0f8e\") " pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:27:59.606217 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:59.606187 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 14:27:59.606415 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:59.606398 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 14:27:59.610527 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:59.610511 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 14:27:59.610600 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:27:59.610569 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs podName:db79a290-5377-45f9-bb87-89588231d8a7 nodeName:}" failed. No retries permitted until 2026-04-20 14:29:03.61055182 +0000 UTC m=+130.330853230 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs") pod "network-metrics-daemon-qsqks" (UID: "db79a290-5377-45f9-bb87-89588231d8a7") : secret "metrics-daemon-secret" not found Apr 20 14:27:59.613540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:59.613524 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 14:27:59.625376 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:59.625353 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbs6n\" (UniqueName: \"kubernetes.io/projected/35f35192-c0aa-4a62-b812-0f0be61d0f8e-kube-api-access-jbs6n\") pod \"network-check-target-fgfxx\" (UID: \"35f35192-c0aa-4a62-b812-0f0be61d0f8e\") " pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:27:59.748938 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:59.748906 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xgs6p\"" Apr 20 14:27:59.756945 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:59.756926 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:27:59.867292 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:27:59.867214 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fgfxx"] Apr 20 14:27:59.870204 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:27:59.870177 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35f35192_c0aa_4a62_b812_0f0be61d0f8e.slice/crio-594e215662732dce7d07406c505afdb64961c2693d7405694d440efa7d146881 WatchSource:0}: Error finding container 594e215662732dce7d07406c505afdb64961c2693d7405694d440efa7d146881: Status 404 returned error can't find the container with id 594e215662732dce7d07406c505afdb64961c2693d7405694d440efa7d146881 Apr 20 14:28:00.154730 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:00.154642 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fgfxx" event={"ID":"35f35192-c0aa-4a62-b812-0f0be61d0f8e","Type":"ContainerStarted","Data":"594e215662732dce7d07406c505afdb64961c2693d7405694d440efa7d146881"} Apr 20 14:28:03.164093 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:03.164050 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fgfxx" event={"ID":"35f35192-c0aa-4a62-b812-0f0be61d0f8e","Type":"ContainerStarted","Data":"b9d4fb174e8a7494e3083ad72b83d70b11958c761f00d53ba38fe1dc17812bd5"} Apr 20 14:28:03.164472 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:03.164180 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:28:03.200955 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:03.200901 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-fgfxx" podStartSLOduration=66.557461566 podStartE2EDuration="1m9.200884054s" podCreationTimestamp="2026-04-20 14:26:54 +0000 UTC" firstStartedPulling="2026-04-20 14:27:59.872018305 +0000 UTC m=+66.592319716" lastFinishedPulling="2026-04-20 14:28:02.515440775 +0000 UTC m=+69.235742204" observedRunningTime="2026-04-20 14:28:03.200660201 +0000 UTC m=+69.920961632" watchObservedRunningTime="2026-04-20 14:28:03.200884054 +0000 UTC m=+69.921185485" Apr 20 14:28:22.281334 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.281207 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj"] Apr 20 14:28:22.284135 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.284101 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" Apr 20 14:28:22.285266 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.285242 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjdjg"] Apr 20 14:28:22.287188 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.287173 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjdjg" Apr 20 14:28:22.287797 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.287774 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 20 14:28:22.288855 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.288837 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 14:28:22.288855 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.288851 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 20 14:28:22.289023 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.288899 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-2bfrt\"" Apr 20 14:28:22.289023 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.288901 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 14:28:22.289347 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.289332 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:28:22.289457 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.289382 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 20 14:28:22.289639 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.289618 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-g7tp7\"" Apr 20 14:28:22.294746 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.294724 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj"] Apr 20 14:28:22.297273 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.297252 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjdjg"] Apr 20 14:28:22.359136 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.359099 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42ncs\" (UniqueName: \"kubernetes.io/projected/7dfb4e40-e179-469e-adae-c6d6a0f119db-kube-api-access-42ncs\") pod \"volume-data-source-validator-7c6cbb6c87-zjdjg\" (UID: \"7dfb4e40-e179-469e-adae-c6d6a0f119db\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjdjg" Apr 20 14:28:22.359136 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.359141 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a04aab45-434d-4c28-a9f8-631e889ece22-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lcqfj\" (UID: \"a04aab45-434d-4c28-a9f8-631e889ece22\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" Apr 20 14:28:22.359348 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.359174 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv27z\" (UniqueName: 
\"kubernetes.io/projected/a04aab45-434d-4c28-a9f8-631e889ece22-kube-api-access-sv27z\") pod \"cluster-monitoring-operator-75587bd455-lcqfj\" (UID: \"a04aab45-434d-4c28-a9f8-631e889ece22\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" Apr 20 14:28:22.359348 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.359234 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lcqfj\" (UID: \"a04aab45-434d-4c28-a9f8-631e889ece22\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" Apr 20 14:28:22.385932 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.385905 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gc5dt"] Apr 20 14:28:22.388033 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.388017 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc"] Apr 20 14:28:22.388176 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.388158 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gc5dt" Apr 20 14:28:22.390347 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.390323 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 20 14:28:22.390347 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.390327 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 20 14:28:22.390569 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.390391 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-wdznl\"" Apr 20 14:28:22.390569 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.390404 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc" Apr 20 14:28:22.390793 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.390775 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 20 14:28:22.390843 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.390821 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:28:22.392268 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.392252 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 20 14:28:22.392550 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.392532 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-5brfk\"" Apr 20 14:28:22.392550 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.392543 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:28:22.392692 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.392594 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 20 14:28:22.398012 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.397993 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc"] Apr 20 14:28:22.399005 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.398980 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gc5dt"] Apr 20 14:28:22.460323 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.460289 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sv27z\" (UniqueName: \"kubernetes.io/projected/a04aab45-434d-4c28-a9f8-631e889ece22-kube-api-access-sv27z\") pod \"cluster-monitoring-operator-75587bd455-lcqfj\" (UID: \"a04aab45-434d-4c28-a9f8-631e889ece22\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" Apr 20 14:28:22.460323 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.460329 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6wxf\" (UniqueName: \"kubernetes.io/projected/ecd715e3-8d05-475b-82b1-7d47843f2c8e-kube-api-access-x6wxf\") pod \"cluster-samples-operator-6dc5bdb6b4-bddmc\" (UID: \"ecd715e3-8d05-475b-82b1-7d47843f2c8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc" Apr 20 14:28:22.460617 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.460377 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lcqfj\" (UID: \"a04aab45-434d-4c28-a9f8-631e889ece22\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" Apr 20 14:28:22.460617 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.460396 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/898273ba-9057-4ee0-9211-0db4c4234ca3-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gc5dt\" (UID: \"898273ba-9057-4ee0-9211-0db4c4234ca3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gc5dt" Apr 20 14:28:22.460617 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.460414 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsfgb\" (UniqueName: \"kubernetes.io/projected/898273ba-9057-4ee0-9211-0db4c4234ca3-kube-api-access-nsfgb\") pod \"kube-storage-version-migrator-operator-6769c5d45-gc5dt\" (UID: \"898273ba-9057-4ee0-9211-0db4c4234ca3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gc5dt" Apr 20 14:28:22.460617 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:22.460538 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 14:28:22.460617 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.460564 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42ncs\" (UniqueName: \"kubernetes.io/projected/7dfb4e40-e179-469e-adae-c6d6a0f119db-kube-api-access-42ncs\") pod \"volume-data-source-validator-7c6cbb6c87-zjdjg\" (UID: \"7dfb4e40-e179-469e-adae-c6d6a0f119db\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjdjg" Apr 20 14:28:22.461027 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:22.460992 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls podName:a04aab45-434d-4c28-a9f8-631e889ece22 nodeName:}" failed. No retries permitted until 2026-04-20 14:28:22.960953083 +0000 UTC m=+89.681254514 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lcqfj" (UID: "a04aab45-434d-4c28-a9f8-631e889ece22") : secret "cluster-monitoring-operator-tls" not found Apr 20 14:28:22.461027 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.460996 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ecd715e3-8d05-475b-82b1-7d47843f2c8e-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bddmc\" (UID: \"ecd715e3-8d05-475b-82b1-7d47843f2c8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc" Apr 20 14:28:22.461200 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.461065 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a04aab45-434d-4c28-a9f8-631e889ece22-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lcqfj\" (UID: \"a04aab45-434d-4c28-a9f8-631e889ece22\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" Apr 20 14:28:22.461200 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.461115 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/898273ba-9057-4ee0-9211-0db4c4234ca3-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gc5dt\" (UID: \"898273ba-9057-4ee0-9211-0db4c4234ca3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gc5dt" Apr 20 14:28:22.464612 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.464590 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a04aab45-434d-4c28-a9f8-631e889ece22-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lcqfj\" (UID: \"a04aab45-434d-4c28-a9f8-631e889ece22\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" Apr 20 14:28:22.468916 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.468886 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42ncs\" (UniqueName: \"kubernetes.io/projected/7dfb4e40-e179-469e-adae-c6d6a0f119db-kube-api-access-42ncs\") pod \"volume-data-source-validator-7c6cbb6c87-zjdjg\" (UID: \"7dfb4e40-e179-469e-adae-c6d6a0f119db\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjdjg" Apr 20 14:28:22.469016 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.468924 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv27z\" (UniqueName: \"kubernetes.io/projected/a04aab45-434d-4c28-a9f8-631e889ece22-kube-api-access-sv27z\") pod \"cluster-monitoring-operator-75587bd455-lcqfj\" (UID: \"a04aab45-434d-4c28-a9f8-631e889ece22\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" Apr 20 14:28:22.479975 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.479953 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpkpd"] Apr 20 14:28:22.482329 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.482316 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpkpd" Apr 20 14:28:22.484778 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.484751 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 20 14:28:22.484778 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.484762 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:28:22.485052 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.485030 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 20 14:28:22.485052 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.485044 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 20 14:28:22.485234 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.485123 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-c4js6\"" Apr 20 14:28:22.493627 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.493605 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpkpd"] Apr 20 14:28:22.562289 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.562191 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2xcx\" (UniqueName: \"kubernetes.io/projected/f5c1dc75-f6fa-4a93-af4f-753f62471c29-kube-api-access-r2xcx\") pod \"service-ca-operator-d6fc45fc5-zpkpd\" (UID: \"f5c1dc75-f6fa-4a93-af4f-753f62471c29\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpkpd" Apr 20 14:28:22.562289 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.562248 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ecd715e3-8d05-475b-82b1-7d47843f2c8e-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bddmc\" (UID: \"ecd715e3-8d05-475b-82b1-7d47843f2c8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc" Apr 20 14:28:22.562474 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.562299 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5c1dc75-f6fa-4a93-af4f-753f62471c29-config\") pod \"service-ca-operator-d6fc45fc5-zpkpd\" (UID: \"f5c1dc75-f6fa-4a93-af4f-753f62471c29\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpkpd" Apr 20 14:28:22.562474 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.562341 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/898273ba-9057-4ee0-9211-0db4c4234ca3-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gc5dt\" (UID: \"898273ba-9057-4ee0-9211-0db4c4234ca3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gc5dt" Apr 20 14:28:22.562474 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:22.562355 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 14:28:22.562474 
ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.562382 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6wxf\" (UniqueName: \"kubernetes.io/projected/ecd715e3-8d05-475b-82b1-7d47843f2c8e-kube-api-access-x6wxf\") pod \"cluster-samples-operator-6dc5bdb6b4-bddmc\" (UID: \"ecd715e3-8d05-475b-82b1-7d47843f2c8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc" Apr 20 14:28:22.562474 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:22.562413 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecd715e3-8d05-475b-82b1-7d47843f2c8e-samples-operator-tls podName:ecd715e3-8d05-475b-82b1-7d47843f2c8e nodeName:}" failed. No retries permitted until 2026-04-20 14:28:23.062394859 +0000 UTC m=+89.782696276 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ecd715e3-8d05-475b-82b1-7d47843f2c8e-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bddmc" (UID: "ecd715e3-8d05-475b-82b1-7d47843f2c8e") : secret "samples-operator-tls" not found Apr 20 14:28:22.562693 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.562566 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/898273ba-9057-4ee0-9211-0db4c4234ca3-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gc5dt\" (UID: \"898273ba-9057-4ee0-9211-0db4c4234ca3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gc5dt" Apr 20 14:28:22.562693 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.562611 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nsfgb\" (UniqueName: \"kubernetes.io/projected/898273ba-9057-4ee0-9211-0db4c4234ca3-kube-api-access-nsfgb\") pod \"kube-storage-version-migrator-operator-6769c5d45-gc5dt\" (UID: \"898273ba-9057-4ee0-9211-0db4c4234ca3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gc5dt" Apr 20 14:28:22.562693 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.562643 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5c1dc75-f6fa-4a93-af4f-753f62471c29-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zpkpd\" (UID: \"f5c1dc75-f6fa-4a93-af4f-753f62471c29\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpkpd" Apr 20 14:28:22.562993 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.562974 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/898273ba-9057-4ee0-9211-0db4c4234ca3-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gc5dt\" (UID: \"898273ba-9057-4ee0-9211-0db4c4234ca3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gc5dt" Apr 20 14:28:22.564581 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.564563 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/898273ba-9057-4ee0-9211-0db4c4234ca3-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gc5dt\" (UID: \"898273ba-9057-4ee0-9211-0db4c4234ca3\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gc5dt" Apr 20 14:28:22.570863 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.570835 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsfgb\" (UniqueName: \"kubernetes.io/projected/898273ba-9057-4ee0-9211-0db4c4234ca3-kube-api-access-nsfgb\") pod \"kube-storage-version-migrator-operator-6769c5d45-gc5dt\" (UID: \"898273ba-9057-4ee0-9211-0db4c4234ca3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gc5dt" Apr 20 14:28:22.571107 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.571087 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6wxf\" (UniqueName: \"kubernetes.io/projected/ecd715e3-8d05-475b-82b1-7d47843f2c8e-kube-api-access-x6wxf\") pod \"cluster-samples-operator-6dc5bdb6b4-bddmc\" (UID: \"ecd715e3-8d05-475b-82b1-7d47843f2c8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc" Apr 20 14:28:22.599971 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.599940 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjdjg" Apr 20 14:28:22.663033 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.662998 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5c1dc75-f6fa-4a93-af4f-753f62471c29-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zpkpd\" (UID: \"f5c1dc75-f6fa-4a93-af4f-753f62471c29\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpkpd" Apr 20 14:28:22.663227 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.663113 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2xcx\" (UniqueName: \"kubernetes.io/projected/f5c1dc75-f6fa-4a93-af4f-753f62471c29-kube-api-access-r2xcx\") pod \"service-ca-operator-d6fc45fc5-zpkpd\" (UID: \"f5c1dc75-f6fa-4a93-af4f-753f62471c29\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpkpd" Apr 20 14:28:22.663227 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.663149 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5c1dc75-f6fa-4a93-af4f-753f62471c29-config\") pod \"service-ca-operator-d6fc45fc5-zpkpd\" (UID: \"f5c1dc75-f6fa-4a93-af4f-753f62471c29\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpkpd" Apr 20 14:28:22.663659 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.663638 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5c1dc75-f6fa-4a93-af4f-753f62471c29-config\") pod \"service-ca-operator-d6fc45fc5-zpkpd\" (UID: \"f5c1dc75-f6fa-4a93-af4f-753f62471c29\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpkpd" Apr 20 14:28:22.665224 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.665199 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5c1dc75-f6fa-4a93-af4f-753f62471c29-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zpkpd\" (UID: \"f5c1dc75-f6fa-4a93-af4f-753f62471c29\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpkpd" Apr 20 14:28:22.671267 ip-10-0-142-166 kubenswrapper[2580]: I0420 
14:28:22.671243 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2xcx\" (UniqueName: \"kubernetes.io/projected/f5c1dc75-f6fa-4a93-af4f-753f62471c29-kube-api-access-r2xcx\") pod \"service-ca-operator-d6fc45fc5-zpkpd\" (UID: \"f5c1dc75-f6fa-4a93-af4f-753f62471c29\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpkpd" Apr 20 14:28:22.698126 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.698100 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gc5dt" Apr 20 14:28:22.715055 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.715024 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjdjg"] Apr 20 14:28:22.718157 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:28:22.718129 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dfb4e40_e179_469e_adae_c6d6a0f119db.slice/crio-4012d88b752126022aea10e6af8a20449c97b42755a7c6479f5d3f6a79a75685 WatchSource:0}: Error finding container 4012d88b752126022aea10e6af8a20449c97b42755a7c6479f5d3f6a79a75685: Status 404 returned error can't find the container with id 4012d88b752126022aea10e6af8a20449c97b42755a7c6479f5d3f6a79a75685 Apr 20 14:28:22.791016 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.790982 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpkpd" Apr 20 14:28:22.814032 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.813863 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gc5dt"] Apr 20 14:28:22.817004 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:28:22.816970 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod898273ba_9057_4ee0_9211_0db4c4234ca3.slice/crio-43a31f0714d8c8d33a27ffbdc9679c8d99e9d3c0aa0ce947cf55edf38d1e9421 WatchSource:0}: Error finding container 43a31f0714d8c8d33a27ffbdc9679c8d99e9d3c0aa0ce947cf55edf38d1e9421: Status 404 returned error can't find the container with id 43a31f0714d8c8d33a27ffbdc9679c8d99e9d3c0aa0ce947cf55edf38d1e9421 Apr 20 14:28:22.905644 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.905612 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpkpd"] Apr 20 14:28:22.908862 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:28:22.908834 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5c1dc75_f6fa_4a93_af4f_753f62471c29.slice/crio-cdbc67a8b95c67c2d9b1767f5c3af85cddf64b3313fb231b76061028e51339d8 WatchSource:0}: Error finding container cdbc67a8b95c67c2d9b1767f5c3af85cddf64b3313fb231b76061028e51339d8: Status 404 returned error can't find the container with id cdbc67a8b95c67c2d9b1767f5c3af85cddf64b3313fb231b76061028e51339d8 Apr 20 14:28:22.966480 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:22.966449 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-75587bd455-lcqfj\" (UID: \"a04aab45-434d-4c28-a9f8-631e889ece22\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" Apr 20 14:28:22.966628 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:22.966577 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 14:28:22.966668 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:22.966631 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls podName:a04aab45-434d-4c28-a9f8-631e889ece22 nodeName:}" failed. No retries permitted until 2026-04-20 14:28:23.966617038 +0000 UTC m=+90.686918448 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lcqfj" (UID: "a04aab45-434d-4c28-a9f8-631e889ece22") : secret "cluster-monitoring-operator-tls" not found Apr 20 14:28:23.066854 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:23.066772 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ecd715e3-8d05-475b-82b1-7d47843f2c8e-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bddmc\" (UID: \"ecd715e3-8d05-475b-82b1-7d47843f2c8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc" Apr 20 14:28:23.067044 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:23.066921 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 14:28:23.067044 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:23.066983 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecd715e3-8d05-475b-82b1-7d47843f2c8e-samples-operator-tls podName:ecd715e3-8d05-475b-82b1-7d47843f2c8e nodeName:}" failed. No retries permitted until 2026-04-20 14:28:24.066966558 +0000 UTC m=+90.787267981 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ecd715e3-8d05-475b-82b1-7d47843f2c8e-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bddmc" (UID: "ecd715e3-8d05-475b-82b1-7d47843f2c8e") : secret "samples-operator-tls" not found Apr 20 14:28:23.203474 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:23.203437 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpkpd" event={"ID":"f5c1dc75-f6fa-4a93-af4f-753f62471c29","Type":"ContainerStarted","Data":"cdbc67a8b95c67c2d9b1767f5c3af85cddf64b3313fb231b76061028e51339d8"} Apr 20 14:28:23.204356 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:23.204338 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjdjg" event={"ID":"7dfb4e40-e179-469e-adae-c6d6a0f119db","Type":"ContainerStarted","Data":"4012d88b752126022aea10e6af8a20449c97b42755a7c6479f5d3f6a79a75685"} Apr 20 14:28:23.205346 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:23.205319 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gc5dt" event={"ID":"898273ba-9057-4ee0-9211-0db4c4234ca3","Type":"ContainerStarted","Data":"43a31f0714d8c8d33a27ffbdc9679c8d99e9d3c0aa0ce947cf55edf38d1e9421"} Apr 20 14:28:23.973160 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:23.973076 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lcqfj\" (UID: \"a04aab45-434d-4c28-a9f8-631e889ece22\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" Apr 20 14:28:23.973634 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:23.973220 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 14:28:23.973634 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:23.973304 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls podName:a04aab45-434d-4c28-a9f8-631e889ece22 nodeName:}" failed. No retries permitted until 2026-04-20 14:28:25.973283189 +0000 UTC m=+92.693584611 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lcqfj" (UID: "a04aab45-434d-4c28-a9f8-631e889ece22") : secret "cluster-monitoring-operator-tls" not found Apr 20 14:28:24.074550 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:24.074488 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ecd715e3-8d05-475b-82b1-7d47843f2c8e-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bddmc\" (UID: \"ecd715e3-8d05-475b-82b1-7d47843f2c8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc" Apr 20 14:28:24.074737 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:24.074660 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 14:28:24.074737 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:24.074731 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecd715e3-8d05-475b-82b1-7d47843f2c8e-samples-operator-tls podName:ecd715e3-8d05-475b-82b1-7d47843f2c8e nodeName:}" failed. No retries permitted until 2026-04-20 14:28:26.074713011 +0000 UTC m=+92.795014427 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ecd715e3-8d05-475b-82b1-7d47843f2c8e-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bddmc" (UID: "ecd715e3-8d05-475b-82b1-7d47843f2c8e") : secret "samples-operator-tls" not found Apr 20 14:28:25.211193 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:25.211144 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjdjg" event={"ID":"7dfb4e40-e179-469e-adae-c6d6a0f119db","Type":"ContainerStarted","Data":"4428e045408c58cfbb073ab42fc063f53fefd216c1b050b0b2e8c0e530d10e10"} Apr 20 14:28:25.226181 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:25.226096 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-zjdjg" podStartSLOduration=1.552995763 podStartE2EDuration="3.226079906s" podCreationTimestamp="2026-04-20 14:28:22 +0000 UTC" firstStartedPulling="2026-04-20 14:28:22.719682133 +0000 UTC m=+89.439983544" lastFinishedPulling="2026-04-20 14:28:24.392766273 +0000 UTC m=+91.113067687" observedRunningTime="2026-04-20 14:28:25.225631604 +0000 UTC m=+91.945933036" watchObservedRunningTime="2026-04-20 14:28:25.226079906 +0000 UTC m=+91.946381337" Apr 20 14:28:25.989964 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:25.989922 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lcqfj\" (UID: \"a04aab45-434d-4c28-a9f8-631e889ece22\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" Apr 20 14:28:25.990141 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:25.990082 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 14:28:25.990185 ip-10-0-142-166 kubenswrapper[2580]: E0420 
14:28:25.990167 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls podName:a04aab45-434d-4c28-a9f8-631e889ece22 nodeName:}" failed. No retries permitted until 2026-04-20 14:28:29.990148538 +0000 UTC m=+96.710449959 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lcqfj" (UID: "a04aab45-434d-4c28-a9f8-631e889ece22") : secret "cluster-monitoring-operator-tls" not found Apr 20 14:28:26.091204 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:26.091161 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ecd715e3-8d05-475b-82b1-7d47843f2c8e-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bddmc\" (UID: \"ecd715e3-8d05-475b-82b1-7d47843f2c8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc" Apr 20 14:28:26.091367 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:26.091344 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 14:28:26.091445 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:26.091429 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecd715e3-8d05-475b-82b1-7d47843f2c8e-samples-operator-tls podName:ecd715e3-8d05-475b-82b1-7d47843f2c8e nodeName:}" failed. No retries permitted until 2026-04-20 14:28:30.091405899 +0000 UTC m=+96.811707314 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ecd715e3-8d05-475b-82b1-7d47843f2c8e-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bddmc" (UID: "ecd715e3-8d05-475b-82b1-7d47843f2c8e") : secret "samples-operator-tls" not found Apr 20 14:28:26.215244 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:26.215206 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gc5dt" event={"ID":"898273ba-9057-4ee0-9211-0db4c4234ca3","Type":"ContainerStarted","Data":"8e0082fb6aef80ecb897e97e2a7924c60148c7daa4a7e48d46cc50ead1ef4e32"} Apr 20 14:28:26.216543 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:26.216518 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpkpd" event={"ID":"f5c1dc75-f6fa-4a93-af4f-753f62471c29","Type":"ContainerStarted","Data":"56c634f0606f50e246392dc72e6a5d9ea2e4d22fab957c461e3ee7b9f86ef3de"} Apr 20 14:28:26.230218 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:26.230161 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gc5dt" podStartSLOduration=1.300654056 podStartE2EDuration="4.230143982s" podCreationTimestamp="2026-04-20 14:28:22 +0000 UTC" firstStartedPulling="2026-04-20 14:28:22.81894907 +0000 UTC m=+89.539250479" lastFinishedPulling="2026-04-20 14:28:25.748438981 +0000 UTC m=+92.468740405" observedRunningTime="2026-04-20 14:28:26.230112158 +0000 UTC m=+92.950413588" watchObservedRunningTime="2026-04-20 14:28:26.230143982 +0000 UTC m=+92.950445415" Apr 20 14:28:26.243550 
ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:26.243172 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpkpd" podStartSLOduration=1.405905014 podStartE2EDuration="4.243153148s" podCreationTimestamp="2026-04-20 14:28:22 +0000 UTC" firstStartedPulling="2026-04-20 14:28:22.91062689 +0000 UTC m=+89.630928301" lastFinishedPulling="2026-04-20 14:28:25.747875018 +0000 UTC m=+92.468176435" observedRunningTime="2026-04-20 14:28:26.242844327 +0000 UTC m=+92.963145761" watchObservedRunningTime="2026-04-20 14:28:26.243153148 +0000 UTC m=+92.963454581" Apr 20 14:28:27.253901 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:27.253865 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-cxhm7"] Apr 20 14:28:27.256406 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:27.256386 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-cxhm7" Apr 20 14:28:27.258529 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:27.258490 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-csc7s\"" Apr 20 14:28:27.264710 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:27.264676 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-cxhm7"] Apr 20 14:28:27.301326 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:27.301266 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjlmc\" (UniqueName: \"kubernetes.io/projected/cce09ab0-378b-4e48-ba91-0f2e06874097-kube-api-access-rjlmc\") pod \"network-check-source-8894fc9bd-cxhm7\" (UID: \"cce09ab0-378b-4e48-ba91-0f2e06874097\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-cxhm7" Apr 20 14:28:27.401741 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:27.401703 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjlmc\" (UniqueName: \"kubernetes.io/projected/cce09ab0-378b-4e48-ba91-0f2e06874097-kube-api-access-rjlmc\") pod \"network-check-source-8894fc9bd-cxhm7\" (UID: \"cce09ab0-378b-4e48-ba91-0f2e06874097\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-cxhm7" Apr 20 14:28:27.410174 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:27.410138 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjlmc\" (UniqueName: \"kubernetes.io/projected/cce09ab0-378b-4e48-ba91-0f2e06874097-kube-api-access-rjlmc\") pod \"network-check-source-8894fc9bd-cxhm7\" (UID: \"cce09ab0-378b-4e48-ba91-0f2e06874097\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-cxhm7" Apr 20 14:28:27.566566 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:27.566467 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-cxhm7" Apr 20 14:28:27.685062 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:27.685029 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-cxhm7"] Apr 20 14:28:27.688689 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:28:27.688659 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcce09ab0_378b_4e48_ba91_0f2e06874097.slice/crio-d3a2cf9e2a89849c76a61b9d7a3f0a27741d1bbb783856c54efecc56abee85f5 WatchSource:0}: Error finding container d3a2cf9e2a89849c76a61b9d7a3f0a27741d1bbb783856c54efecc56abee85f5: Status 404 returned error can't find the container with id d3a2cf9e2a89849c76a61b9d7a3f0a27741d1bbb783856c54efecc56abee85f5 Apr 20 14:28:28.223475 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:28.223437 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-cxhm7" event={"ID":"cce09ab0-378b-4e48-ba91-0f2e06874097","Type":"ContainerStarted","Data":"b9607aaa70388da9149074e81d4823d092e2b61f481de77bb04687a8d67adc04"} Apr 20 14:28:28.223475 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:28.223481 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-cxhm7" event={"ID":"cce09ab0-378b-4e48-ba91-0f2e06874097","Type":"ContainerStarted","Data":"d3a2cf9e2a89849c76a61b9d7a3f0a27741d1bbb783856c54efecc56abee85f5"} Apr 20 14:28:28.238137 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:28.238083 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-cxhm7" podStartSLOduration=1.238068951 podStartE2EDuration="1.238068951s" podCreationTimestamp="2026-04-20 14:28:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:28:28.236950471 +0000 UTC m=+94.957251903" watchObservedRunningTime="2026-04-20 14:28:28.238068951 +0000 UTC m=+94.958370380" Apr 20 14:28:30.024691 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:30.024652 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lcqfj\" (UID: \"a04aab45-434d-4c28-a9f8-631e889ece22\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" Apr 20 14:28:30.025101 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:30.024802 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 14:28:30.025101 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:30.024875 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls podName:a04aab45-434d-4c28-a9f8-631e889ece22 nodeName:}" failed. No retries permitted until 2026-04-20 14:28:38.02485949 +0000 UTC m=+104.745160900 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lcqfj" (UID: "a04aab45-434d-4c28-a9f8-631e889ece22") : secret "cluster-monitoring-operator-tls" not found Apr 20 14:28:30.125664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:30.125632 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ecd715e3-8d05-475b-82b1-7d47843f2c8e-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bddmc\" (UID: \"ecd715e3-8d05-475b-82b1-7d47843f2c8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc" Apr 20 14:28:30.125799 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:30.125778 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 14:28:30.125849 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:30.125840 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ecd715e3-8d05-475b-82b1-7d47843f2c8e-samples-operator-tls podName:ecd715e3-8d05-475b-82b1-7d47843f2c8e nodeName:}" failed. No retries permitted until 2026-04-20 14:28:38.125825202 +0000 UTC m=+104.846126612 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/ecd715e3-8d05-475b-82b1-7d47843f2c8e-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-bddmc" (UID: "ecd715e3-8d05-475b-82b1-7d47843f2c8e") : secret "samples-operator-tls" not found Apr 20 14:28:30.529014 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:30.528976 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:28:30.529223 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:30.529033 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls\") pod \"image-registry-556bcd8745-tx4hg\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:28:30.529223 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:30.529102 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert\") pod \"ingress-canary-8g5vx\" (UID: \"d6c411bc-80c8-4d9c-993c-cb6aeb232750\") " pod="openshift-ingress-canary/ingress-canary-8g5vx" Apr 20 14:28:30.529223 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:30.529138 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:28:30.529223 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:30.529196 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:28:30.529223 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:30.529200 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls 
podName:26882095-42a8-4889-9983-45d2dc2d0fc6 nodeName:}" failed. No retries permitted until 2026-04-20 14:29:34.529182279 +0000 UTC m=+161.249483709 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls") pod "dns-default-vlqz9" (UID: "26882095-42a8-4889-9983-45d2dc2d0fc6") : secret "dns-default-metrics-tls" not found Apr 20 14:28:30.529223 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:30.529226 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert podName:d6c411bc-80c8-4d9c-993c-cb6aeb232750 nodeName:}" failed. No retries permitted until 2026-04-20 14:29:34.529217518 +0000 UTC m=+161.249518933 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert") pod "ingress-canary-8g5vx" (UID: "d6c411bc-80c8-4d9c-993c-cb6aeb232750") : secret "canary-serving-cert" not found Apr 20 14:28:30.529469 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:30.529224 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:28:30.529469 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:30.529247 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-556bcd8745-tx4hg: secret "image-registry-tls" not found Apr 20 14:28:30.529469 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:30.529304 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls podName:150936cc-56bb-4f1a-9468-4a8527e8cec7 nodeName:}" failed. No retries permitted until 2026-04-20 14:29:34.529287521 +0000 UTC m=+161.249588944 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls") pod "image-registry-556bcd8745-tx4hg" (UID: "150936cc-56bb-4f1a-9468-4a8527e8cec7") : secret "image-registry-tls" not found Apr 20 14:28:30.724116 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:30.724093 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mw9b9_236542ea-fda7-4b96-ae9e-dd685e15e5ef/dns-node-resolver/0.log" Apr 20 14:28:31.323008 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:31.322981 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2rnmj_175a3efc-4aa3-4f7d-ac63-bb40b7cd457b/node-ca/0.log" Apr 20 14:28:33.126146 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:33.126109 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gc5dt_898273ba-9057-4ee0-9211-0db4c4234ca3/kube-storage-version-migrator-operator/0.log" Apr 20 14:28:34.168824 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:34.168796 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fgfxx" Apr 20 14:28:38.091954 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:38.091912 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lcqfj\" (UID: \"a04aab45-434d-4c28-a9f8-631e889ece22\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" Apr 20 14:28:38.092341 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:38.092063 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 14:28:38.092341 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:28:38.092155 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls podName:a04aab45-434d-4c28-a9f8-631e889ece22 nodeName:}" failed. No retries permitted until 2026-04-20 14:28:54.092136366 +0000 UTC m=+120.812437776 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lcqfj" (UID: "a04aab45-434d-4c28-a9f8-631e889ece22") : secret "cluster-monitoring-operator-tls" not found Apr 20 14:28:38.192792 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:38.192755 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ecd715e3-8d05-475b-82b1-7d47843f2c8e-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bddmc\" (UID: \"ecd715e3-8d05-475b-82b1-7d47843f2c8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc" Apr 20 14:28:38.195218 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:38.195191 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ecd715e3-8d05-475b-82b1-7d47843f2c8e-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-bddmc\" (UID: \"ecd715e3-8d05-475b-82b1-7d47843f2c8e\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc" Apr 20 14:28:38.304452 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:38.304418 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc" Apr 20 14:28:38.441364 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:38.441329 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc"] Apr 20 14:28:39.252890 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:39.252843 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc" event={"ID":"ecd715e3-8d05-475b-82b1-7d47843f2c8e","Type":"ContainerStarted","Data":"0e63e6b02866853248602d24e775ab61304e9c7df66d996ac9de0b5e3b164dd3"} Apr 20 14:28:41.258857 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:41.258794 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc" event={"ID":"ecd715e3-8d05-475b-82b1-7d47843f2c8e","Type":"ContainerStarted","Data":"1bf1369e0389df68740472930027123d8c8245acb1acbc52cc5b8ff003f91fdc"} Apr 20 14:28:41.258857 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:41.258859 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc" event={"ID":"ecd715e3-8d05-475b-82b1-7d47843f2c8e","Type":"ContainerStarted","Data":"94a8c145562fb6772ec3df99ea6c5bd2b9aed3262dd99f903337340d1bf6fedd"} Apr 20 14:28:41.278084 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:41.278035 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-bddmc" podStartSLOduration=17.272320935 podStartE2EDuration="19.278021726s" podCreationTimestamp="2026-04-20 14:28:22 +0000 UTC" firstStartedPulling="2026-04-20 14:28:38.485773269 +0000 UTC m=+105.206074679" lastFinishedPulling="2026-04-20 14:28:40.49147406 +0000 UTC m=+107.211775470" observedRunningTime="2026-04-20 14:28:41.277208538 +0000 UTC m=+107.997509969" watchObservedRunningTime="2026-04-20 14:28:41.278021726 +0000 UTC m=+107.998323223" Apr 20 14:28:51.675725 
ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.675693 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-q2tq5"] Apr 20 14:28:51.678247 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.678230 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-q2tq5" Apr 20 14:28:51.689126 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.689103 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 14:28:51.689966 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.689951 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 14:28:51.690133 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.690114 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rq5c2\"" Apr 20 14:28:51.690180 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.690158 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 14:28:51.690214 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.690191 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 14:28:51.723405 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.723372 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-q2tq5"] Apr 20 14:28:51.724182 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.724155 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-44kjm"] Apr 20 14:28:51.727183 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.727156 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-44kjm" Apr 20 14:28:51.730071 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.730042 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-bb8fg\"" Apr 20 14:28:51.730205 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.730097 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 20 14:28:51.730205 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.730142 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 20 14:28:51.738391 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.738367 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-44kjm"] Apr 20 14:28:51.797421 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.797373 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6e2b9449-8bdb-4411-bc74-b657c116838a-crio-socket\") pod \"insights-runtime-extractor-q2tq5\" (UID: \"6e2b9449-8bdb-4411-bc74-b657c116838a\") " pod="openshift-insights/insights-runtime-extractor-q2tq5" Apr 20 14:28:51.797647 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.797430 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvrd7\" (UniqueName: \"kubernetes.io/projected/6e2b9449-8bdb-4411-bc74-b657c116838a-kube-api-access-rvrd7\") pod \"insights-runtime-extractor-q2tq5\" (UID: \"6e2b9449-8bdb-4411-bc74-b657c116838a\") " pod="openshift-insights/insights-runtime-extractor-q2tq5" Apr 20 14:28:51.797647 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.797493 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e2b9449-8bdb-4411-bc74-b657c116838a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q2tq5\" (UID: \"6e2b9449-8bdb-4411-bc74-b657c116838a\") " pod="openshift-insights/insights-runtime-extractor-q2tq5" Apr 20 14:28:51.797647 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.797619 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6e2b9449-8bdb-4411-bc74-b657c116838a-data-volume\") pod \"insights-runtime-extractor-q2tq5\" (UID: \"6e2b9449-8bdb-4411-bc74-b657c116838a\") " pod="openshift-insights/insights-runtime-extractor-q2tq5" Apr 20 14:28:51.797862 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.797686 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6e2b9449-8bdb-4411-bc74-b657c116838a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q2tq5\" (UID: \"6e2b9449-8bdb-4411-bc74-b657c116838a\") " pod="openshift-insights/insights-runtime-extractor-q2tq5" Apr 20 14:28:51.899023 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.898981 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6e2b9449-8bdb-4411-bc74-b657c116838a-crio-socket\") pod \"insights-runtime-extractor-q2tq5\" 
(UID: \"6e2b9449-8bdb-4411-bc74-b657c116838a\") " pod="openshift-insights/insights-runtime-extractor-q2tq5" Apr 20 14:28:51.899023 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.899021 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/422e9153-0d1c-42db-a5e3-d97c30b99267-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-44kjm\" (UID: \"422e9153-0d1c-42db-a5e3-d97c30b99267\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-44kjm" Apr 20 14:28:51.899252 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.899040 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvrd7\" (UniqueName: \"kubernetes.io/projected/6e2b9449-8bdb-4411-bc74-b657c116838a-kube-api-access-rvrd7\") pod \"insights-runtime-extractor-q2tq5\" (UID: \"6e2b9449-8bdb-4411-bc74-b657c116838a\") " pod="openshift-insights/insights-runtime-extractor-q2tq5" Apr 20 14:28:51.899252 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.899061 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e2b9449-8bdb-4411-bc74-b657c116838a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q2tq5\" (UID: \"6e2b9449-8bdb-4411-bc74-b657c116838a\") " pod="openshift-insights/insights-runtime-extractor-q2tq5" Apr 20 14:28:51.899252 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.899118 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6e2b9449-8bdb-4411-bc74-b657c116838a-crio-socket\") pod \"insights-runtime-extractor-q2tq5\" (UID: \"6e2b9449-8bdb-4411-bc74-b657c116838a\") " pod="openshift-insights/insights-runtime-extractor-q2tq5" Apr 20 14:28:51.899376 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.899277 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6e2b9449-8bdb-4411-bc74-b657c116838a-data-volume\") pod \"insights-runtime-extractor-q2tq5\" (UID: \"6e2b9449-8bdb-4411-bc74-b657c116838a\") " pod="openshift-insights/insights-runtime-extractor-q2tq5" Apr 20 14:28:51.899376 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.899315 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/422e9153-0d1c-42db-a5e3-d97c30b99267-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-44kjm\" (UID: \"422e9153-0d1c-42db-a5e3-d97c30b99267\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-44kjm" Apr 20 14:28:51.899477 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.899404 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6e2b9449-8bdb-4411-bc74-b657c116838a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q2tq5\" (UID: \"6e2b9449-8bdb-4411-bc74-b657c116838a\") " pod="openshift-insights/insights-runtime-extractor-q2tq5" Apr 20 14:28:51.900291 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.900272 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6e2b9449-8bdb-4411-bc74-b657c116838a-data-volume\") pod \"insights-runtime-extractor-q2tq5\" (UID: 
\"6e2b9449-8bdb-4411-bc74-b657c116838a\") " pod="openshift-insights/insights-runtime-extractor-q2tq5" Apr 20 14:28:51.900473 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.900457 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6e2b9449-8bdb-4411-bc74-b657c116838a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-q2tq5\" (UID: \"6e2b9449-8bdb-4411-bc74-b657c116838a\") " pod="openshift-insights/insights-runtime-extractor-q2tq5" Apr 20 14:28:51.901538 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.901523 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e2b9449-8bdb-4411-bc74-b657c116838a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-q2tq5\" (UID: \"6e2b9449-8bdb-4411-bc74-b657c116838a\") " pod="openshift-insights/insights-runtime-extractor-q2tq5" Apr 20 14:28:51.907066 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.907041 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvrd7\" (UniqueName: \"kubernetes.io/projected/6e2b9449-8bdb-4411-bc74-b657c116838a-kube-api-access-rvrd7\") pod \"insights-runtime-extractor-q2tq5\" (UID: \"6e2b9449-8bdb-4411-bc74-b657c116838a\") " pod="openshift-insights/insights-runtime-extractor-q2tq5" Apr 20 14:28:51.987658 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:51.987565 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-q2tq5" Apr 20 14:28:52.000530 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:52.000299 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/422e9153-0d1c-42db-a5e3-d97c30b99267-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-44kjm\" (UID: \"422e9153-0d1c-42db-a5e3-d97c30b99267\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-44kjm" Apr 20 14:28:52.000530 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:52.000417 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/422e9153-0d1c-42db-a5e3-d97c30b99267-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-44kjm\" (UID: \"422e9153-0d1c-42db-a5e3-d97c30b99267\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-44kjm" Apr 20 14:28:52.001276 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:52.001252 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/422e9153-0d1c-42db-a5e3-d97c30b99267-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-44kjm\" (UID: \"422e9153-0d1c-42db-a5e3-d97c30b99267\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-44kjm" Apr 20 14:28:52.002859 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:52.002836 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/422e9153-0d1c-42db-a5e3-d97c30b99267-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-44kjm\" (UID: \"422e9153-0d1c-42db-a5e3-d97c30b99267\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-44kjm" Apr 20 14:28:52.039204 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:52.039171 2580 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-44kjm" Apr 20 14:28:52.135445 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:52.135416 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-q2tq5"] Apr 20 14:28:52.139115 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:28:52.139088 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e2b9449_8bdb_4411_bc74_b657c116838a.slice/crio-c64455116c21c2247d331398d17817eeeec91a1fdf6ea58e5fb2c8dff560439f WatchSource:0}: Error finding container c64455116c21c2247d331398d17817eeeec91a1fdf6ea58e5fb2c8dff560439f: Status 404 returned error can't find the container with id c64455116c21c2247d331398d17817eeeec91a1fdf6ea58e5fb2c8dff560439f Apr 20 14:28:52.179043 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:52.178935 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-44kjm"] Apr 20 14:28:52.184112 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:28:52.184078 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod422e9153_0d1c_42db_a5e3_d97c30b99267.slice/crio-b3866cf4ae6d55031fed774062ce13ad94571519a13deb9b2b295ae4706d5e4f WatchSource:0}: Error finding container b3866cf4ae6d55031fed774062ce13ad94571519a13deb9b2b295ae4706d5e4f: Status 404 returned error can't find the container with id b3866cf4ae6d55031fed774062ce13ad94571519a13deb9b2b295ae4706d5e4f Apr 20 14:28:52.286039 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:52.285942 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q2tq5" event={"ID":"6e2b9449-8bdb-4411-bc74-b657c116838a","Type":"ContainerStarted","Data":"ed21ccaa2937060ed6300a84f253af6ba5f9c3b2731ce8194bf38caaa138257d"} Apr 20 14:28:52.286039 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:52.285985 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q2tq5" event={"ID":"6e2b9449-8bdb-4411-bc74-b657c116838a","Type":"ContainerStarted","Data":"c64455116c21c2247d331398d17817eeeec91a1fdf6ea58e5fb2c8dff560439f"} Apr 20 14:28:52.286885 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:52.286860 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-44kjm" event={"ID":"422e9153-0d1c-42db-a5e3-d97c30b99267","Type":"ContainerStarted","Data":"b3866cf4ae6d55031fed774062ce13ad94571519a13deb9b2b295ae4706d5e4f"} Apr 20 14:28:53.291371 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:53.291327 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q2tq5" event={"ID":"6e2b9449-8bdb-4411-bc74-b657c116838a","Type":"ContainerStarted","Data":"6b3e4c30900ffee90b6b592555f35210e88864496487977b215e6bd7c0dfbf70"} Apr 20 14:28:54.117653 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:54.117606 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lcqfj\" (UID: \"a04aab45-434d-4c28-a9f8-631e889ece22\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" Apr 20 
14:28:54.120517 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:54.120463 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a04aab45-434d-4c28-a9f8-631e889ece22-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lcqfj\" (UID: \"a04aab45-434d-4c28-a9f8-631e889ece22\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" Apr 20 14:28:54.295575 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:54.295535 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-44kjm" event={"ID":"422e9153-0d1c-42db-a5e3-d97c30b99267","Type":"ContainerStarted","Data":"62fafd3bf7b19a16300f015c42ee13191df6d344509303dc085a9ecaf6489fb6"} Apr 20 14:28:54.313014 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:54.312964 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-44kjm" podStartSLOduration=2.124665123 podStartE2EDuration="3.312947381s" podCreationTimestamp="2026-04-20 14:28:51 +0000 UTC" firstStartedPulling="2026-04-20 14:28:52.185905335 +0000 UTC m=+118.906206746" lastFinishedPulling="2026-04-20 14:28:53.374187581 +0000 UTC m=+120.094489004" observedRunningTime="2026-04-20 14:28:54.312208935 +0000 UTC m=+121.032510368" watchObservedRunningTime="2026-04-20 14:28:54.312947381 +0000 UTC m=+121.033248812" Apr 20 14:28:54.397065 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:54.396963 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-2bfrt\"" Apr 20 14:28:54.404922 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:54.404881 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" Apr 20 14:28:54.605542 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:54.605489 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj"] Apr 20 14:28:54.608742 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:28:54.608715 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda04aab45_434d_4c28_a9f8_631e889ece22.slice/crio-7f834e6937214db0f99d94fb447160524029a0e006691b6942a92d2709e1f854 WatchSource:0}: Error finding container 7f834e6937214db0f99d94fb447160524029a0e006691b6942a92d2709e1f854: Status 404 returned error can't find the container with id 7f834e6937214db0f99d94fb447160524029a0e006691b6942a92d2709e1f854 Apr 20 14:28:55.299912 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:55.299869 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-q2tq5" event={"ID":"6e2b9449-8bdb-4411-bc74-b657c116838a","Type":"ContainerStarted","Data":"039d296f65f7d656b495f1766c93229fcb13f98327a167592bde50be67748b73"} Apr 20 14:28:55.300900 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:55.300874 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" event={"ID":"a04aab45-434d-4c28-a9f8-631e889ece22","Type":"ContainerStarted","Data":"7f834e6937214db0f99d94fb447160524029a0e006691b6942a92d2709e1f854"} Apr 20 14:28:55.318017 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:55.317965 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-q2tq5" podStartSLOduration=1.983381145 podStartE2EDuration="4.317948548s" podCreationTimestamp="2026-04-20 14:28:51 +0000 UTC" firstStartedPulling="2026-04-20 14:28:52.191861087 +0000 UTC m=+118.912162497" lastFinishedPulling="2026-04-20 14:28:54.526428489 +0000 UTC m=+121.246729900" observedRunningTime="2026-04-20 14:28:55.31665259 +0000 UTC m=+122.036954022" watchObservedRunningTime="2026-04-20 14:28:55.317948548 +0000 UTC m=+122.038249980" Apr 20 14:28:57.216331 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:57.216293 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2pt82"] Apr 20 14:28:57.218771 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:57.218753 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2pt82" Apr 20 14:28:57.221030 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:57.221006 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-x9bhl\"" Apr 20 14:28:57.221151 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:57.221007 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 20 14:28:57.226207 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:57.226184 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2pt82"] Apr 20 14:28:57.309562 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:57.308599 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" event={"ID":"a04aab45-434d-4c28-a9f8-631e889ece22","Type":"ContainerStarted","Data":"1d3b032b330579cc6275ffb4e06c6b5e8358dc30c47364f73248aa41405dcad7"} Apr 20 14:28:57.325954 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:57.325899 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lcqfj" podStartSLOduration=33.214644914 podStartE2EDuration="35.325883889s" podCreationTimestamp="2026-04-20 14:28:22 +0000 UTC" firstStartedPulling="2026-04-20 14:28:54.610734709 +0000 UTC m=+121.331036119" lastFinishedPulling="2026-04-20 14:28:56.721973681 +0000 UTC m=+123.442275094" observedRunningTime="2026-04-20 14:28:57.324993546 +0000 UTC m=+124.045294978" watchObservedRunningTime="2026-04-20 14:28:57.325883889 +0000 UTC m=+124.046185320" Apr 20 14:28:57.344265 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:57.344227 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/08e3cb9f-b15e-4569-a3b6-0c1123318898-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-2pt82\" (UID: \"08e3cb9f-b15e-4569-a3b6-0c1123318898\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2pt82" Apr 20 14:28:57.445793 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:57.445746 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/08e3cb9f-b15e-4569-a3b6-0c1123318898-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-2pt82\" (UID: \"08e3cb9f-b15e-4569-a3b6-0c1123318898\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2pt82" Apr 20 14:28:57.448184 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:57.448161 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/08e3cb9f-b15e-4569-a3b6-0c1123318898-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-2pt82\" (UID: \"08e3cb9f-b15e-4569-a3b6-0c1123318898\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2pt82" Apr 20 14:28:57.528774 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:57.528683 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2pt82" Apr 20 14:28:57.645862 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:57.645829 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2pt82"] Apr 20 14:28:57.648972 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:28:57.648944 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08e3cb9f_b15e_4569_a3b6_0c1123318898.slice/crio-4b5e18c428371c2bcedc13e38d1638fe08852da6860657c9bd2441e8a94997a1 WatchSource:0}: Error finding container 4b5e18c428371c2bcedc13e38d1638fe08852da6860657c9bd2441e8a94997a1: Status 404 returned error can't find the container with id 4b5e18c428371c2bcedc13e38d1638fe08852da6860657c9bd2441e8a94997a1 Apr 20 14:28:58.312780 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:58.312736 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2pt82" event={"ID":"08e3cb9f-b15e-4569-a3b6-0c1123318898","Type":"ContainerStarted","Data":"4b5e18c428371c2bcedc13e38d1638fe08852da6860657c9bd2441e8a94997a1"} Apr 20 14:28:59.316842 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:59.316753 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2pt82" event={"ID":"08e3cb9f-b15e-4569-a3b6-0c1123318898","Type":"ContainerStarted","Data":"8bc26e62449cb9fdb50b9f79044561508aa0657459333b9084f622d78348f69e"} Apr 20 14:28:59.317250 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:59.316987 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2pt82" Apr 20 14:28:59.321857 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:59.321826 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2pt82" Apr 20 14:28:59.331713 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:28:59.331666 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-2pt82" podStartSLOduration=0.96092778 podStartE2EDuration="2.331649847s" podCreationTimestamp="2026-04-20 14:28:57 +0000 UTC" firstStartedPulling="2026-04-20 14:28:57.650955625 +0000 UTC m=+124.371257039" lastFinishedPulling="2026-04-20 14:28:59.021677695 +0000 UTC m=+125.741979106" observedRunningTime="2026-04-20 14:28:59.33106788 +0000 UTC m=+126.051369312" watchObservedRunningTime="2026-04-20 14:28:59.331649847 +0000 UTC m=+126.051951278" Apr 20 14:29:00.280456 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.280418 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jl55v"] Apr 20 14:29:00.283073 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.283057 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jl55v" Apr 20 14:29:00.286234 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.286205 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 14:29:00.286400 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.286206 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 20 14:29:00.286400 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.286205 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 20 14:29:00.286400 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.286205 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-57gqg\"" Apr 20 14:29:00.291688 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.291667 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jl55v"] Apr 20 14:29:00.370422 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.370389 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5fcc0fbb-05e8-4606-85c3-84cc15be4e7c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jl55v\" (UID: \"5fcc0fbb-05e8-4606-85c3-84cc15be4e7c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jl55v" Apr 20 14:29:00.370422 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.370426 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5fcc0fbb-05e8-4606-85c3-84cc15be4e7c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jl55v\" (UID: \"5fcc0fbb-05e8-4606-85c3-84cc15be4e7c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jl55v" Apr 20 14:29:00.370957 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.370542 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qws6w\" (UniqueName: \"kubernetes.io/projected/5fcc0fbb-05e8-4606-85c3-84cc15be4e7c-kube-api-access-qws6w\") pod \"prometheus-operator-5676c8c784-jl55v\" (UID: \"5fcc0fbb-05e8-4606-85c3-84cc15be4e7c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jl55v" Apr 20 14:29:00.370957 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.370591 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5fcc0fbb-05e8-4606-85c3-84cc15be4e7c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jl55v\" (UID: \"5fcc0fbb-05e8-4606-85c3-84cc15be4e7c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jl55v" Apr 20 14:29:00.471440 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.471407 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5fcc0fbb-05e8-4606-85c3-84cc15be4e7c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jl55v\" (UID: \"5fcc0fbb-05e8-4606-85c3-84cc15be4e7c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jl55v" Apr 20 14:29:00.471619 
ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.471461 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5fcc0fbb-05e8-4606-85c3-84cc15be4e7c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jl55v\" (UID: \"5fcc0fbb-05e8-4606-85c3-84cc15be4e7c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jl55v" Apr 20 14:29:00.471665 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.471638 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5fcc0fbb-05e8-4606-85c3-84cc15be4e7c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jl55v\" (UID: \"5fcc0fbb-05e8-4606-85c3-84cc15be4e7c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jl55v" Apr 20 14:29:00.471737 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.471721 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qws6w\" (UniqueName: \"kubernetes.io/projected/5fcc0fbb-05e8-4606-85c3-84cc15be4e7c-kube-api-access-qws6w\") pod \"prometheus-operator-5676c8c784-jl55v\" (UID: \"5fcc0fbb-05e8-4606-85c3-84cc15be4e7c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jl55v" Apr 20 14:29:00.472209 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.472186 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5fcc0fbb-05e8-4606-85c3-84cc15be4e7c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jl55v\" (UID: \"5fcc0fbb-05e8-4606-85c3-84cc15be4e7c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jl55v" Apr 20 14:29:00.473848 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.473822 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/5fcc0fbb-05e8-4606-85c3-84cc15be4e7c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jl55v\" (UID: \"5fcc0fbb-05e8-4606-85c3-84cc15be4e7c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jl55v" Apr 20 14:29:00.473948 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.473861 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5fcc0fbb-05e8-4606-85c3-84cc15be4e7c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jl55v\" (UID: \"5fcc0fbb-05e8-4606-85c3-84cc15be4e7c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jl55v" Apr 20 14:29:00.479964 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.479944 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qws6w\" (UniqueName: \"kubernetes.io/projected/5fcc0fbb-05e8-4606-85c3-84cc15be4e7c-kube-api-access-qws6w\") pod \"prometheus-operator-5676c8c784-jl55v\" (UID: \"5fcc0fbb-05e8-4606-85c3-84cc15be4e7c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jl55v" Apr 20 14:29:00.592492 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.592458 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jl55v" Apr 20 14:29:00.708785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:00.708759 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jl55v"] Apr 20 14:29:00.711485 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:29:00.711447 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fcc0fbb_05e8_4606_85c3_84cc15be4e7c.slice/crio-ce07ef53ca93904e6090bb92140d38053130bc6ed941b87bac6ae56859acb445 WatchSource:0}: Error finding container ce07ef53ca93904e6090bb92140d38053130bc6ed941b87bac6ae56859acb445: Status 404 returned error can't find the container with id ce07ef53ca93904e6090bb92140d38053130bc6ed941b87bac6ae56859acb445 Apr 20 14:29:01.327375 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:01.327336 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jl55v" event={"ID":"5fcc0fbb-05e8-4606-85c3-84cc15be4e7c","Type":"ContainerStarted","Data":"ce07ef53ca93904e6090bb92140d38053130bc6ed941b87bac6ae56859acb445"} Apr 20 14:29:02.333226 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:02.333186 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jl55v" event={"ID":"5fcc0fbb-05e8-4606-85c3-84cc15be4e7c","Type":"ContainerStarted","Data":"5243b371b99feff5c0bcdd8242b83c0f57f95fde2d384b9484770f19ef72e5bf"} Apr 20 14:29:02.333226 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:02.333224 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jl55v" event={"ID":"5fcc0fbb-05e8-4606-85c3-84cc15be4e7c","Type":"ContainerStarted","Data":"f7e1c10c39ddb875874e2837d999bd72fd1ad9f0dea4352a1f63b9225469b153"} Apr 20 14:29:02.350648 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:02.350602 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-jl55v" podStartSLOduration=1.041927538 podStartE2EDuration="2.350588712s" podCreationTimestamp="2026-04-20 14:29:00 +0000 UTC" firstStartedPulling="2026-04-20 14:29:00.713775932 +0000 UTC m=+127.434077342" lastFinishedPulling="2026-04-20 14:29:02.022437103 +0000 UTC m=+128.742738516" observedRunningTime="2026-04-20 14:29:02.348736329 +0000 UTC m=+129.069037782" watchObservedRunningTime="2026-04-20 14:29:02.350588712 +0000 UTC m=+129.070890179" Apr 20 14:29:03.698099 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:03.698047 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs\") pod \"network-metrics-daemon-qsqks\" (UID: \"db79a290-5377-45f9-bb87-89588231d8a7\") " pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:29:03.700296 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:03.700273 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/db79a290-5377-45f9-bb87-89588231d8a7-metrics-certs\") pod \"network-metrics-daemon-qsqks\" (UID: \"db79a290-5377-45f9-bb87-89588231d8a7\") " pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:29:03.955469 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:03.955386 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xmk8q\"" Apr 20 14:29:03.963380 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:03.963363 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qsqks" Apr 20 14:29:04.077449 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.077418 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qsqks"] Apr 20 14:29:04.080488 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:29:04.080457 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb79a290_5377_45f9_bb87_89588231d8a7.slice/crio-ca2fb85978e3e63859f4c8a5fbe56d14b810539ff192d8294a5d94e8130b3db6 WatchSource:0}: Error finding container ca2fb85978e3e63859f4c8a5fbe56d14b810539ff192d8294a5d94e8130b3db6: Status 404 returned error can't find the container with id ca2fb85978e3e63859f4c8a5fbe56d14b810539ff192d8294a5d94e8130b3db6 Apr 20 14:29:04.339937 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.339885 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qsqks" event={"ID":"db79a290-5377-45f9-bb87-89588231d8a7","Type":"ContainerStarted","Data":"ca2fb85978e3e63859f4c8a5fbe56d14b810539ff192d8294a5d94e8130b3db6"} Apr 20 14:29:04.633964 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.633843 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7"] Apr 20 14:29:04.637008 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.636989 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7" Apr 20 14:29:04.639309 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.639278 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 20 14:29:04.639421 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.639403 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 20 14:29:04.639486 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.639405 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-824r6\"" Apr 20 14:29:04.646759 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.646738 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7"] Apr 20 14:29:04.651710 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.651691 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9fd44"] Apr 20 14:29:04.653913 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.653898 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44" Apr 20 14:29:04.656640 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.656614 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 20 14:29:04.656873 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.656852 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 20 14:29:04.656957 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.656886 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 20 14:29:04.657175 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.657151 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-w2z8s\"" Apr 20 14:29:04.669288 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.669258 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-v9pc4"] Apr 20 14:29:04.672654 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.672632 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9fd44"] Apr 20 14:29:04.672791 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.672772 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-v9pc4" Apr 20 14:29:04.674862 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.674827 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 14:29:04.675015 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.674889 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-vhd9x\"" Apr 20 14:29:04.675082 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.675067 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 14:29:04.675153 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.675131 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 14:29:04.707192 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.707157 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ccedf4b7-a764-4c69-ae30-16d122f1bddc-node-exporter-tls\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4" Apr 20 14:29:04.707192 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.707207 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8clsl\" (UniqueName: \"kubernetes.io/projected/ccedf4b7-a764-4c69-ae30-16d122f1bddc-kube-api-access-8clsl\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4" Apr 20 14:29:04.707731 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.707241 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/04817841-de4d-4aa5-b903-08642105cdfb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9fd44\" (UID: \"04817841-de4d-4aa5-b903-08642105cdfb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44" Apr 20 14:29:04.707731 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.707266 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ccedf4b7-a764-4c69-ae30-16d122f1bddc-node-exporter-wtmp\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4" Apr 20 14:29:04.707731 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.707315 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/66a3170e-bdf6-4b55-a50b-8c583852bce8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-jjdk7\" (UID: \"66a3170e-bdf6-4b55-a50b-8c583852bce8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7" Apr 20 14:29:04.707731 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.707402 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ccedf4b7-a764-4c69-ae30-16d122f1bddc-node-exporter-accelerators-collector-config\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4" Apr 20 14:29:04.707731 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.707431 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/66a3170e-bdf6-4b55-a50b-8c583852bce8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-jjdk7\" (UID: \"66a3170e-bdf6-4b55-a50b-8c583852bce8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7" Apr 20 14:29:04.707731 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.707462 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/04817841-de4d-4aa5-b903-08642105cdfb-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9fd44\" (UID: \"04817841-de4d-4aa5-b903-08642105cdfb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44" Apr 20 14:29:04.707731 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.707549 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ccedf4b7-a764-4c69-ae30-16d122f1bddc-sys\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4" Apr 20 14:29:04.707731 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.707583 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ccedf4b7-a764-4c69-ae30-16d122f1bddc-root\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4" Apr 20 14:29:04.707731 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.707606 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/04817841-de4d-4aa5-b903-08642105cdfb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9fd44\" (UID: \"04817841-de4d-4aa5-b903-08642105cdfb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44" Apr 20 14:29:04.707731 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.707631 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ccedf4b7-a764-4c69-ae30-16d122f1bddc-metrics-client-ca\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4" Apr 20 14:29:04.707731 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.707659 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04817841-de4d-4aa5-b903-08642105cdfb-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9fd44\" (UID: \"04817841-de4d-4aa5-b903-08642105cdfb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44" Apr 20 14:29:04.707731 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.707715 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/66a3170e-bdf6-4b55-a50b-8c583852bce8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-jjdk7\" (UID: \"66a3170e-bdf6-4b55-a50b-8c583852bce8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7" Apr 20 14:29:04.708176 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.707768 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ccedf4b7-a764-4c69-ae30-16d122f1bddc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4" Apr 20 14:29:04.708176 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.707800 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnzqs\" (UniqueName: \"kubernetes.io/projected/66a3170e-bdf6-4b55-a50b-8c583852bce8-kube-api-access-jnzqs\") pod \"openshift-state-metrics-9d44df66c-jjdk7\" (UID: \"66a3170e-bdf6-4b55-a50b-8c583852bce8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7" Apr 20 14:29:04.708176 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.707842 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ccedf4b7-a764-4c69-ae30-16d122f1bddc-node-exporter-textfile\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4" Apr 20 14:29:04.708176 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.707866 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddhk7\" (UniqueName: \"kubernetes.io/projected/04817841-de4d-4aa5-b903-08642105cdfb-kube-api-access-ddhk7\") pod \"kube-state-metrics-69db897b98-9fd44\" (UID: \"04817841-de4d-4aa5-b903-08642105cdfb\") 
" pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44" Apr 20 14:29:04.708176 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.707955 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04817841-de4d-4aa5-b903-08642105cdfb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9fd44\" (UID: \"04817841-de4d-4aa5-b903-08642105cdfb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44" Apr 20 14:29:04.808637 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.808598 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ccedf4b7-a764-4c69-ae30-16d122f1bddc-node-exporter-accelerators-collector-config\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4" Apr 20 14:29:04.808637 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.808640 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/66a3170e-bdf6-4b55-a50b-8c583852bce8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-jjdk7\" (UID: \"66a3170e-bdf6-4b55-a50b-8c583852bce8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7" Apr 20 14:29:04.808939 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.808659 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/04817841-de4d-4aa5-b903-08642105cdfb-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9fd44\" (UID: \"04817841-de4d-4aa5-b903-08642105cdfb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44" Apr 20 14:29:04.808939 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.808701 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ccedf4b7-a764-4c69-ae30-16d122f1bddc-sys\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4" Apr 20 14:29:04.808939 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.808725 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ccedf4b7-a764-4c69-ae30-16d122f1bddc-root\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4" Apr 20 14:29:04.808939 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.808752 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/04817841-de4d-4aa5-b903-08642105cdfb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9fd44\" (UID: \"04817841-de4d-4aa5-b903-08642105cdfb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44" Apr 20 14:29:04.808939 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.808783 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ccedf4b7-a764-4c69-ae30-16d122f1bddc-metrics-client-ca\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4" Apr 20 
14:29:04.808939 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.808814 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04817841-de4d-4aa5-b903-08642105cdfb-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9fd44\" (UID: \"04817841-de4d-4aa5-b903-08642105cdfb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44"
Apr 20 14:29:04.808939 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.808837 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ccedf4b7-a764-4c69-ae30-16d122f1bddc-sys\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4"
Apr 20 14:29:04.808939 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.808847 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/66a3170e-bdf6-4b55-a50b-8c583852bce8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-jjdk7\" (UID: \"66a3170e-bdf6-4b55-a50b-8c583852bce8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7"
Apr 20 14:29:04.808939 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.808907 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ccedf4b7-a764-4c69-ae30-16d122f1bddc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4"
Apr 20 14:29:04.808939 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.808937 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnzqs\" (UniqueName: \"kubernetes.io/projected/66a3170e-bdf6-4b55-a50b-8c583852bce8-kube-api-access-jnzqs\") pod \"openshift-state-metrics-9d44df66c-jjdk7\" (UID: \"66a3170e-bdf6-4b55-a50b-8c583852bce8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7"
Apr 20 14:29:04.809936 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.808968 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ccedf4b7-a764-4c69-ae30-16d122f1bddc-node-exporter-textfile\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4"
Apr 20 14:29:04.809936 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.808994 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddhk7\" (UniqueName: \"kubernetes.io/projected/04817841-de4d-4aa5-b903-08642105cdfb-kube-api-access-ddhk7\") pod \"kube-state-metrics-69db897b98-9fd44\" (UID: \"04817841-de4d-4aa5-b903-08642105cdfb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44"
Apr 20 14:29:04.809936 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.809037 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04817841-de4d-4aa5-b903-08642105cdfb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9fd44\" (UID: \"04817841-de4d-4aa5-b903-08642105cdfb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44"
Apr 20 14:29:04.809936 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.809063 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ccedf4b7-a764-4c69-ae30-16d122f1bddc-node-exporter-tls\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4"
Apr 20 14:29:04.809936 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.809104 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8clsl\" (UniqueName: \"kubernetes.io/projected/ccedf4b7-a764-4c69-ae30-16d122f1bddc-kube-api-access-8clsl\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4"
Apr 20 14:29:04.809936 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.809134 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/04817841-de4d-4aa5-b903-08642105cdfb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9fd44\" (UID: \"04817841-de4d-4aa5-b903-08642105cdfb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44"
Apr 20 14:29:04.809936 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.809179 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ccedf4b7-a764-4c69-ae30-16d122f1bddc-node-exporter-wtmp\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4"
Apr 20 14:29:04.809936 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.809221 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/66a3170e-bdf6-4b55-a50b-8c583852bce8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-jjdk7\" (UID: \"66a3170e-bdf6-4b55-a50b-8c583852bce8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7"
Apr 20 14:29:04.809936 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.809460 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/66a3170e-bdf6-4b55-a50b-8c583852bce8-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-jjdk7\" (UID: \"66a3170e-bdf6-4b55-a50b-8c583852bce8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7"
Apr 20 14:29:04.809936 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.809561 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ccedf4b7-a764-4c69-ae30-16d122f1bddc-root\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4"
Apr 20 14:29:04.810424 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.809946 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04817841-de4d-4aa5-b903-08642105cdfb-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-9fd44\" (UID: \"04817841-de4d-4aa5-b903-08642105cdfb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44"
Apr 20 14:29:04.810424 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.810262 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ccedf4b7-a764-4c69-ae30-16d122f1bddc-metrics-client-ca\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4"
Apr 20 14:29:04.810596 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.810570 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ccedf4b7-a764-4c69-ae30-16d122f1bddc-node-exporter-wtmp\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4"
Apr 20 14:29:04.810664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.810598 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ccedf4b7-a764-4c69-ae30-16d122f1bddc-node-exporter-textfile\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4"
Apr 20 14:29:04.810720 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.810687 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/04817841-de4d-4aa5-b903-08642105cdfb-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-9fd44\" (UID: \"04817841-de4d-4aa5-b903-08642105cdfb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44"
Apr 20 14:29:04.810950 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.810902 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ccedf4b7-a764-4c69-ae30-16d122f1bddc-node-exporter-accelerators-collector-config\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4"
Apr 20 14:29:04.811104 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.811079 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/04817841-de4d-4aa5-b903-08642105cdfb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-9fd44\" (UID: \"04817841-de4d-4aa5-b903-08642105cdfb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44"
Apr 20 14:29:04.812590 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.812549 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ccedf4b7-a764-4c69-ae30-16d122f1bddc-node-exporter-tls\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4"
Apr 20 14:29:04.812918 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.812876 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04817841-de4d-4aa5-b903-08642105cdfb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-9fd44\" (UID: \"04817841-de4d-4aa5-b903-08642105cdfb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44"
Apr 20 14:29:04.813096 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.813077 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/66a3170e-bdf6-4b55-a50b-8c583852bce8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-jjdk7\" (UID: \"66a3170e-bdf6-4b55-a50b-8c583852bce8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7"
Apr 20 14:29:04.813160 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.813143 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/66a3170e-bdf6-4b55-a50b-8c583852bce8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-jjdk7\" (UID: \"66a3170e-bdf6-4b55-a50b-8c583852bce8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7"
Apr 20 14:29:04.813437 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.813409 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ccedf4b7-a764-4c69-ae30-16d122f1bddc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4"
Apr 20 14:29:04.813668 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.813652 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/04817841-de4d-4aa5-b903-08642105cdfb-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-9fd44\" (UID: \"04817841-de4d-4aa5-b903-08642105cdfb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44"
Apr 20 14:29:04.817424 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.817399 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8clsl\" (UniqueName: \"kubernetes.io/projected/ccedf4b7-a764-4c69-ae30-16d122f1bddc-kube-api-access-8clsl\") pod \"node-exporter-v9pc4\" (UID: \"ccedf4b7-a764-4c69-ae30-16d122f1bddc\") " pod="openshift-monitoring/node-exporter-v9pc4"
Apr 20 14:29:04.818132 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.818111 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddhk7\" (UniqueName: \"kubernetes.io/projected/04817841-de4d-4aa5-b903-08642105cdfb-kube-api-access-ddhk7\") pod \"kube-state-metrics-69db897b98-9fd44\" (UID: \"04817841-de4d-4aa5-b903-08642105cdfb\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44"
Apr 20 14:29:04.818930 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.818904 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnzqs\" (UniqueName: \"kubernetes.io/projected/66a3170e-bdf6-4b55-a50b-8c583852bce8-kube-api-access-jnzqs\") pod \"openshift-state-metrics-9d44df66c-jjdk7\" (UID: \"66a3170e-bdf6-4b55-a50b-8c583852bce8\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7"
Apr 20 14:29:04.947219 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.947148 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7"
Apr 20 14:29:04.969929 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.969895 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44"
Apr 20 14:29:04.984697 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:04.984665 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-v9pc4"
Apr 20 14:29:05.209983 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:29:05.209939 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccedf4b7_a764_4c69_ae30_16d122f1bddc.slice/crio-bd09fd1401223e3599f7528ab3000d958c8ebf6e71ce40d69231ea64f5dca75b WatchSource:0}: Error finding container bd09fd1401223e3599f7528ab3000d958c8ebf6e71ce40d69231ea64f5dca75b: Status 404 returned error can't find the container with id bd09fd1401223e3599f7528ab3000d958c8ebf6e71ce40d69231ea64f5dca75b
Apr 20 14:29:05.333261 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.333229 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7"]
Apr 20 14:29:05.337072 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:29:05.337036 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66a3170e_bdf6_4b55_a50b_8c583852bce8.slice/crio-aac908ec6e56155c16df20cb0142ddc94517348567f3c18f01b52d5a5cc8ee4c WatchSource:0}: Error finding container aac908ec6e56155c16df20cb0142ddc94517348567f3c18f01b52d5a5cc8ee4c: Status 404 returned error can't find the container with id aac908ec6e56155c16df20cb0142ddc94517348567f3c18f01b52d5a5cc8ee4c
Apr 20 14:29:05.344573 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.344538 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v9pc4" event={"ID":"ccedf4b7-a764-4c69-ae30-16d122f1bddc","Type":"ContainerStarted","Data":"bd09fd1401223e3599f7528ab3000d958c8ebf6e71ce40d69231ea64f5dca75b"}
Apr 20 14:29:05.345679 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.345649 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7" event={"ID":"66a3170e-bdf6-4b55-a50b-8c583852bce8","Type":"ContainerStarted","Data":"aac908ec6e56155c16df20cb0142ddc94517348567f3c18f01b52d5a5cc8ee4c"}
Apr 20 14:29:05.357688 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.357654 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-9fd44"]
Apr 20 14:29:05.360977 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:29:05.360947 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04817841_de4d_4aa5_b903_08642105cdfb.slice/crio-09f1bcec3d46f9a2ec312bfd1fc7dd53e29ea2a622d6e2f8a65a0795db62e365 WatchSource:0}: Error finding container 09f1bcec3d46f9a2ec312bfd1fc7dd53e29ea2a622d6e2f8a65a0795db62e365: Status 404 returned error can't find the container with id 09f1bcec3d46f9a2ec312bfd1fc7dd53e29ea2a622d6e2f8a65a0795db62e365
Apr 20 14:29:05.717963 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.717855 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 14:29:05.721459 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.721437 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.723753 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.723591 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 20 14:29:05.723753 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.723640 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 20 14:29:05.723753 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.723646 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 20 14:29:05.723753 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.723678 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-xrsmc\""
Apr 20 14:29:05.724035 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.723933 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 20 14:29:05.724035 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.724008 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 20 14:29:05.724035 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.724022 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 20 14:29:05.724184 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.724152 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 20 14:29:05.724444 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.724398 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 20 14:29:05.724444 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.724440 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 20 14:29:05.735983 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.735957 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 14:29:05.819793 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.819761 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8b6kr\" (UniqueName: \"kubernetes.io/projected/c02b03da-cf73-405d-a560-0e05db9a7dfc-kube-api-access-8b6kr\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.819974 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.819807 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-web-config\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.819974 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.819837 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.819974 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.819919 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-config-volume\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.819974 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.819960 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c02b03da-cf73-405d-a560-0e05db9a7dfc-config-out\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.820145 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.820005 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.820145 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.820032 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c02b03da-cf73-405d-a560-0e05db9a7dfc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.820145 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.820081 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.820145 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.820119 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.820323 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.820251 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c02b03da-cf73-405d-a560-0e05db9a7dfc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.820323 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.820278 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c02b03da-cf73-405d-a560-0e05db9a7dfc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.820416 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.820345 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c02b03da-cf73-405d-a560-0e05db9a7dfc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.820416 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.820402 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.921381 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.921317 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c02b03da-cf73-405d-a560-0e05db9a7dfc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.921381 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.921387 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.921670 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.921420 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8b6kr\" (UniqueName: \"kubernetes.io/projected/c02b03da-cf73-405d-a560-0e05db9a7dfc-kube-api-access-8b6kr\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.921670 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.921454 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-web-config\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.921670 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.921480 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.921670 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.921532 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-config-volume\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.921670 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.921555 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c02b03da-cf73-405d-a560-0e05db9a7dfc-config-out\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.921670 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.921587 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.921670 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.921611 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c02b03da-cf73-405d-a560-0e05db9a7dfc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.922008 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.921814 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.922008 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.921836 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c02b03da-cf73-405d-a560-0e05db9a7dfc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.922008 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.921888 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.922008 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.921926 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c02b03da-cf73-405d-a560-0e05db9a7dfc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.922196 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.922060 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c02b03da-cf73-405d-a560-0e05db9a7dfc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.922248 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:29:05.922219 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c02b03da-cf73-405d-a560-0e05db9a7dfc-alertmanager-trusted-ca-bundle podName:c02b03da-cf73-405d-a560-0e05db9a7dfc nodeName:}" failed. No retries permitted until 2026-04-20 14:29:06.422197051 +0000 UTC m=+133.142498463 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/c02b03da-cf73-405d-a560-0e05db9a7dfc-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "c02b03da-cf73-405d-a560-0e05db9a7dfc") : configmap references non-existent config key: ca-bundle.crt
Apr 20 14:29:05.923327 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.923218 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c02b03da-cf73-405d-a560-0e05db9a7dfc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.925221 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.925156 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-web-config\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.926242 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.926069 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.926242 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.926203 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c02b03da-cf73-405d-a560-0e05db9a7dfc-config-out\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.928843 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.928799 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-config-volume\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.928843 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.928809 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.929402 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.929034 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.930625 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.930027 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.931300 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.931252 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.932009 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.931986 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c02b03da-cf73-405d-a560-0e05db9a7dfc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:05.932225 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:05.932200 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8b6kr\" (UniqueName: \"kubernetes.io/projected/c02b03da-cf73-405d-a560-0e05db9a7dfc-kube-api-access-8b6kr\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:06.351877 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:06.351776 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qsqks" event={"ID":"db79a290-5377-45f9-bb87-89588231d8a7","Type":"ContainerStarted","Data":"dd40e8a7e8049f7b58c6fac1d88774f34d8dc23633995f5445be88a3497af310"}
Apr 20 14:29:06.351877 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:06.351832 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qsqks" event={"ID":"db79a290-5377-45f9-bb87-89588231d8a7","Type":"ContainerStarted","Data":"e565d777a03f9843329d5f28ebe47829fe6f646e2ad2ee2bac7e3f2bdc22fdfa"}
Apr 20 14:29:06.354056 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:06.354022 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44" event={"ID":"04817841-de4d-4aa5-b903-08642105cdfb","Type":"ContainerStarted","Data":"09f1bcec3d46f9a2ec312bfd1fc7dd53e29ea2a622d6e2f8a65a0795db62e365"}
Apr 20 14:29:06.355811 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:06.355686 2580 generic.go:358] "Generic (PLEG): container finished" podID="ccedf4b7-a764-4c69-ae30-16d122f1bddc" containerID="8d7af6fbfb442f34bd853d9bd8dcd916b1bdfd127dfd54e1615dd335c26657c6" exitCode=0
Apr 20 14:29:06.355811 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:06.355773 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v9pc4" event={"ID":"ccedf4b7-a764-4c69-ae30-16d122f1bddc","Type":"ContainerDied","Data":"8d7af6fbfb442f34bd853d9bd8dcd916b1bdfd127dfd54e1615dd335c26657c6"}
Apr 20 14:29:06.357888 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:06.357855 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7" event={"ID":"66a3170e-bdf6-4b55-a50b-8c583852bce8","Type":"ContainerStarted","Data":"6b0a59fae17a64553770ea8767ff56a085d4e5a5612bb669f81f650ebb8f0a4c"}
Apr 20 14:29:06.357888 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:06.357887 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7" event={"ID":"66a3170e-bdf6-4b55-a50b-8c583852bce8","Type":"ContainerStarted","Data":"c5ceaad2ac0d99048a8203dda223f11a1b4fdd4536f684cd787f7479c8b2bc19"}
Apr 20 14:29:06.398315 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:06.398239 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qsqks" podStartSLOduration=131.236366466 podStartE2EDuration="2m12.398223533s" podCreationTimestamp="2026-04-20 14:26:54 +0000 UTC" firstStartedPulling="2026-04-20 14:29:04.082424592 +0000 UTC m=+130.802726002" lastFinishedPulling="2026-04-20 14:29:05.244281653 +0000 UTC m=+131.964583069" observedRunningTime="2026-04-20 14:29:06.372922701 +0000 UTC m=+133.093224144" watchObservedRunningTime="2026-04-20 14:29:06.398223533 +0000 UTC m=+133.118524964"
Apr 20 14:29:06.429277 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:06.429145 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c02b03da-cf73-405d-a560-0e05db9a7dfc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:06.430472 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:06.430427 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c02b03da-cf73-405d-a560-0e05db9a7dfc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:06.634160 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:06.634065 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:29:07.192475 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:07.192411 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 20 14:29:07.195164 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:29:07.195132 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc02b03da_cf73_405d_a560_0e05db9a7dfc.slice/crio-4323b30d8f2600223fecc2a209e25570c5058da07c513de640c317c7925015a3 WatchSource:0}: Error finding container 4323b30d8f2600223fecc2a209e25570c5058da07c513de640c317c7925015a3: Status 404 returned error can't find the container with id 4323b30d8f2600223fecc2a209e25570c5058da07c513de640c317c7925015a3
Apr 20 14:29:07.363876 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:07.363818 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44" event={"ID":"04817841-de4d-4aa5-b903-08642105cdfb","Type":"ContainerStarted","Data":"1e5784bd5fc1605da6367ae2cc0c478e80a14f0ebf424902187c208f7b087c71"}
Apr 20 14:29:07.363876 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:07.363869 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44" event={"ID":"04817841-de4d-4aa5-b903-08642105cdfb","Type":"ContainerStarted","Data":"d28b88573e0ecafd0a34fa343bb809acbeace77f83350e70025d26d66fd5424c"}
Apr 20 14:29:07.363876 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:07.363885 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44" event={"ID":"04817841-de4d-4aa5-b903-08642105cdfb","Type":"ContainerStarted","Data":"7416acc3cf86224241178698291cc79aa90297a4bdc82008a34a83bcab693ab5"}
Apr 20 14:29:07.366227 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:07.366201 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v9pc4" event={"ID":"ccedf4b7-a764-4c69-ae30-16d122f1bddc","Type":"ContainerStarted","Data":"26e54c2a0908071ed6317993d607f2546b9e196f19b124f5d08635d654de5553"}
Apr 20 14:29:07.366227 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:07.366234 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-v9pc4" event={"ID":"ccedf4b7-a764-4c69-ae30-16d122f1bddc","Type":"ContainerStarted","Data":"fdb14894bb4c356876b128b5e2df588c9035388a49aca890502303e7f4aef384"}
Apr 20 14:29:07.368152 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:07.368101 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7" event={"ID":"66a3170e-bdf6-4b55-a50b-8c583852bce8","Type":"ContainerStarted","Data":"d3c66010b90f8112005f753c7937c399fff1308c5c59d7820ca088d57f311a4e"}
Apr 20 14:29:07.369391 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:07.369370 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c02b03da-cf73-405d-a560-0e05db9a7dfc","Type":"ContainerStarted","Data":"4323b30d8f2600223fecc2a209e25570c5058da07c513de640c317c7925015a3"}
Apr 20 14:29:07.389406 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:07.389327 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-9fd44" podStartSLOduration=1.7072276560000001 podStartE2EDuration="3.389306786s" podCreationTimestamp="2026-04-20 14:29:04 +0000 UTC" firstStartedPulling="2026-04-20 14:29:05.363159154 +0000 UTC m=+132.083460564" lastFinishedPulling="2026-04-20 14:29:07.045238279 +0000 UTC m=+133.765539694" observedRunningTime="2026-04-20 14:29:07.387811732 +0000 UTC m=+134.108113164" watchObservedRunningTime="2026-04-20 14:29:07.389306786 +0000 UTC m=+134.109608219"
Apr 20 14:29:07.417369 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:07.417302 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-v9pc4" podStartSLOduration=2.57349832 podStartE2EDuration="3.417284717s" podCreationTimestamp="2026-04-20 14:29:04 +0000 UTC" firstStartedPulling="2026-04-20 14:29:05.236842302 +0000 UTC m=+131.957143726" lastFinishedPulling="2026-04-20 14:29:06.080628708 +0000 UTC m=+132.800930123" observedRunningTime="2026-04-20 14:29:07.41664448 +0000 UTC m=+134.136945914" watchObservedRunningTime="2026-04-20 14:29:07.417284717 +0000 UTC m=+134.137586147"
Apr 20 14:29:07.447563 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:07.447465 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-jjdk7" podStartSLOduration=1.881131813 podStartE2EDuration="3.447447458s" podCreationTimestamp="2026-04-20 14:29:04 +0000 UTC" firstStartedPulling="2026-04-20 14:29:05.476741321 +0000 UTC m=+132.197042745" lastFinishedPulling="2026-04-20 14:29:07.043056977 +0000 UTC m=+133.763358390" observedRunningTime="2026-04-20 14:29:07.444904229 +0000 UTC m=+134.165205661" watchObservedRunningTime="2026-04-20 14:29:07.447447458 +0000 UTC m=+134.167748891"
Apr 20 14:29:09.378707 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:09.378664 2580 generic.go:358] "Generic (PLEG): container finished" podID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerID="032c9c880f135107f9029eaa583cff941f75773ab920150cdc393cf9cd82d6c3" exitCode=0
Apr 20 14:29:09.379199 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:09.378750 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c02b03da-cf73-405d-a560-0e05db9a7dfc","Type":"ContainerDied","Data":"032c9c880f135107f9029eaa583cff941f75773ab920150cdc393cf9cd82d6c3"}
Apr 20 14:29:11.063774 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.063742 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 14:29:11.075961 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.075933 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.080426 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.080392 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 20 14:29:11.080593 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.080437 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 20 14:29:11.080768 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.080691 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 20 14:29:11.081408 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.081388 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 20 14:29:11.082004 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.081977 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 20 14:29:11.088259 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.088036 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 20 14:29:11.088259 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.088076 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 20 14:29:11.088259 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.088098 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-rv2f2\""
Apr 20 14:29:11.088259 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.088154 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 20 14:29:11.088259 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.088178 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 20 14:29:11.088259 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.088099 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-cid9rocamv4tr\""
Apr 20 14:29:11.088259 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.088245 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 20 14:29:11.088259 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.088248 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 20 14:29:11.088782 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.088521 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 20 14:29:11.089438 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.089422 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 20 14:29:11.105012 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.104979 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 14:29:11.174026 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.173992 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.174026 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.174028 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.174244 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.174074 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b49ed096-fc59-422a-993a-259f96d09946-config-out\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.174244 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.174101 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.174244 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.174211 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-web-config\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.174338 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.174250 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.174338 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.174279 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-config\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.174338 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.174318 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.174424 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.174359 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b49ed096-fc59-422a-993a-259f96d09946-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.174424 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.174386 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.174424 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.174418 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh28w\" (UniqueName: \"kubernetes.io/projected/b49ed096-fc59-422a-993a-259f96d09946-kube-api-access-fh28w\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.174529 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.174472 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b49ed096-fc59-422a-993a-259f96d09946-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.174572 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.174522 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.174572 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.174549 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.174572 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.174565 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.174657 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.174605 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.174657 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.174633 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.174721 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.174659 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.277708 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.275678 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b49ed096-fc59-422a-993a-259f96d09946-config-out\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.277708 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.275732 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.277708 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.275765 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-web-config\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.277708 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.275789 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.277708 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.275813 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-config\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.277708 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.275848 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.277708 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.275886 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b49ed096-fc59-422a-993a-259f96d09946-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.277708 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.275909 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.277708 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.275939 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fh28w\" (UniqueName: \"kubernetes.io/projected/b49ed096-fc59-422a-993a-259f96d09946-kube-api-access-fh28w\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.277708 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.275977 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b49ed096-fc59-422a-993a-259f96d09946-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.277708 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.276004 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.277708 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.276031 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.277708 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.276057 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.277708 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.276101 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.277708 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.276127 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.277708 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.276165 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.277708 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.276236 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.278630 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.276261 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.278630 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.277137 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.279487 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.279161 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.281842 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.281098 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.281842 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.281154 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b49ed096-fc59-422a-993a-259f96d09946-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.281842 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.281352 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.281842 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.281376 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b49ed096-fc59-422a-993a-259f96d09946-config-out\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.281842 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.281589 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-config\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.281842 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.281784 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.282283 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.282251 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.282347 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.282299 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.283059 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.282414 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.283059 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.282956 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-web-config\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.283059 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.283007 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.283732 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.283708 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b49ed096-fc59-422a-993a-259f96d09946-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:29:11.284179 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.284144 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") "
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:29:11.284266 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.284239 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:29:11.284890 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.284868 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:29:11.293110 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.293076 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh28w\" (UniqueName: \"kubernetes.io/projected/b49ed096-fc59-422a-993a-259f96d09946-kube-api-access-fh28w\") pod \"prometheus-k8s-0\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:29:11.402100 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.401996 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c02b03da-cf73-405d-a560-0e05db9a7dfc","Type":"ContainerStarted","Data":"de27fd50a9cb93471a16121102e2d4d06307d2831ebde12d89dd6f3f2c4287ed"} Apr 20 14:29:11.402100 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.402039 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c02b03da-cf73-405d-a560-0e05db9a7dfc","Type":"ContainerStarted","Data":"fa0c529fd947ed0cfbaecefa03f47f69ae76432fdc1569aa9f2804b46b73b019"} Apr 20 14:29:11.402100 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.402052 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c02b03da-cf73-405d-a560-0e05db9a7dfc","Type":"ContainerStarted","Data":"4d89303ae5cf9e9606281148d6051ca34495a0ff982792a0269acc9acfca3050"} Apr 20 14:29:11.402100 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.402065 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c02b03da-cf73-405d-a560-0e05db9a7dfc","Type":"ContainerStarted","Data":"a5703c8a46e1c9107c318b504487028469e72f259421f84736324cb020d74a85"} Apr 20 14:29:11.405418 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.404972 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:29:11.558453 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:11.558421 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 14:29:11.561870 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:29:11.561827 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb49ed096_fc59_422a_993a_259f96d09946.slice/crio-254f38514cf9c277d7addb40a9fff1779ce6dd29b9af80a303c00677aa2c95e3 WatchSource:0}: Error finding container 254f38514cf9c277d7addb40a9fff1779ce6dd29b9af80a303c00677aa2c95e3: Status 404 returned error can't find the container with id 254f38514cf9c277d7addb40a9fff1779ce6dd29b9af80a303c00677aa2c95e3 Apr 20 14:29:12.406715 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:12.406682 2580 generic.go:358] "Generic (PLEG): container finished" podID="b49ed096-fc59-422a-993a-259f96d09946" containerID="b46689b54626202931fd89ae0bb8df1370910d883efa83a4126c789d594bb32d" exitCode=0 Apr 20 14:29:12.407128 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:12.406774 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b49ed096-fc59-422a-993a-259f96d09946","Type":"ContainerDied","Data":"b46689b54626202931fd89ae0bb8df1370910d883efa83a4126c789d594bb32d"} Apr 20 14:29:12.407128 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:12.406823 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b49ed096-fc59-422a-993a-259f96d09946","Type":"ContainerStarted","Data":"254f38514cf9c277d7addb40a9fff1779ce6dd29b9af80a303c00677aa2c95e3"} Apr 20 14:29:12.410097 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:12.410073 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c02b03da-cf73-405d-a560-0e05db9a7dfc","Type":"ContainerStarted","Data":"7fe363164627534e520fff159c755472387cf854883d2991e7aacbbfe994cca8"} Apr 20 14:29:12.410178 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:12.410107 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c02b03da-cf73-405d-a560-0e05db9a7dfc","Type":"ContainerStarted","Data":"c98ad5e33a640df44728cd30a8daed3d147cc5bf6b04b94610de77f3ca6c4d60"} Apr 20 14:29:12.460209 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:12.460156 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.347677436 podStartE2EDuration="7.460140177s" podCreationTimestamp="2026-04-20 14:29:05 +0000 UTC" firstStartedPulling="2026-04-20 14:29:07.197144482 +0000 UTC m=+133.917445893" lastFinishedPulling="2026-04-20 14:29:12.309607222 +0000 UTC m=+139.029908634" observedRunningTime="2026-04-20 14:29:12.458933745 +0000 UTC m=+139.179235179" watchObservedRunningTime="2026-04-20 14:29:12.460140177 +0000 UTC m=+139.180441609" Apr 20 14:29:13.823426 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:13.823390 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-556bcd8745-tx4hg"] Apr 20 14:29:13.823865 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:29:13.823724 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" 
pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" podUID="150936cc-56bb-4f1a-9468-4a8527e8cec7" Apr 20 14:29:14.419425 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.419389 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:29:14.424989 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.424963 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:29:14.510373 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.510339 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/150936cc-56bb-4f1a-9468-4a8527e8cec7-installation-pull-secrets\") pod \"150936cc-56bb-4f1a-9468-4a8527e8cec7\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " Apr 20 14:29:14.510581 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.510384 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/150936cc-56bb-4f1a-9468-4a8527e8cec7-trusted-ca\") pod \"150936cc-56bb-4f1a-9468-4a8527e8cec7\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " Apr 20 14:29:14.510581 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.510447 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/150936cc-56bb-4f1a-9468-4a8527e8cec7-ca-trust-extracted\") pod \"150936cc-56bb-4f1a-9468-4a8527e8cec7\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " Apr 20 14:29:14.510581 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.510488 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz6l9\" (UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-kube-api-access-zz6l9\") pod \"150936cc-56bb-4f1a-9468-4a8527e8cec7\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " Apr 20 14:29:14.510581 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.510537 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/150936cc-56bb-4f1a-9468-4a8527e8cec7-image-registry-private-configuration\") pod \"150936cc-56bb-4f1a-9468-4a8527e8cec7\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " Apr 20 14:29:14.510581 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.510569 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-certificates\") pod \"150936cc-56bb-4f1a-9468-4a8527e8cec7\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " Apr 20 14:29:14.510831 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.510598 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-bound-sa-token\") pod \"150936cc-56bb-4f1a-9468-4a8527e8cec7\" (UID: \"150936cc-56bb-4f1a-9468-4a8527e8cec7\") " Apr 20 14:29:14.510973 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.510933 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/150936cc-56bb-4f1a-9468-4a8527e8cec7-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod 
"150936cc-56bb-4f1a-9468-4a8527e8cec7" (UID: "150936cc-56bb-4f1a-9468-4a8527e8cec7"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:29:14.511106 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.511083 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/150936cc-56bb-4f1a-9468-4a8527e8cec7-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "150936cc-56bb-4f1a-9468-4a8527e8cec7" (UID: "150936cc-56bb-4f1a-9468-4a8527e8cec7"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:29:14.511211 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.511187 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "150936cc-56bb-4f1a-9468-4a8527e8cec7" (UID: "150936cc-56bb-4f1a-9468-4a8527e8cec7"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:29:14.513171 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.513109 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/150936cc-56bb-4f1a-9468-4a8527e8cec7-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "150936cc-56bb-4f1a-9468-4a8527e8cec7" (UID: "150936cc-56bb-4f1a-9468-4a8527e8cec7"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:29:14.513171 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.513127 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-kube-api-access-zz6l9" (OuterVolumeSpecName: "kube-api-access-zz6l9") pod "150936cc-56bb-4f1a-9468-4a8527e8cec7" (UID: "150936cc-56bb-4f1a-9468-4a8527e8cec7"). InnerVolumeSpecName "kube-api-access-zz6l9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:29:14.513540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.513516 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "150936cc-56bb-4f1a-9468-4a8527e8cec7" (UID: "150936cc-56bb-4f1a-9468-4a8527e8cec7"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:29:14.513612 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.513534 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/150936cc-56bb-4f1a-9468-4a8527e8cec7-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "150936cc-56bb-4f1a-9468-4a8527e8cec7" (UID: "150936cc-56bb-4f1a-9468-4a8527e8cec7"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:29:14.611867 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.611834 2580 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/150936cc-56bb-4f1a-9468-4a8527e8cec7-installation-pull-secrets\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:29:14.611867 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.611863 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/150936cc-56bb-4f1a-9468-4a8527e8cec7-trusted-ca\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:29:14.611867 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.611873 2580 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/150936cc-56bb-4f1a-9468-4a8527e8cec7-ca-trust-extracted\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:29:14.612088 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.611882 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zz6l9\" (UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-kube-api-access-zz6l9\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:29:14.612088 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.611892 2580 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/150936cc-56bb-4f1a-9468-4a8527e8cec7-image-registry-private-configuration\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:29:14.612088 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.611901 2580 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-certificates\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:29:14.612088 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:14.611910 2580 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-bound-sa-token\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:29:15.422210 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:15.422177 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-556bcd8745-tx4hg" Apr 20 14:29:15.456011 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:15.455973 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-556bcd8745-tx4hg"] Apr 20 14:29:15.459632 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:15.459605 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-556bcd8745-tx4hg"] Apr 20 14:29:15.520438 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:15.520394 2580 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/150936cc-56bb-4f1a-9468-4a8527e8cec7-registry-tls\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:29:15.933881 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:15.933841 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="150936cc-56bb-4f1a-9468-4a8527e8cec7" path="/var/lib/kubelet/pods/150936cc-56bb-4f1a-9468-4a8527e8cec7/volumes" Apr 20 14:29:16.427839 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:16.427802 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b49ed096-fc59-422a-993a-259f96d09946","Type":"ContainerStarted","Data":"133729ed9d7979b91e91e3ced086535439b5d2a9cfcdc0c3ee3f9970a6c21f58"} Apr 20 14:29:16.427839 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:16.427843 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b49ed096-fc59-422a-993a-259f96d09946","Type":"ContainerStarted","Data":"094eeaa4b1648bd61b9ef04837c68a3bdf98c199e59b5222e7deb0a426486104"} Apr 20 14:29:18.436817 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:18.436782 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b49ed096-fc59-422a-993a-259f96d09946","Type":"ContainerStarted","Data":"fe8bdcfb2ca62621ceffe4bd8769d0c365554c1f203928224f86aae5e1c0b4d9"} Apr 20 14:29:18.436817 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:18.436819 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b49ed096-fc59-422a-993a-259f96d09946","Type":"ContainerStarted","Data":"cb070998d128fb3e56a1f61b9a0626c80be8d78f2bbe21a4ce23d2832ec2899d"} Apr 20 14:29:18.437269 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:18.436831 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b49ed096-fc59-422a-993a-259f96d09946","Type":"ContainerStarted","Data":"d456832db40297a46e3c05115bd9b94ab03a1d67a750ecd5cc99a69feb0fb523"} Apr 20 14:29:18.437269 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:18.436843 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b49ed096-fc59-422a-993a-259f96d09946","Type":"ContainerStarted","Data":"caed29d2329376e5c4e1b745a79fe5fd51fb6918474dae87816bd9df93e87b3d"} Apr 20 14:29:18.469861 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:18.469813 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.131948674 podStartE2EDuration="7.469798819s" podCreationTimestamp="2026-04-20 14:29:11 +0000 UTC" firstStartedPulling="2026-04-20 14:29:12.40803842 +0000 UTC m=+139.128339833" lastFinishedPulling="2026-04-20 14:29:17.745888558 +0000 UTC m=+144.466189978" 
observedRunningTime="2026-04-20 14:29:18.467108403 +0000 UTC m=+145.187409835" watchObservedRunningTime="2026-04-20 14:29:18.469798819 +0000 UTC m=+145.190100252" Apr 20 14:29:21.406359 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:21.406313 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:29:29.611051 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:29:29.610998 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-vlqz9" podUID="26882095-42a8-4889-9983-45d2dc2d0fc6" Apr 20 14:29:29.621150 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:29:29.621108 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-8g5vx" podUID="d6c411bc-80c8-4d9c-993c-cb6aeb232750" Apr 20 14:29:30.479487 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:30.479456 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vlqz9" Apr 20 14:29:30.479746 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:30.479456 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8g5vx" Apr 20 14:29:34.596666 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:34.596606 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert\") pod \"ingress-canary-8g5vx\" (UID: \"d6c411bc-80c8-4d9c-993c-cb6aeb232750\") " pod="openshift-ingress-canary/ingress-canary-8g5vx" Apr 20 14:29:34.597065 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:34.596692 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:29:34.599161 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:34.599135 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/26882095-42a8-4889-9983-45d2dc2d0fc6-metrics-tls\") pod \"dns-default-vlqz9\" (UID: \"26882095-42a8-4889-9983-45d2dc2d0fc6\") " pod="openshift-dns/dns-default-vlqz9" Apr 20 14:29:34.599641 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:34.599621 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d6c411bc-80c8-4d9c-993c-cb6aeb232750-cert\") pod \"ingress-canary-8g5vx\" (UID: \"d6c411bc-80c8-4d9c-993c-cb6aeb232750\") " pod="openshift-ingress-canary/ingress-canary-8g5vx" Apr 20 14:29:34.683240 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:34.683208 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5k2mg\"" Apr 20 14:29:34.683868 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:34.683843 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-79jcr\"" Apr 20 14:29:34.690605 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:34.690583 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vlqz9" Apr 20 14:29:34.690692 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:34.690653 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8g5vx" Apr 20 14:29:34.846730 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:34.846657 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vlqz9"] Apr 20 14:29:34.849800 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:29:34.849766 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26882095_42a8_4889_9983_45d2dc2d0fc6.slice/crio-071e556f85a5a21661e6abf6fde00260609e1afdc6a0d03de6cd29a49cc3fc70 WatchSource:0}: Error finding container 071e556f85a5a21661e6abf6fde00260609e1afdc6a0d03de6cd29a49cc3fc70: Status 404 returned error can't find the container with id 071e556f85a5a21661e6abf6fde00260609e1afdc6a0d03de6cd29a49cc3fc70 Apr 20 14:29:34.873093 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:34.873071 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8g5vx"] Apr 20 14:29:34.874792 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:29:34.874768 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6c411bc_80c8_4d9c_993c_cb6aeb232750.slice/crio-7c6dafb5df8d15acd75b211c2eae20cda5e90e8d39e45c8e03f3474054b10e3d WatchSource:0}: Error finding container 7c6dafb5df8d15acd75b211c2eae20cda5e90e8d39e45c8e03f3474054b10e3d: Status 404 returned error can't find the container with id 7c6dafb5df8d15acd75b211c2eae20cda5e90e8d39e45c8e03f3474054b10e3d Apr 20 14:29:35.497269 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:35.497226 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8g5vx" event={"ID":"d6c411bc-80c8-4d9c-993c-cb6aeb232750","Type":"ContainerStarted","Data":"7c6dafb5df8d15acd75b211c2eae20cda5e90e8d39e45c8e03f3474054b10e3d"} Apr 20 14:29:35.498433 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:35.498384 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vlqz9" event={"ID":"26882095-42a8-4889-9983-45d2dc2d0fc6","Type":"ContainerStarted","Data":"071e556f85a5a21661e6abf6fde00260609e1afdc6a0d03de6cd29a49cc3fc70"} Apr 20 14:29:37.506763 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:37.506723 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8g5vx" event={"ID":"d6c411bc-80c8-4d9c-993c-cb6aeb232750","Type":"ContainerStarted","Data":"d9d7932094a55314999fc49c73907e59f12c0a1c001286cd381747bcdd2dc545"} Apr 20 14:29:37.508123 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:37.508097 2580 generic.go:358] "Generic (PLEG): container finished" podID="898273ba-9057-4ee0-9211-0db4c4234ca3" containerID="8e0082fb6aef80ecb897e97e2a7924c60148c7daa4a7e48d46cc50ead1ef4e32" exitCode=0 Apr 20 14:29:37.508216 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:37.508172 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gc5dt" event={"ID":"898273ba-9057-4ee0-9211-0db4c4234ca3","Type":"ContainerDied","Data":"8e0082fb6aef80ecb897e97e2a7924c60148c7daa4a7e48d46cc50ead1ef4e32"} Apr 20 14:29:37.508487 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:37.508470 2580 scope.go:117] "RemoveContainer" 
containerID="8e0082fb6aef80ecb897e97e2a7924c60148c7daa4a7e48d46cc50ead1ef4e32" Apr 20 14:29:37.509694 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:37.509673 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vlqz9" event={"ID":"26882095-42a8-4889-9983-45d2dc2d0fc6","Type":"ContainerStarted","Data":"8c6608cdaa9a7379c08ce65ea73ec141e39370e2cc7eff86b1327f9ae78011b6"} Apr 20 14:29:37.509785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:37.509699 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vlqz9" event={"ID":"26882095-42a8-4889-9983-45d2dc2d0fc6","Type":"ContainerStarted","Data":"72f982c37d2f10881add7e6107d6b8cf47542dfd4a9c7b12bcfaf990d096e5a3"} Apr 20 14:29:37.509883 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:37.509813 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-vlqz9" Apr 20 14:29:37.525054 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:37.525011 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8g5vx" podStartSLOduration=129.706453815 podStartE2EDuration="2m11.52499665s" podCreationTimestamp="2026-04-20 14:27:26 +0000 UTC" firstStartedPulling="2026-04-20 14:29:34.876764028 +0000 UTC m=+161.597065437" lastFinishedPulling="2026-04-20 14:29:36.695306857 +0000 UTC m=+163.415608272" observedRunningTime="2026-04-20 14:29:37.524511559 +0000 UTC m=+164.244812988" watchObservedRunningTime="2026-04-20 14:29:37.52499665 +0000 UTC m=+164.245298076" Apr 20 14:29:37.558788 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:37.558726 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vlqz9" podStartSLOduration=129.718555054 podStartE2EDuration="2m11.558703162s" podCreationTimestamp="2026-04-20 14:27:26 +0000 UTC" firstStartedPulling="2026-04-20 14:29:34.851784891 +0000 UTC m=+161.572086301" lastFinishedPulling="2026-04-20 14:29:36.691932999 +0000 UTC m=+163.412234409" observedRunningTime="2026-04-20 14:29:37.557369424 +0000 UTC m=+164.277670886" watchObservedRunningTime="2026-04-20 14:29:37.558703162 +0000 UTC m=+164.279004596" Apr 20 14:29:38.514092 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:38.514055 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gc5dt" event={"ID":"898273ba-9057-4ee0-9211-0db4c4234ca3","Type":"ContainerStarted","Data":"90787f323fc9e5a96eb771402018a41c09741b41317d88e5d3d9f51595482441"} Apr 20 14:29:44.558515 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:44.558415 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vlqz9_26882095-42a8-4889-9983-45d2dc2d0fc6/dns/0.log" Apr 20 14:29:44.759817 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:44.759790 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vlqz9_26882095-42a8-4889-9983-45d2dc2d0fc6/kube-rbac-proxy/0.log" Apr 20 14:29:45.159182 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:45.159154 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mw9b9_236542ea-fda7-4b96-ae9e-dd685e15e5ef/dns-node-resolver/0.log" Apr 20 14:29:45.759417 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:45.759388 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-8g5vx_d6c411bc-80c8-4d9c-993c-cb6aeb232750/serve-healthcheck-canary/0.log" Apr 20 14:29:46.543160 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:46.543126 2580 generic.go:358] "Generic (PLEG): container finished" podID="f5c1dc75-f6fa-4a93-af4f-753f62471c29" containerID="56c634f0606f50e246392dc72e6a5d9ea2e4d22fab957c461e3ee7b9f86ef3de" exitCode=0 Apr 20 14:29:46.543363 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:46.543200 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpkpd" event={"ID":"f5c1dc75-f6fa-4a93-af4f-753f62471c29","Type":"ContainerDied","Data":"56c634f0606f50e246392dc72e6a5d9ea2e4d22fab957c461e3ee7b9f86ef3de"} Apr 20 14:29:46.543645 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:46.543625 2580 scope.go:117] "RemoveContainer" containerID="56c634f0606f50e246392dc72e6a5d9ea2e4d22fab957c461e3ee7b9f86ef3de" Apr 20 14:29:47.516992 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:47.516950 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vlqz9" Apr 20 14:29:47.548090 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:29:47.548056 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zpkpd" event={"ID":"f5c1dc75-f6fa-4a93-af4f-753f62471c29","Type":"ContainerStarted","Data":"bd29771cb6e6e0b2c3c05d90dbcf2f15bb1acab8b0da124e0175ade6f85ff832"} Apr 20 14:30:11.406335 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:11.406288 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:11.427284 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:11.427256 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:11.640394 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:11.640364 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:25.127561 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:25.127527 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 14:30:25.128035 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:25.127973 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="alertmanager" containerID="cri-o://a5703c8a46e1c9107c318b504487028469e72f259421f84736324cb020d74a85" gracePeriod=120 Apr 20 14:30:25.128101 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:25.128031 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="kube-rbac-proxy-metric" containerID="cri-o://c98ad5e33a640df44728cd30a8daed3d147cc5bf6b04b94610de77f3ca6c4d60" gracePeriod=120 Apr 20 14:30:25.128151 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:25.128097 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="kube-rbac-proxy-web" containerID="cri-o://fa0c529fd947ed0cfbaecefa03f47f69ae76432fdc1569aa9f2804b46b73b019" gracePeriod=120 Apr 20 14:30:25.128151 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:25.128090 
2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="prom-label-proxy" containerID="cri-o://7fe363164627534e520fff159c755472387cf854883d2991e7aacbbfe994cca8" gracePeriod=120 Apr 20 14:30:25.128151 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:25.128117 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="kube-rbac-proxy" containerID="cri-o://de27fd50a9cb93471a16121102e2d4d06307d2831ebde12d89dd6f3f2c4287ed" gracePeriod=120 Apr 20 14:30:25.128296 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:25.128126 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="config-reloader" containerID="cri-o://4d89303ae5cf9e9606281148d6051ca34495a0ff982792a0269acc9acfca3050" gracePeriod=120 Apr 20 14:30:25.667380 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:25.667346 2580 generic.go:358] "Generic (PLEG): container finished" podID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerID="7fe363164627534e520fff159c755472387cf854883d2991e7aacbbfe994cca8" exitCode=0 Apr 20 14:30:25.667380 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:25.667376 2580 generic.go:358] "Generic (PLEG): container finished" podID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerID="de27fd50a9cb93471a16121102e2d4d06307d2831ebde12d89dd6f3f2c4287ed" exitCode=0 Apr 20 14:30:25.667380 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:25.667383 2580 generic.go:358] "Generic (PLEG): container finished" podID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerID="4d89303ae5cf9e9606281148d6051ca34495a0ff982792a0269acc9acfca3050" exitCode=0 Apr 20 14:30:25.667380 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:25.667388 2580 generic.go:358] "Generic (PLEG): container finished" podID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerID="a5703c8a46e1c9107c318b504487028469e72f259421f84736324cb020d74a85" exitCode=0 Apr 20 14:30:25.667662 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:25.667423 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c02b03da-cf73-405d-a560-0e05db9a7dfc","Type":"ContainerDied","Data":"7fe363164627534e520fff159c755472387cf854883d2991e7aacbbfe994cca8"} Apr 20 14:30:25.667662 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:25.667457 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c02b03da-cf73-405d-a560-0e05db9a7dfc","Type":"ContainerDied","Data":"de27fd50a9cb93471a16121102e2d4d06307d2831ebde12d89dd6f3f2c4287ed"} Apr 20 14:30:25.667662 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:25.667468 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c02b03da-cf73-405d-a560-0e05db9a7dfc","Type":"ContainerDied","Data":"4d89303ae5cf9e9606281148d6051ca34495a0ff982792a0269acc9acfca3050"} Apr 20 14:30:25.667662 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:25.667476 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c02b03da-cf73-405d-a560-0e05db9a7dfc","Type":"ContainerDied","Data":"a5703c8a46e1c9107c318b504487028469e72f259421f84736324cb020d74a85"} Apr 20 14:30:26.371599 ip-10-0-142-166 
kubenswrapper[2580]: I0420 14:30:26.371577 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.467730 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.467643 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-web-config\") pod \"c02b03da-cf73-405d-a560-0e05db9a7dfc\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " Apr 20 14:30:26.467730 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.467682 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c02b03da-cf73-405d-a560-0e05db9a7dfc-alertmanager-trusted-ca-bundle\") pod \"c02b03da-cf73-405d-a560-0e05db9a7dfc\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " Apr 20 14:30:26.467730 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.467720 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-config-volume\") pod \"c02b03da-cf73-405d-a560-0e05db9a7dfc\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " Apr 20 14:30:26.468010 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.467752 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c02b03da-cf73-405d-a560-0e05db9a7dfc-tls-assets\") pod \"c02b03da-cf73-405d-a560-0e05db9a7dfc\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " Apr 20 14:30:26.468010 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.467780 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-kube-rbac-proxy\") pod \"c02b03da-cf73-405d-a560-0e05db9a7dfc\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " Apr 20 14:30:26.468010 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.467804 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-cluster-tls-config\") pod \"c02b03da-cf73-405d-a560-0e05db9a7dfc\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " Apr 20 14:30:26.468010 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.467834 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8b6kr\" (UniqueName: \"kubernetes.io/projected/c02b03da-cf73-405d-a560-0e05db9a7dfc-kube-api-access-8b6kr\") pod \"c02b03da-cf73-405d-a560-0e05db9a7dfc\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " Apr 20 14:30:26.468010 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.467859 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"c02b03da-cf73-405d-a560-0e05db9a7dfc\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " Apr 20 14:30:26.468010 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.467895 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-main-tls\") pod \"c02b03da-cf73-405d-a560-0e05db9a7dfc\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " Apr 20 14:30:26.468010 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.467921 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c02b03da-cf73-405d-a560-0e05db9a7dfc-config-out\") pod \"c02b03da-cf73-405d-a560-0e05db9a7dfc\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " Apr 20 14:30:26.468010 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.467946 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c02b03da-cf73-405d-a560-0e05db9a7dfc-alertmanager-main-db\") pod \"c02b03da-cf73-405d-a560-0e05db9a7dfc\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " Apr 20 14:30:26.468400 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.468023 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-kube-rbac-proxy-web\") pod \"c02b03da-cf73-405d-a560-0e05db9a7dfc\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " Apr 20 14:30:26.468400 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.468054 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c02b03da-cf73-405d-a560-0e05db9a7dfc-metrics-client-ca\") pod \"c02b03da-cf73-405d-a560-0e05db9a7dfc\" (UID: \"c02b03da-cf73-405d-a560-0e05db9a7dfc\") " Apr 20 14:30:26.468400 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.468104 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c02b03da-cf73-405d-a560-0e05db9a7dfc-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "c02b03da-cf73-405d-a560-0e05db9a7dfc" (UID: "c02b03da-cf73-405d-a560-0e05db9a7dfc"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:30:26.468579 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.468422 2580 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c02b03da-cf73-405d-a560-0e05db9a7dfc-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:26.468579 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.468420 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c02b03da-cf73-405d-a560-0e05db9a7dfc-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "c02b03da-cf73-405d-a560-0e05db9a7dfc" (UID: "c02b03da-cf73-405d-a560-0e05db9a7dfc"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:30:26.468958 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.468925 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c02b03da-cf73-405d-a560-0e05db9a7dfc-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "c02b03da-cf73-405d-a560-0e05db9a7dfc" (UID: "c02b03da-cf73-405d-a560-0e05db9a7dfc"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:30:26.470780 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.470747 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "c02b03da-cf73-405d-a560-0e05db9a7dfc" (UID: "c02b03da-cf73-405d-a560-0e05db9a7dfc"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:30:26.471007 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.470881 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "c02b03da-cf73-405d-a560-0e05db9a7dfc" (UID: "c02b03da-cf73-405d-a560-0e05db9a7dfc"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:30:26.471007 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.470971 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-config-volume" (OuterVolumeSpecName: "config-volume") pod "c02b03da-cf73-405d-a560-0e05db9a7dfc" (UID: "c02b03da-cf73-405d-a560-0e05db9a7dfc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:30:26.471369 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.471343 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c02b03da-cf73-405d-a560-0e05db9a7dfc-config-out" (OuterVolumeSpecName: "config-out") pod "c02b03da-cf73-405d-a560-0e05db9a7dfc" (UID: "c02b03da-cf73-405d-a560-0e05db9a7dfc"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:30:26.471613 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.471593 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c02b03da-cf73-405d-a560-0e05db9a7dfc-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c02b03da-cf73-405d-a560-0e05db9a7dfc" (UID: "c02b03da-cf73-405d-a560-0e05db9a7dfc"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:30:26.472071 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.472052 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "c02b03da-cf73-405d-a560-0e05db9a7dfc" (UID: "c02b03da-cf73-405d-a560-0e05db9a7dfc"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:30:26.472995 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.472975 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "c02b03da-cf73-405d-a560-0e05db9a7dfc" (UID: "c02b03da-cf73-405d-a560-0e05db9a7dfc"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:30:26.473477 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.473462 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c02b03da-cf73-405d-a560-0e05db9a7dfc-kube-api-access-8b6kr" (OuterVolumeSpecName: "kube-api-access-8b6kr") pod "c02b03da-cf73-405d-a560-0e05db9a7dfc" (UID: "c02b03da-cf73-405d-a560-0e05db9a7dfc"). InnerVolumeSpecName "kube-api-access-8b6kr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:30:26.475958 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.475933 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "c02b03da-cf73-405d-a560-0e05db9a7dfc" (UID: "c02b03da-cf73-405d-a560-0e05db9a7dfc"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:30:26.482477 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.482457 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-web-config" (OuterVolumeSpecName: "web-config") pod "c02b03da-cf73-405d-a560-0e05db9a7dfc" (UID: "c02b03da-cf73-405d-a560-0e05db9a7dfc"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:30:26.569853 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.569815 2580 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-config-volume\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:26.569853 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.569849 2580 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c02b03da-cf73-405d-a560-0e05db9a7dfc-tls-assets\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:26.569853 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.569861 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:26.570109 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.569876 2580 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-cluster-tls-config\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:26.570109 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.569889 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8b6kr\" (UniqueName: \"kubernetes.io/projected/c02b03da-cf73-405d-a560-0e05db9a7dfc-kube-api-access-8b6kr\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:26.570109 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.569901 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:26.570109 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.569913 2580 reconciler_common.go:299] "Volume detached for 
volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-main-tls\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:26.570109 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.569925 2580 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c02b03da-cf73-405d-a560-0e05db9a7dfc-config-out\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:26.570109 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.569938 2580 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/c02b03da-cf73-405d-a560-0e05db9a7dfc-alertmanager-main-db\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:26.570109 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.569950 2580 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:26.570109 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.569963 2580 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c02b03da-cf73-405d-a560-0e05db9a7dfc-metrics-client-ca\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:26.570109 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.569975 2580 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c02b03da-cf73-405d-a560-0e05db9a7dfc-web-config\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:26.672434 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.672397 2580 generic.go:358] "Generic (PLEG): container finished" podID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerID="c98ad5e33a640df44728cd30a8daed3d147cc5bf6b04b94610de77f3ca6c4d60" exitCode=0 Apr 20 14:30:26.672434 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.672428 2580 generic.go:358] "Generic (PLEG): container finished" podID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerID="fa0c529fd947ed0cfbaecefa03f47f69ae76432fdc1569aa9f2804b46b73b019" exitCode=0 Apr 20 14:30:26.672664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.672458 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c02b03da-cf73-405d-a560-0e05db9a7dfc","Type":"ContainerDied","Data":"c98ad5e33a640df44728cd30a8daed3d147cc5bf6b04b94610de77f3ca6c4d60"} Apr 20 14:30:26.672664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.672492 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c02b03da-cf73-405d-a560-0e05db9a7dfc","Type":"ContainerDied","Data":"fa0c529fd947ed0cfbaecefa03f47f69ae76432fdc1569aa9f2804b46b73b019"} Apr 20 14:30:26.672664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.672531 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"c02b03da-cf73-405d-a560-0e05db9a7dfc","Type":"ContainerDied","Data":"4323b30d8f2600223fecc2a209e25570c5058da07c513de640c317c7925015a3"} Apr 20 14:30:26.672664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.672551 2580 scope.go:117] "RemoveContainer" 
containerID="7fe363164627534e520fff159c755472387cf854883d2991e7aacbbfe994cca8" Apr 20 14:30:26.672664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.672553 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.680446 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.680410 2580 scope.go:117] "RemoveContainer" containerID="c98ad5e33a640df44728cd30a8daed3d147cc5bf6b04b94610de77f3ca6c4d60" Apr 20 14:30:26.687353 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.687330 2580 scope.go:117] "RemoveContainer" containerID="de27fd50a9cb93471a16121102e2d4d06307d2831ebde12d89dd6f3f2c4287ed" Apr 20 14:30:26.693798 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.693779 2580 scope.go:117] "RemoveContainer" containerID="fa0c529fd947ed0cfbaecefa03f47f69ae76432fdc1569aa9f2804b46b73b019" Apr 20 14:30:26.697125 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.697106 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 14:30:26.700988 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.700953 2580 scope.go:117] "RemoveContainer" containerID="4d89303ae5cf9e9606281148d6051ca34495a0ff982792a0269acc9acfca3050" Apr 20 14:30:26.712121 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.712092 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 14:30:26.716009 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.715992 2580 scope.go:117] "RemoveContainer" containerID="a5703c8a46e1c9107c318b504487028469e72f259421f84736324cb020d74a85" Apr 20 14:30:26.722789 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.722770 2580 scope.go:117] "RemoveContainer" containerID="032c9c880f135107f9029eaa583cff941f75773ab920150cdc393cf9cd82d6c3" Apr 20 14:30:26.729456 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.729434 2580 scope.go:117] "RemoveContainer" containerID="7fe363164627534e520fff159c755472387cf854883d2991e7aacbbfe994cca8" Apr 20 14:30:26.729724 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:30:26.729706 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fe363164627534e520fff159c755472387cf854883d2991e7aacbbfe994cca8\": container with ID starting with 7fe363164627534e520fff159c755472387cf854883d2991e7aacbbfe994cca8 not found: ID does not exist" containerID="7fe363164627534e520fff159c755472387cf854883d2991e7aacbbfe994cca8" Apr 20 14:30:26.729780 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.729733 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fe363164627534e520fff159c755472387cf854883d2991e7aacbbfe994cca8"} err="failed to get container status \"7fe363164627534e520fff159c755472387cf854883d2991e7aacbbfe994cca8\": rpc error: code = NotFound desc = could not find container \"7fe363164627534e520fff159c755472387cf854883d2991e7aacbbfe994cca8\": container with ID starting with 7fe363164627534e520fff159c755472387cf854883d2991e7aacbbfe994cca8 not found: ID does not exist" Apr 20 14:30:26.729780 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.729766 2580 scope.go:117] "RemoveContainer" containerID="c98ad5e33a640df44728cd30a8daed3d147cc5bf6b04b94610de77f3ca6c4d60" Apr 20 14:30:26.729983 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:30:26.729969 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c98ad5e33a640df44728cd30a8daed3d147cc5bf6b04b94610de77f3ca6c4d60\": container with ID starting with c98ad5e33a640df44728cd30a8daed3d147cc5bf6b04b94610de77f3ca6c4d60 not found: ID does not exist" containerID="c98ad5e33a640df44728cd30a8daed3d147cc5bf6b04b94610de77f3ca6c4d60" Apr 20 14:30:26.730024 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.729988 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c98ad5e33a640df44728cd30a8daed3d147cc5bf6b04b94610de77f3ca6c4d60"} err="failed to get container status \"c98ad5e33a640df44728cd30a8daed3d147cc5bf6b04b94610de77f3ca6c4d60\": rpc error: code = NotFound desc = could not find container \"c98ad5e33a640df44728cd30a8daed3d147cc5bf6b04b94610de77f3ca6c4d60\": container with ID starting with c98ad5e33a640df44728cd30a8daed3d147cc5bf6b04b94610de77f3ca6c4d60 not found: ID does not exist" Apr 20 14:30:26.730024 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.730006 2580 scope.go:117] "RemoveContainer" containerID="de27fd50a9cb93471a16121102e2d4d06307d2831ebde12d89dd6f3f2c4287ed" Apr 20 14:30:26.730194 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:30:26.730173 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de27fd50a9cb93471a16121102e2d4d06307d2831ebde12d89dd6f3f2c4287ed\": container with ID starting with de27fd50a9cb93471a16121102e2d4d06307d2831ebde12d89dd6f3f2c4287ed not found: ID does not exist" containerID="de27fd50a9cb93471a16121102e2d4d06307d2831ebde12d89dd6f3f2c4287ed" Apr 20 14:30:26.730232 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.730197 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de27fd50a9cb93471a16121102e2d4d06307d2831ebde12d89dd6f3f2c4287ed"} err="failed to get container status \"de27fd50a9cb93471a16121102e2d4d06307d2831ebde12d89dd6f3f2c4287ed\": rpc error: code = NotFound desc = could not find container \"de27fd50a9cb93471a16121102e2d4d06307d2831ebde12d89dd6f3f2c4287ed\": container with ID starting with de27fd50a9cb93471a16121102e2d4d06307d2831ebde12d89dd6f3f2c4287ed not found: ID does not exist" Apr 20 14:30:26.730232 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.730209 2580 scope.go:117] "RemoveContainer" containerID="fa0c529fd947ed0cfbaecefa03f47f69ae76432fdc1569aa9f2804b46b73b019" Apr 20 14:30:26.730389 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:30:26.730374 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa0c529fd947ed0cfbaecefa03f47f69ae76432fdc1569aa9f2804b46b73b019\": container with ID starting with fa0c529fd947ed0cfbaecefa03f47f69ae76432fdc1569aa9f2804b46b73b019 not found: ID does not exist" containerID="fa0c529fd947ed0cfbaecefa03f47f69ae76432fdc1569aa9f2804b46b73b019" Apr 20 14:30:26.730430 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.730393 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa0c529fd947ed0cfbaecefa03f47f69ae76432fdc1569aa9f2804b46b73b019"} err="failed to get container status \"fa0c529fd947ed0cfbaecefa03f47f69ae76432fdc1569aa9f2804b46b73b019\": rpc error: code = NotFound desc = could not find container \"fa0c529fd947ed0cfbaecefa03f47f69ae76432fdc1569aa9f2804b46b73b019\": container with ID starting with fa0c529fd947ed0cfbaecefa03f47f69ae76432fdc1569aa9f2804b46b73b019 not found: ID does not exist" Apr 20 14:30:26.730430 ip-10-0-142-166 kubenswrapper[2580]: I0420 
14:30:26.730405 2580 scope.go:117] "RemoveContainer" containerID="4d89303ae5cf9e9606281148d6051ca34495a0ff982792a0269acc9acfca3050" Apr 20 14:30:26.730586 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:30:26.730571 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d89303ae5cf9e9606281148d6051ca34495a0ff982792a0269acc9acfca3050\": container with ID starting with 4d89303ae5cf9e9606281148d6051ca34495a0ff982792a0269acc9acfca3050 not found: ID does not exist" containerID="4d89303ae5cf9e9606281148d6051ca34495a0ff982792a0269acc9acfca3050" Apr 20 14:30:26.730649 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.730588 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d89303ae5cf9e9606281148d6051ca34495a0ff982792a0269acc9acfca3050"} err="failed to get container status \"4d89303ae5cf9e9606281148d6051ca34495a0ff982792a0269acc9acfca3050\": rpc error: code = NotFound desc = could not find container \"4d89303ae5cf9e9606281148d6051ca34495a0ff982792a0269acc9acfca3050\": container with ID starting with 4d89303ae5cf9e9606281148d6051ca34495a0ff982792a0269acc9acfca3050 not found: ID does not exist" Apr 20 14:30:26.730649 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.730599 2580 scope.go:117] "RemoveContainer" containerID="a5703c8a46e1c9107c318b504487028469e72f259421f84736324cb020d74a85" Apr 20 14:30:26.730805 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:30:26.730788 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5703c8a46e1c9107c318b504487028469e72f259421f84736324cb020d74a85\": container with ID starting with a5703c8a46e1c9107c318b504487028469e72f259421f84736324cb020d74a85 not found: ID does not exist" containerID="a5703c8a46e1c9107c318b504487028469e72f259421f84736324cb020d74a85" Apr 20 14:30:26.730839 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.730808 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5703c8a46e1c9107c318b504487028469e72f259421f84736324cb020d74a85"} err="failed to get container status \"a5703c8a46e1c9107c318b504487028469e72f259421f84736324cb020d74a85\": rpc error: code = NotFound desc = could not find container \"a5703c8a46e1c9107c318b504487028469e72f259421f84736324cb020d74a85\": container with ID starting with a5703c8a46e1c9107c318b504487028469e72f259421f84736324cb020d74a85 not found: ID does not exist" Apr 20 14:30:26.730839 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.730820 2580 scope.go:117] "RemoveContainer" containerID="032c9c880f135107f9029eaa583cff941f75773ab920150cdc393cf9cd82d6c3" Apr 20 14:30:26.731014 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:30:26.730985 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"032c9c880f135107f9029eaa583cff941f75773ab920150cdc393cf9cd82d6c3\": container with ID starting with 032c9c880f135107f9029eaa583cff941f75773ab920150cdc393cf9cd82d6c3 not found: ID does not exist" containerID="032c9c880f135107f9029eaa583cff941f75773ab920150cdc393cf9cd82d6c3" Apr 20 14:30:26.731054 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.731020 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"032c9c880f135107f9029eaa583cff941f75773ab920150cdc393cf9cd82d6c3"} err="failed to get container status 
\"032c9c880f135107f9029eaa583cff941f75773ab920150cdc393cf9cd82d6c3\": rpc error: code = NotFound desc = could not find container \"032c9c880f135107f9029eaa583cff941f75773ab920150cdc393cf9cd82d6c3\": container with ID starting with 032c9c880f135107f9029eaa583cff941f75773ab920150cdc393cf9cd82d6c3 not found: ID does not exist" Apr 20 14:30:26.731054 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.731036 2580 scope.go:117] "RemoveContainer" containerID="7fe363164627534e520fff159c755472387cf854883d2991e7aacbbfe994cca8" Apr 20 14:30:26.731198 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.731182 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fe363164627534e520fff159c755472387cf854883d2991e7aacbbfe994cca8"} err="failed to get container status \"7fe363164627534e520fff159c755472387cf854883d2991e7aacbbfe994cca8\": rpc error: code = NotFound desc = could not find container \"7fe363164627534e520fff159c755472387cf854883d2991e7aacbbfe994cca8\": container with ID starting with 7fe363164627534e520fff159c755472387cf854883d2991e7aacbbfe994cca8 not found: ID does not exist" Apr 20 14:30:26.731235 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.731200 2580 scope.go:117] "RemoveContainer" containerID="c98ad5e33a640df44728cd30a8daed3d147cc5bf6b04b94610de77f3ca6c4d60" Apr 20 14:30:26.731347 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.731331 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c98ad5e33a640df44728cd30a8daed3d147cc5bf6b04b94610de77f3ca6c4d60"} err="failed to get container status \"c98ad5e33a640df44728cd30a8daed3d147cc5bf6b04b94610de77f3ca6c4d60\": rpc error: code = NotFound desc = could not find container \"c98ad5e33a640df44728cd30a8daed3d147cc5bf6b04b94610de77f3ca6c4d60\": container with ID starting with c98ad5e33a640df44728cd30a8daed3d147cc5bf6b04b94610de77f3ca6c4d60 not found: ID does not exist" Apr 20 14:30:26.731384 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.731348 2580 scope.go:117] "RemoveContainer" containerID="de27fd50a9cb93471a16121102e2d4d06307d2831ebde12d89dd6f3f2c4287ed" Apr 20 14:30:26.731559 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.731541 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de27fd50a9cb93471a16121102e2d4d06307d2831ebde12d89dd6f3f2c4287ed"} err="failed to get container status \"de27fd50a9cb93471a16121102e2d4d06307d2831ebde12d89dd6f3f2c4287ed\": rpc error: code = NotFound desc = could not find container \"de27fd50a9cb93471a16121102e2d4d06307d2831ebde12d89dd6f3f2c4287ed\": container with ID starting with de27fd50a9cb93471a16121102e2d4d06307d2831ebde12d89dd6f3f2c4287ed not found: ID does not exist" Apr 20 14:30:26.731603 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.731561 2580 scope.go:117] "RemoveContainer" containerID="fa0c529fd947ed0cfbaecefa03f47f69ae76432fdc1569aa9f2804b46b73b019" Apr 20 14:30:26.731794 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.731776 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa0c529fd947ed0cfbaecefa03f47f69ae76432fdc1569aa9f2804b46b73b019"} err="failed to get container status \"fa0c529fd947ed0cfbaecefa03f47f69ae76432fdc1569aa9f2804b46b73b019\": rpc error: code = NotFound desc = could not find container \"fa0c529fd947ed0cfbaecefa03f47f69ae76432fdc1569aa9f2804b46b73b019\": container with ID starting with fa0c529fd947ed0cfbaecefa03f47f69ae76432fdc1569aa9f2804b46b73b019 not found: ID does 
not exist" Apr 20 14:30:26.731831 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.731796 2580 scope.go:117] "RemoveContainer" containerID="4d89303ae5cf9e9606281148d6051ca34495a0ff982792a0269acc9acfca3050" Apr 20 14:30:26.731998 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.731982 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d89303ae5cf9e9606281148d6051ca34495a0ff982792a0269acc9acfca3050"} err="failed to get container status \"4d89303ae5cf9e9606281148d6051ca34495a0ff982792a0269acc9acfca3050\": rpc error: code = NotFound desc = could not find container \"4d89303ae5cf9e9606281148d6051ca34495a0ff982792a0269acc9acfca3050\": container with ID starting with 4d89303ae5cf9e9606281148d6051ca34495a0ff982792a0269acc9acfca3050 not found: ID does not exist" Apr 20 14:30:26.732033 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.732000 2580 scope.go:117] "RemoveContainer" containerID="a5703c8a46e1c9107c318b504487028469e72f259421f84736324cb020d74a85" Apr 20 14:30:26.732245 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.732227 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5703c8a46e1c9107c318b504487028469e72f259421f84736324cb020d74a85"} err="failed to get container status \"a5703c8a46e1c9107c318b504487028469e72f259421f84736324cb020d74a85\": rpc error: code = NotFound desc = could not find container \"a5703c8a46e1c9107c318b504487028469e72f259421f84736324cb020d74a85\": container with ID starting with a5703c8a46e1c9107c318b504487028469e72f259421f84736324cb020d74a85 not found: ID does not exist" Apr 20 14:30:26.732306 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.732246 2580 scope.go:117] "RemoveContainer" containerID="032c9c880f135107f9029eaa583cff941f75773ab920150cdc393cf9cd82d6c3" Apr 20 14:30:26.732432 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.732414 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"032c9c880f135107f9029eaa583cff941f75773ab920150cdc393cf9cd82d6c3"} err="failed to get container status \"032c9c880f135107f9029eaa583cff941f75773ab920150cdc393cf9cd82d6c3\": rpc error: code = NotFound desc = could not find container \"032c9c880f135107f9029eaa583cff941f75773ab920150cdc393cf9cd82d6c3\": container with ID starting with 032c9c880f135107f9029eaa583cff941f75773ab920150cdc393cf9cd82d6c3 not found: ID does not exist" Apr 20 14:30:26.736153 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736133 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 14:30:26.736481 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736468 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="kube-rbac-proxy-metric" Apr 20 14:30:26.736545 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736483 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="kube-rbac-proxy-metric" Apr 20 14:30:26.736545 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736517 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="kube-rbac-proxy" Apr 20 14:30:26.736545 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736523 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="kube-rbac-proxy" Apr 20 
14:30:26.736545 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736531 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="alertmanager" Apr 20 14:30:26.736545 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736536 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="alertmanager" Apr 20 14:30:26.736545 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736542 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="prom-label-proxy" Apr 20 14:30:26.736545 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736547 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="prom-label-proxy" Apr 20 14:30:26.736747 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736560 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="config-reloader" Apr 20 14:30:26.736747 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736566 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="config-reloader" Apr 20 14:30:26.736747 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736573 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="init-config-reloader" Apr 20 14:30:26.736747 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736578 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="init-config-reloader" Apr 20 14:30:26.736747 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736587 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="kube-rbac-proxy-web" Apr 20 14:30:26.736747 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736592 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="kube-rbac-proxy-web" Apr 20 14:30:26.736747 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736638 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="kube-rbac-proxy-metric" Apr 20 14:30:26.736747 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736646 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="config-reloader" Apr 20 14:30:26.736747 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736653 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="prom-label-proxy" Apr 20 14:30:26.736747 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736659 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="kube-rbac-proxy" Apr 20 14:30:26.736747 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736665 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="alertmanager" Apr 20 14:30:26.736747 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.736671 2580 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" containerName="kube-rbac-proxy-web" Apr 20 14:30:26.741788 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.741773 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.744608 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.744588 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 14:30:26.744721 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.744706 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 14:30:26.744785 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.744721 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 14:30:26.744872 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.744851 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 14:30:26.744953 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.744935 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 14:30:26.745022 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.744964 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 14:30:26.745022 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.744977 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-xrsmc\"" Apr 20 14:30:26.745022 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.744983 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 14:30:26.745022 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.745009 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 14:30:26.751209 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.751192 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 14:30:26.758214 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.758189 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 14:30:26.771381 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.771349 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.771381 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.771377 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-config-out\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.771567 ip-10-0-142-166 
kubenswrapper[2580]: I0420 14:30:26.771408 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.771567 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.771487 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.771567 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.771541 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.771567 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.771559 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.771729 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.771574 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-config-volume\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.771729 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.771608 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.771729 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.771632 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r986p\" (UniqueName: \"kubernetes.io/projected/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-kube-api-access-r986p\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.771729 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.771655 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.771729 ip-10-0-142-166 kubenswrapper[2580]: 
I0420 14:30:26.771685 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.771729 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.771707 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-web-config\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.771729 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.771724 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.872952 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.872916 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.872952 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.872953 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-config-out\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.873186 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.872982 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.873186 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.873015 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.873186 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.873044 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.873186 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.873070 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.873186 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.873098 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-config-volume\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.873186 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.873125 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.874106 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.874073 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.874228 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.874115 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.874228 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.874165 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r986p\" (UniqueName: \"kubernetes.io/projected/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-kube-api-access-r986p\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.874228 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.874214 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.874384 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.874251 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.874384 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.874278 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-web-config\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 20 14:30:26.874384 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.874305 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.874620 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.874599 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.876051 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.876023 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.876051 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.876040 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.876208 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.876079 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-config-out\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.876208 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.876147 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.876327 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.876258 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.876975 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.876953 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.877077 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.877008 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-config-volume\") 
pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.877152 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.877136 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-web-config\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.877977 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.877959 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:26.883948 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:26.883928 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r986p\" (UniqueName: \"kubernetes.io/projected/5361d7ff-428a-4fe9-a55f-a6e058beb6b9-kube-api-access-r986p\") pod \"alertmanager-main-0\" (UID: \"5361d7ff-428a-4fe9-a55f-a6e058beb6b9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:27.051077 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:27.050998 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:30:27.178486 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:27.178299 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 14:30:27.181468 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:30:27.181442 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5361d7ff_428a_4fe9_a55f_a6e058beb6b9.slice/crio-440bd7125788088a42c01837c7203c52186cb6a03e2e35a48e604c263343af50 WatchSource:0}: Error finding container 440bd7125788088a42c01837c7203c52186cb6a03e2e35a48e604c263343af50: Status 404 returned error can't find the container with id 440bd7125788088a42c01837c7203c52186cb6a03e2e35a48e604c263343af50 Apr 20 14:30:27.677461 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:27.677424 2580 generic.go:358] "Generic (PLEG): container finished" podID="5361d7ff-428a-4fe9-a55f-a6e058beb6b9" containerID="6512018d23e3f7867c7032fc73098f99ffeda512601ca2a61dc6b850901af089" exitCode=0 Apr 20 14:30:27.677845 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:27.677467 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5361d7ff-428a-4fe9-a55f-a6e058beb6b9","Type":"ContainerDied","Data":"6512018d23e3f7867c7032fc73098f99ffeda512601ca2a61dc6b850901af089"} Apr 20 14:30:27.677845 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:27.677491 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5361d7ff-428a-4fe9-a55f-a6e058beb6b9","Type":"ContainerStarted","Data":"440bd7125788088a42c01837c7203c52186cb6a03e2e35a48e604c263343af50"} Apr 20 14:30:27.932295 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:27.932272 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c02b03da-cf73-405d-a560-0e05db9a7dfc" path="/var/lib/kubelet/pods/c02b03da-cf73-405d-a560-0e05db9a7dfc/volumes" Apr 20 14:30:28.683634 ip-10-0-142-166 
kubenswrapper[2580]: I0420 14:30:28.683603 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5361d7ff-428a-4fe9-a55f-a6e058beb6b9","Type":"ContainerStarted","Data":"8c3661aecdf70cb1c49182a67d717df5d5d94b464eea71b6be30de8de9cbb158"} Apr 20 14:30:28.683634 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:28.683641 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5361d7ff-428a-4fe9-a55f-a6e058beb6b9","Type":"ContainerStarted","Data":"72987fff54717ceb7c5da8c5672431b63013803ae55762676d8a00bd65dd1839"} Apr 20 14:30:28.684016 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:28.683651 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5361d7ff-428a-4fe9-a55f-a6e058beb6b9","Type":"ContainerStarted","Data":"cc188f730346055647b5aac8ef4c7cb4a42fb5f433c80fe9d81e3823ec0fd0de"} Apr 20 14:30:28.684016 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:28.683660 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5361d7ff-428a-4fe9-a55f-a6e058beb6b9","Type":"ContainerStarted","Data":"213a92bd11bf766d4f73c2b5f6d6d74182da246d6561bd50bd3e4d92eb6a8ee8"} Apr 20 14:30:28.684016 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:28.683668 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5361d7ff-428a-4fe9-a55f-a6e058beb6b9","Type":"ContainerStarted","Data":"f019961f2492ea661f267dd7399398464824926e96c039326c13d715b464203e"} Apr 20 14:30:28.684016 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:28.683676 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5361d7ff-428a-4fe9-a55f-a6e058beb6b9","Type":"ContainerStarted","Data":"e9c955e4dc8075c89fe294d75a7626746e64e9efeb56d46dab8a21e52f44475f"} Apr 20 14:30:28.713389 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:28.713335 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.713318598 podStartE2EDuration="2.713318598s" podCreationTimestamp="2026-04-20 14:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:30:28.710584936 +0000 UTC m=+215.430886368" watchObservedRunningTime="2026-04-20 14:30:28.713318598 +0000 UTC m=+215.433620030" Apr 20 14:30:29.142570 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.142525 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-864bbc56b7-v5shp"] Apr 20 14:30:29.146760 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.146732 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.149208 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.149176 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 20 14:30:29.149208 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.149199 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 20 14:30:29.149369 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.149295 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 20 14:30:29.149486 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.149471 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-lbhjf\"" Apr 20 14:30:29.149678 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.149664 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 20 14:30:29.149797 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.149779 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 20 14:30:29.155635 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.155613 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 20 14:30:29.158845 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.158821 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-864bbc56b7-v5shp"] Apr 20 14:30:29.195190 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.195150 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e7a48f3-a413-4589-ab1d-85e3cf805195-serving-certs-ca-bundle\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.195190 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.195183 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e7a48f3-a413-4589-ab1d-85e3cf805195-telemeter-trusted-ca-bundle\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.195406 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.195206 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zggss\" (UniqueName: \"kubernetes.io/projected/6e7a48f3-a413-4589-ab1d-85e3cf805195-kube-api-access-zggss\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.195406 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.195273 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6e7a48f3-a413-4589-ab1d-85e3cf805195-federate-client-tls\") pod 
\"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.195406 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.195343 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6e7a48f3-a413-4589-ab1d-85e3cf805195-secret-telemeter-client\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.195406 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.195401 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6e7a48f3-a413-4589-ab1d-85e3cf805195-telemeter-client-tls\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.195616 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.195462 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6e7a48f3-a413-4589-ab1d-85e3cf805195-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.195616 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.195480 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e7a48f3-a413-4589-ab1d-85e3cf805195-metrics-client-ca\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.296392 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.296346 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e7a48f3-a413-4589-ab1d-85e3cf805195-serving-certs-ca-bundle\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.296392 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.296393 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e7a48f3-a413-4589-ab1d-85e3cf805195-telemeter-trusted-ca-bundle\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.296671 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.296423 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zggss\" (UniqueName: \"kubernetes.io/projected/6e7a48f3-a413-4589-ab1d-85e3cf805195-kube-api-access-zggss\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.296671 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.296459 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6e7a48f3-a413-4589-ab1d-85e3cf805195-federate-client-tls\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.296671 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.296529 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6e7a48f3-a413-4589-ab1d-85e3cf805195-secret-telemeter-client\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.296671 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.296556 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6e7a48f3-a413-4589-ab1d-85e3cf805195-telemeter-client-tls\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.296671 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.296613 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6e7a48f3-a413-4589-ab1d-85e3cf805195-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.296671 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.296636 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e7a48f3-a413-4589-ab1d-85e3cf805195-metrics-client-ca\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.297232 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.297199 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e7a48f3-a413-4589-ab1d-85e3cf805195-serving-certs-ca-bundle\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.299336 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.299304 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/6e7a48f3-a413-4589-ab1d-85e3cf805195-federate-client-tls\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.299336 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.299322 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/6e7a48f3-a413-4589-ab1d-85e3cf805195-telemeter-client-tls\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.299520 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.299322 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/6e7a48f3-a413-4589-ab1d-85e3cf805195-secret-telemeter-client\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.299520 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.299489 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6e7a48f3-a413-4589-ab1d-85e3cf805195-metrics-client-ca\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.299651 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.299629 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e7a48f3-a413-4589-ab1d-85e3cf805195-telemeter-trusted-ca-bundle\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.299804 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.299782 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6e7a48f3-a413-4589-ab1d-85e3cf805195-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.305709 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.305679 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zggss\" (UniqueName: \"kubernetes.io/projected/6e7a48f3-a413-4589-ab1d-85e3cf805195-kube-api-access-zggss\") pod \"telemeter-client-864bbc56b7-v5shp\" (UID: \"6e7a48f3-a413-4589-ab1d-85e3cf805195\") " pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.453576 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.453471 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 14:30:29.454181 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.454144 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="prometheus" containerID="cri-o://094eeaa4b1648bd61b9ef04837c68a3bdf98c199e59b5222e7deb0a426486104" gracePeriod=600 Apr 20 14:30:29.454289 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.454180 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="config-reloader" containerID="cri-o://133729ed9d7979b91e91e3ced086535439b5d2a9cfcdc0c3ee3f9970a6c21f58" gracePeriod=600 Apr 20 14:30:29.454289 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.454204 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="thanos-sidecar" containerID="cri-o://caed29d2329376e5c4e1b745a79fe5fd51fb6918474dae87816bd9df93e87b3d" gracePeriod=600 Apr 20 14:30:29.454289 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.454149 2580 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/prometheus-k8s-0" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="kube-rbac-proxy-thanos" containerID="cri-o://fe8bdcfb2ca62621ceffe4bd8769d0c365554c1f203928224f86aae5e1c0b4d9" gracePeriod=600 Apr 20 14:30:29.454447 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.454374 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="kube-rbac-proxy-web" containerID="cri-o://d456832db40297a46e3c05115bd9b94ab03a1d67a750ecd5cc99a69feb0fb523" gracePeriod=600 Apr 20 14:30:29.454527 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.454490 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="kube-rbac-proxy" containerID="cri-o://cb070998d128fb3e56a1f61b9a0626c80be8d78f2bbe21a4ce23d2832ec2899d" gracePeriod=600 Apr 20 14:30:29.467476 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.462602 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" Apr 20 14:30:29.624984 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.624938 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-864bbc56b7-v5shp"] Apr 20 14:30:29.627623 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:30:29.627565 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e7a48f3_a413_4589_ab1d_85e3cf805195.slice/crio-f637d3e1034cf0e7fd858a8f893f58c3156ce4114b70aa1a1b18b50715d92001 WatchSource:0}: Error finding container f637d3e1034cf0e7fd858a8f893f58c3156ce4114b70aa1a1b18b50715d92001: Status 404 returned error can't find the container with id f637d3e1034cf0e7fd858a8f893f58c3156ce4114b70aa1a1b18b50715d92001 Apr 20 14:30:29.691664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.691632 2580 generic.go:358] "Generic (PLEG): container finished" podID="b49ed096-fc59-422a-993a-259f96d09946" containerID="fe8bdcfb2ca62621ceffe4bd8769d0c365554c1f203928224f86aae5e1c0b4d9" exitCode=0 Apr 20 14:30:29.691664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.691657 2580 generic.go:358] "Generic (PLEG): container finished" podID="b49ed096-fc59-422a-993a-259f96d09946" containerID="cb070998d128fb3e56a1f61b9a0626c80be8d78f2bbe21a4ce23d2832ec2899d" exitCode=0 Apr 20 14:30:29.691664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.691664 2580 generic.go:358] "Generic (PLEG): container finished" podID="b49ed096-fc59-422a-993a-259f96d09946" containerID="d456832db40297a46e3c05115bd9b94ab03a1d67a750ecd5cc99a69feb0fb523" exitCode=0 Apr 20 14:30:29.691664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.691670 2580 generic.go:358] "Generic (PLEG): container finished" podID="b49ed096-fc59-422a-993a-259f96d09946" containerID="caed29d2329376e5c4e1b745a79fe5fd51fb6918474dae87816bd9df93e87b3d" exitCode=0 Apr 20 14:30:29.691664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.691675 2580 generic.go:358] "Generic (PLEG): container finished" podID="b49ed096-fc59-422a-993a-259f96d09946" containerID="133729ed9d7979b91e91e3ced086535439b5d2a9cfcdc0c3ee3f9970a6c21f58" exitCode=0 Apr 20 14:30:29.691664 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.691680 2580 generic.go:358] "Generic (PLEG): container finished" podID="b49ed096-fc59-422a-993a-259f96d09946" 
containerID="094eeaa4b1648bd61b9ef04837c68a3bdf98c199e59b5222e7deb0a426486104" exitCode=0 Apr 20 14:30:29.692310 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.691712 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b49ed096-fc59-422a-993a-259f96d09946","Type":"ContainerDied","Data":"fe8bdcfb2ca62621ceffe4bd8769d0c365554c1f203928224f86aae5e1c0b4d9"} Apr 20 14:30:29.692310 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.691756 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b49ed096-fc59-422a-993a-259f96d09946","Type":"ContainerDied","Data":"cb070998d128fb3e56a1f61b9a0626c80be8d78f2bbe21a4ce23d2832ec2899d"} Apr 20 14:30:29.692310 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.691772 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b49ed096-fc59-422a-993a-259f96d09946","Type":"ContainerDied","Data":"d456832db40297a46e3c05115bd9b94ab03a1d67a750ecd5cc99a69feb0fb523"} Apr 20 14:30:29.692310 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.691787 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b49ed096-fc59-422a-993a-259f96d09946","Type":"ContainerDied","Data":"caed29d2329376e5c4e1b745a79fe5fd51fb6918474dae87816bd9df93e87b3d"} Apr 20 14:30:29.692310 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.691799 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b49ed096-fc59-422a-993a-259f96d09946","Type":"ContainerDied","Data":"133729ed9d7979b91e91e3ced086535439b5d2a9cfcdc0c3ee3f9970a6c21f58"} Apr 20 14:30:29.692310 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.691812 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b49ed096-fc59-422a-993a-259f96d09946","Type":"ContainerDied","Data":"094eeaa4b1648bd61b9ef04837c68a3bdf98c199e59b5222e7deb0a426486104"} Apr 20 14:30:29.693012 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.692989 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" event={"ID":"6e7a48f3-a413-4589-ab1d-85e3cf805195","Type":"ContainerStarted","Data":"f637d3e1034cf0e7fd858a8f893f58c3156ce4114b70aa1a1b18b50715d92001"} Apr 20 14:30:29.713208 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.713184 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:29.800564 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.800531 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-prometheus-k8s-rulefiles-0\") pod \"b49ed096-fc59-422a-993a-259f96d09946\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " Apr 20 14:30:29.800754 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.800584 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-configmap-kubelet-serving-ca-bundle\") pod \"b49ed096-fc59-422a-993a-259f96d09946\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " Apr 20 14:30:29.800754 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.800627 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"b49ed096-fc59-422a-993a-259f96d09946\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " Apr 20 14:30:29.800754 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.800663 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-metrics-client-certs\") pod \"b49ed096-fc59-422a-993a-259f96d09946\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " Apr 20 14:30:29.800754 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.800695 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-prometheus-k8s-tls\") pod \"b49ed096-fc59-422a-993a-259f96d09946\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " Apr 20 14:30:29.800754 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.800739 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-kube-rbac-proxy\") pod \"b49ed096-fc59-422a-993a-259f96d09946\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " Apr 20 14:30:29.801018 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.800775 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-configmap-metrics-client-ca\") pod \"b49ed096-fc59-422a-993a-259f96d09946\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " Apr 20 14:30:29.801018 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.800801 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-grpc-tls\") pod \"b49ed096-fc59-422a-993a-259f96d09946\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " Apr 20 14:30:29.801018 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.800825 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-prometheus-trusted-ca-bundle\") pod \"b49ed096-fc59-422a-993a-259f96d09946\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " Apr 20 14:30:29.801018 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.800855 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b49ed096-fc59-422a-993a-259f96d09946-prometheus-k8s-db\") pod \"b49ed096-fc59-422a-993a-259f96d09946\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " Apr 20 14:30:29.801018 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.800880 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-configmap-serving-certs-ca-bundle\") pod \"b49ed096-fc59-422a-993a-259f96d09946\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " Apr 20 14:30:29.801018 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.800909 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b49ed096-fc59-422a-993a-259f96d09946-tls-assets\") pod \"b49ed096-fc59-422a-993a-259f96d09946\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " Apr 20 14:30:29.801018 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.800962 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b49ed096-fc59-422a-993a-259f96d09946-config-out\") pod \"b49ed096-fc59-422a-993a-259f96d09946\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " Apr 20 14:30:29.801018 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.800988 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-thanos-prometheus-http-client-file\") pod \"b49ed096-fc59-422a-993a-259f96d09946\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " Apr 20 14:30:29.801018 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.800982 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "b49ed096-fc59-422a-993a-259f96d09946" (UID: "b49ed096-fc59-422a-993a-259f96d09946"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:30:29.801018 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.801016 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh28w\" (UniqueName: \"kubernetes.io/projected/b49ed096-fc59-422a-993a-259f96d09946-kube-api-access-fh28w\") pod \"b49ed096-fc59-422a-993a-259f96d09946\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " Apr 20 14:30:29.801586 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.801043 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"b49ed096-fc59-422a-993a-259f96d09946\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " Apr 20 14:30:29.801586 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.801080 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-web-config\") pod \"b49ed096-fc59-422a-993a-259f96d09946\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " Apr 20 14:30:29.801586 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.801134 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-config\") pod \"b49ed096-fc59-422a-993a-259f96d09946\" (UID: \"b49ed096-fc59-422a-993a-259f96d09946\") " Apr 20 14:30:29.801739 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.801642 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "b49ed096-fc59-422a-993a-259f96d09946" (UID: "b49ed096-fc59-422a-993a-259f96d09946"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:30:29.803835 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.803805 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "b49ed096-fc59-422a-993a-259f96d09946" (UID: "b49ed096-fc59-422a-993a-259f96d09946"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:30:29.804266 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.804128 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "b49ed096-fc59-422a-993a-259f96d09946" (UID: "b49ed096-fc59-422a-993a-259f96d09946"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:30:29.804385 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.804264 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "b49ed096-fc59-422a-993a-259f96d09946" (UID: "b49ed096-fc59-422a-993a-259f96d09946"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:30:29.804385 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.804332 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b49ed096-fc59-422a-993a-259f96d09946-kube-api-access-fh28w" (OuterVolumeSpecName: "kube-api-access-fh28w") pod "b49ed096-fc59-422a-993a-259f96d09946" (UID: "b49ed096-fc59-422a-993a-259f96d09946"). InnerVolumeSpecName "kube-api-access-fh28w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:30:29.804848 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.804558 2580 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-configmap-metrics-client-ca\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:29.804848 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.804583 2580 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-prometheus-trusted-ca-bundle\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:29.804848 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.804599 2580 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-thanos-prometheus-http-client-file\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:29.804848 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.804613 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fh28w\" (UniqueName: \"kubernetes.io/projected/b49ed096-fc59-422a-993a-259f96d09946-kube-api-access-fh28w\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:29.804848 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.804627 2580 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:29.804848 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.804643 2580 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:29.804848 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.804843 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "b49ed096-fc59-422a-993a-259f96d09946" (UID: "b49ed096-fc59-422a-993a-259f96d09946"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:30:29.805926 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.805644 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "b49ed096-fc59-422a-993a-259f96d09946" (UID: "b49ed096-fc59-422a-993a-259f96d09946"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:30:29.806406 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.806381 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b49ed096-fc59-422a-993a-259f96d09946-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "b49ed096-fc59-422a-993a-259f96d09946" (UID: "b49ed096-fc59-422a-993a-259f96d09946"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:30:29.806934 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.806827 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "b49ed096-fc59-422a-993a-259f96d09946" (UID: "b49ed096-fc59-422a-993a-259f96d09946"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:30:29.807023 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.806987 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b49ed096-fc59-422a-993a-259f96d09946-config-out" (OuterVolumeSpecName: "config-out") pod "b49ed096-fc59-422a-993a-259f96d09946" (UID: "b49ed096-fc59-422a-993a-259f96d09946"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:30:29.807423 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.807281 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "b49ed096-fc59-422a-993a-259f96d09946" (UID: "b49ed096-fc59-422a-993a-259f96d09946"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:30:29.807423 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.807375 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "b49ed096-fc59-422a-993a-259f96d09946" (UID: "b49ed096-fc59-422a-993a-259f96d09946"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:30:29.807645 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.807620 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "b49ed096-fc59-422a-993a-259f96d09946" (UID: "b49ed096-fc59-422a-993a-259f96d09946"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:30:29.807759 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.807732 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "b49ed096-fc59-422a-993a-259f96d09946" (UID: "b49ed096-fc59-422a-993a-259f96d09946"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:30:29.807858 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.807793 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b49ed096-fc59-422a-993a-259f96d09946-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "b49ed096-fc59-422a-993a-259f96d09946" (UID: "b49ed096-fc59-422a-993a-259f96d09946"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:30:29.808216 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.808197 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-config" (OuterVolumeSpecName: "config") pod "b49ed096-fc59-422a-993a-259f96d09946" (UID: "b49ed096-fc59-422a-993a-259f96d09946"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:30:29.821295 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.821264 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-web-config" (OuterVolumeSpecName: "web-config") pod "b49ed096-fc59-422a-993a-259f96d09946" (UID: "b49ed096-fc59-422a-993a-259f96d09946"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:30:29.905520 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.905455 2580 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-metrics-client-certs\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:29.905520 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.905484 2580 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-prometheus-k8s-tls\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:29.905520 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.905494 2580 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-kube-rbac-proxy\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:29.905520 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.905524 2580 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-grpc-tls\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:29.905520 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.905533 2580 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b49ed096-fc59-422a-993a-259f96d09946-prometheus-k8s-db\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:29.905806 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.905544 2580 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b49ed096-fc59-422a-993a-259f96d09946-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:29.905806 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.905554 2580 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/b49ed096-fc59-422a-993a-259f96d09946-tls-assets\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:29.905806 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.905563 2580 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b49ed096-fc59-422a-993a-259f96d09946-config-out\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:29.905806 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.905572 2580 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:29.905806 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.905581 2580 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-web-config\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:29.905806 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.905590 2580 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-config\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:29.905806 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:29.905598 2580 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b49ed096-fc59-422a-993a-259f96d09946-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:30:30.701433 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.700391 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b49ed096-fc59-422a-993a-259f96d09946","Type":"ContainerDied","Data":"254f38514cf9c277d7addb40a9fff1779ce6dd29b9af80a303c00677aa2c95e3"} Apr 20 14:30:30.701433 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.700441 2580 scope.go:117] "RemoveContainer" containerID="fe8bdcfb2ca62621ceffe4bd8769d0c365554c1f203928224f86aae5e1c0b4d9" Apr 20 14:30:30.701433 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.700670 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.710998 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.710974 2580 scope.go:117] "RemoveContainer" containerID="cb070998d128fb3e56a1f61b9a0626c80be8d78f2bbe21a4ce23d2832ec2899d" Apr 20 14:30:30.719657 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.719632 2580 scope.go:117] "RemoveContainer" containerID="d456832db40297a46e3c05115bd9b94ab03a1d67a750ecd5cc99a69feb0fb523" Apr 20 14:30:30.724829 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.724808 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 14:30:30.728171 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.728152 2580 scope.go:117] "RemoveContainer" containerID="caed29d2329376e5c4e1b745a79fe5fd51fb6918474dae87816bd9df93e87b3d" Apr 20 14:30:30.728985 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.728961 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 14:30:30.735840 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.735818 2580 scope.go:117] "RemoveContainer" containerID="133729ed9d7979b91e91e3ced086535439b5d2a9cfcdc0c3ee3f9970a6c21f58" Apr 20 14:30:30.742982 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.742964 2580 scope.go:117] "RemoveContainer" containerID="094eeaa4b1648bd61b9ef04837c68a3bdf98c199e59b5222e7deb0a426486104" Apr 20 14:30:30.750929 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.750907 2580 scope.go:117] "RemoveContainer" containerID="b46689b54626202931fd89ae0bb8df1370910d883efa83a4126c789d594bb32d" Apr 20 14:30:30.757221 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757198 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 14:30:30.757679 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757663 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="kube-rbac-proxy-web" Apr 20 14:30:30.757737 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757682 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="kube-rbac-proxy-web" Apr 20 14:30:30.757737 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757702 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="config-reloader" Apr 20 14:30:30.757737 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757711 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="config-reloader" Apr 20 14:30:30.757737 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757728 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="prometheus" Apr 20 14:30:30.757737 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757736 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="prometheus" Apr 20 14:30:30.757888 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757749 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="kube-rbac-proxy-thanos" Apr 20 14:30:30.757888 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757757 2580 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="kube-rbac-proxy-thanos" Apr 20 14:30:30.757888 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757772 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="init-config-reloader" Apr 20 14:30:30.757888 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757780 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="init-config-reloader" Apr 20 14:30:30.757888 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757799 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="thanos-sidecar" Apr 20 14:30:30.757888 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757807 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="thanos-sidecar" Apr 20 14:30:30.757888 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757816 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="kube-rbac-proxy" Apr 20 14:30:30.757888 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757823 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="kube-rbac-proxy" Apr 20 14:30:30.758117 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757890 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="thanos-sidecar" Apr 20 14:30:30.758117 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757903 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="kube-rbac-proxy" Apr 20 14:30:30.758117 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757914 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="kube-rbac-proxy-thanos" Apr 20 14:30:30.758117 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757927 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="kube-rbac-proxy-web" Apr 20 14:30:30.758117 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757937 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="prometheus" Apr 20 14:30:30.758117 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.757947 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="b49ed096-fc59-422a-993a-259f96d09946" containerName="config-reloader" Apr 20 14:30:30.763358 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.763340 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.765988 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.765963 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 14:30:30.766092 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.766067 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 14:30:30.767049 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.767029 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 14:30:30.767311 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.767299 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 14:30:30.767509 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.767484 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-cid9rocamv4tr\"" Apr 20 14:30:30.767616 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.767601 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 14:30:30.767801 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.767780 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 14:30:30.767889 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.767814 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 14:30:30.768094 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.768078 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 14:30:30.768394 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.768377 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 14:30:30.769933 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.769275 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 14:30:30.771150 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.770651 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-rv2f2\"" Apr 20 14:30:30.771150 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.770811 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 14:30:30.778530 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.778460 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 14:30:30.778867 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.778845 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 14:30:30.782056 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.782037 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 14:30:30.814433 
ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.814401 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db69b60-e33a-4cee-8179-2f467fcf7536-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.814584 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.814439 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.814584 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.814456 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.814584 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.814571 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.814724 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.814614 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.814724 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.814647 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.814724 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.814670 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db69b60-e33a-4cee-8179-2f467fcf7536-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.814724 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.814699 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.814903 ip-10-0-142-166 
kubenswrapper[2580]: I0420 14:30:30.814764 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5db69b60-e33a-4cee-8179-2f467fcf7536-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.814903 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.814803 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.814903 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.814829 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db69b60-e33a-4cee-8179-2f467fcf7536-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.814903 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.814874 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-config\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.814903 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.814889 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5db69b60-e33a-4cee-8179-2f467fcf7536-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.815072 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.814905 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-web-config\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.815072 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.814922 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxrmx\" (UniqueName: \"kubernetes.io/projected/5db69b60-e33a-4cee-8179-2f467fcf7536-kube-api-access-jxrmx\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.815072 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.814944 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5db69b60-e33a-4cee-8179-2f467fcf7536-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.815072 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.815014 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5db69b60-e33a-4cee-8179-2f467fcf7536-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.815072 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.815031 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5db69b60-e33a-4cee-8179-2f467fcf7536-config-out\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.916398 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.916361 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db69b60-e33a-4cee-8179-2f467fcf7536-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.916605 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.916405 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.916605 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.916431 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.916605 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.916474 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.916605 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.916527 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.916605 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.916565 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.916605 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.916592 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db69b60-e33a-4cee-8179-2f467fcf7536-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.916928 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.916621 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.916928 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.916650 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5db69b60-e33a-4cee-8179-2f467fcf7536-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.916928 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.916677 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.916928 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.916703 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db69b60-e33a-4cee-8179-2f467fcf7536-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.916928 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.916745 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-config\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.916928 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.916770 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5db69b60-e33a-4cee-8179-2f467fcf7536-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.916928 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.916797 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-web-config\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.916928 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.916826 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxrmx\" (UniqueName: \"kubernetes.io/projected/5db69b60-e33a-4cee-8179-2f467fcf7536-kube-api-access-jxrmx\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.916928 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.916865 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/5db69b60-e33a-4cee-8179-2f467fcf7536-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.916928 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.916897 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5db69b60-e33a-4cee-8179-2f467fcf7536-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.916928 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.916920 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5db69b60-e33a-4cee-8179-2f467fcf7536-config-out\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.917471 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.917225 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db69b60-e33a-4cee-8179-2f467fcf7536-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.919176 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.918062 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db69b60-e33a-4cee-8179-2f467fcf7536-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.919176 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.918419 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5db69b60-e33a-4cee-8179-2f467fcf7536-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.919443 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.919416 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/5db69b60-e33a-4cee-8179-2f467fcf7536-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.919581 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.919530 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5db69b60-e33a-4cee-8179-2f467fcf7536-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.919659 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.919637 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5db69b60-e33a-4cee-8179-2f467fcf7536-config-out\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.919893 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.919786 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.920228 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.919984 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.920313 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.920234 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.921136 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.921094 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.921539 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.921521 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-web-config\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.921864 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.921840 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.922091 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.922070 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5db69b60-e33a-4cee-8179-2f467fcf7536-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.922151 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.922072 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.922421 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.922389 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") 
" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.922545 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.922524 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5db69b60-e33a-4cee-8179-2f467fcf7536-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.922625 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.922599 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5db69b60-e33a-4cee-8179-2f467fcf7536-config\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:30.927677 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:30.927650 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxrmx\" (UniqueName: \"kubernetes.io/projected/5db69b60-e33a-4cee-8179-2f467fcf7536-kube-api-access-jxrmx\") pod \"prometheus-k8s-0\" (UID: \"5db69b60-e33a-4cee-8179-2f467fcf7536\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:31.075208 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:31.075181 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:30:31.210389 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:31.210186 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 14:30:31.212836 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:30:31.212804 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5db69b60_e33a_4cee_8179_2f467fcf7536.slice/crio-56c225d1127bb6270cb9529d0e2faa64a8646f29e36ffde62eecc6d983184cde WatchSource:0}: Error finding container 56c225d1127bb6270cb9529d0e2faa64a8646f29e36ffde62eecc6d983184cde: Status 404 returned error can't find the container with id 56c225d1127bb6270cb9529d0e2faa64a8646f29e36ffde62eecc6d983184cde Apr 20 14:30:31.706532 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:31.706483 2580 generic.go:358] "Generic (PLEG): container finished" podID="5db69b60-e33a-4cee-8179-2f467fcf7536" containerID="8930f7586132e212582d2c7c0e098e4f4073c5854e2b86a83e3b2a5093a956a1" exitCode=0 Apr 20 14:30:31.706966 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:31.706579 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5db69b60-e33a-4cee-8179-2f467fcf7536","Type":"ContainerDied","Data":"8930f7586132e212582d2c7c0e098e4f4073c5854e2b86a83e3b2a5093a956a1"} Apr 20 14:30:31.706966 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:31.706616 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5db69b60-e33a-4cee-8179-2f467fcf7536","Type":"ContainerStarted","Data":"56c225d1127bb6270cb9529d0e2faa64a8646f29e36ffde62eecc6d983184cde"} Apr 20 14:30:31.935246 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:31.935211 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b49ed096-fc59-422a-993a-259f96d09946" path="/var/lib/kubelet/pods/b49ed096-fc59-422a-993a-259f96d09946/volumes" Apr 20 14:30:32.712800 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:32.712760 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"5db69b60-e33a-4cee-8179-2f467fcf7536","Type":"ContainerStarted","Data":"f9612b26e3b308d83bd1a8e64c2ef386eb45f84a1766a8519f48bcaf06ac2586"} Apr 20 14:30:32.712800 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:32.712801 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5db69b60-e33a-4cee-8179-2f467fcf7536","Type":"ContainerStarted","Data":"a580a2329cf258cf9b582d868c107090dd7d50122cd3e69a122787357f485558"} Apr 20 14:30:32.713311 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:32.712816 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5db69b60-e33a-4cee-8179-2f467fcf7536","Type":"ContainerStarted","Data":"13037146d02889d539414eeb393b5b2e4b99f35610649bc56503d25455c5e751"} Apr 20 14:30:32.713311 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:32.712829 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5db69b60-e33a-4cee-8179-2f467fcf7536","Type":"ContainerStarted","Data":"9bc9ebd9aedb5764df4976bb0baf03715faf11bd9a93e832f88ca54c75663e3e"} Apr 20 14:30:32.713311 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:32.712841 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5db69b60-e33a-4cee-8179-2f467fcf7536","Type":"ContainerStarted","Data":"13ef7bdf1e87c4fe9ce307729b2b61c23002ba3f17c64925ae21dc679b806623"} Apr 20 14:30:32.713311 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:32.712852 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"5db69b60-e33a-4cee-8179-2f467fcf7536","Type":"ContainerStarted","Data":"3f1c62d8452e7f07d7b953be9c466e42818c19434ecf4fc47f328fcaebd39425"} Apr 20 14:30:32.714557 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:32.714533 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" event={"ID":"6e7a48f3-a413-4589-ab1d-85e3cf805195","Type":"ContainerStarted","Data":"363df6a41d329c710f1f7ee0cf2c6ba7d5545a215549dbbaf56118c4e9a5b95d"} Apr 20 14:30:32.714557 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:32.714560 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" event={"ID":"6e7a48f3-a413-4589-ab1d-85e3cf805195","Type":"ContainerStarted","Data":"e2aaf6edd02f6c229f46f47eb1dbfe39f8e685c40ee77f6b16ec2305a4d3fce3"} Apr 20 14:30:32.714755 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:32.714569 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" event={"ID":"6e7a48f3-a413-4589-ab1d-85e3cf805195","Type":"ContainerStarted","Data":"a111307a098ff69ba8c3d7cfc6e354869019354dd6d096c531135d1dbd2c0d4e"} Apr 20 14:30:32.768116 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:32.768021 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.768001329 podStartE2EDuration="2.768001329s" podCreationTimestamp="2026-04-20 14:30:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:30:32.767116963 +0000 UTC m=+219.487418411" watchObservedRunningTime="2026-04-20 14:30:32.768001329 +0000 UTC m=+219.488302761" Apr 20 14:30:32.805711 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:32.805655 2580 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-864bbc56b7-v5shp" podStartSLOduration=1.441153482 podStartE2EDuration="3.805640469s" podCreationTimestamp="2026-04-20 14:30:29 +0000 UTC" firstStartedPulling="2026-04-20 14:30:29.629669568 +0000 UTC m=+216.349970978" lastFinishedPulling="2026-04-20 14:30:31.994156542 +0000 UTC m=+218.714457965" observedRunningTime="2026-04-20 14:30:32.805398496 +0000 UTC m=+219.525699941" watchObservedRunningTime="2026-04-20 14:30:32.805640469 +0000 UTC m=+219.525941913" Apr 20 14:30:36.075985 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:30:36.075951 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:31:31.075995 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:31:31.075910 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:31:31.091167 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:31:31.091140 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:31:31.914231 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:31:31.914201 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:31:53.820959 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:31:53.820929 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s4vzx_b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad/ovn-acl-logging/0.log" Apr 20 14:31:53.821612 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:31:53.821582 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s4vzx_b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad/ovn-acl-logging/0.log" Apr 20 14:31:53.825842 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:31:53.825817 2580 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 14:34:26.766328 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:26.766242 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-pstpn"] Apr 20 14:34:26.769558 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:26.769539 2580 util.go:30] "No sandbox for pod can be found. 
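[annotation] The probe entries above show prometheus-k8s-0 going readiness "not ready" at 14:30:36, startup "unhealthy" then "started" at 14:31:31, and readiness "ready" at 14:31:31.914 — roughly a minute from container start to ready, which is ordinary for a Prometheus startup probe. A short sketch (same hypothetical "kubelet.log" capture, entries one per line) that folds the 'SyncLoop (probe)' entries into per-pod probe histories:

    import re
    from collections import defaultdict

    PROBE = re.compile(
        r'^(\w+ +\d+ \d+:\d+:\d+\.\d+) .*"SyncLoop \(probe\)" '
        r'probe="(\w+)" status="([^"]*)" pod="([^"]+)"')

    history = defaultdict(list)
    with open("kubelet.log") as f:  # hypothetical file name
        for line in f:
            if m := PROBE.search(line):
                ts, probe, status, pod = m.groups()
                history[pod].append((ts, probe, status))

    # Print each pod's probe transitions in journal order.
    for pod, events in history.items():
        print(pod)
        for ts, probe, status in events:
            print(f"  {ts}  {probe:<10} -> {status}")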
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-pstpn" Apr 20 14:34:26.772103 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:26.772085 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 14:34:26.772365 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:26.772344 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-6w2pj\"" Apr 20 14:34:26.772801 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:26.772787 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 14:34:26.779787 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:26.779759 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-pstpn"] Apr 20 14:34:26.869567 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:26.869526 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5a3f50d-829c-415e-8f5c-e231b16b429f-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-pstpn\" (UID: \"e5a3f50d-829c-415e-8f5c-e231b16b429f\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pstpn" Apr 20 14:34:26.869739 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:26.869585 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74rn4\" (UniqueName: \"kubernetes.io/projected/e5a3f50d-829c-415e-8f5c-e231b16b429f-kube-api-access-74rn4\") pod \"cert-manager-cainjector-8966b78d4-pstpn\" (UID: \"e5a3f50d-829c-415e-8f5c-e231b16b429f\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pstpn" Apr 20 14:34:26.970682 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:26.970631 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5a3f50d-829c-415e-8f5c-e231b16b429f-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-pstpn\" (UID: \"e5a3f50d-829c-415e-8f5c-e231b16b429f\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pstpn" Apr 20 14:34:26.970869 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:26.970706 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74rn4\" (UniqueName: \"kubernetes.io/projected/e5a3f50d-829c-415e-8f5c-e231b16b429f-kube-api-access-74rn4\") pod \"cert-manager-cainjector-8966b78d4-pstpn\" (UID: \"e5a3f50d-829c-415e-8f5c-e231b16b429f\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pstpn" Apr 20 14:34:26.984199 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:26.984165 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e5a3f50d-829c-415e-8f5c-e231b16b429f-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-pstpn\" (UID: \"e5a3f50d-829c-415e-8f5c-e231b16b429f\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-pstpn" Apr 20 14:34:26.984364 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:26.984272 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74rn4\" (UniqueName: \"kubernetes.io/projected/e5a3f50d-829c-415e-8f5c-e231b16b429f-kube-api-access-74rn4\") pod \"cert-manager-cainjector-8966b78d4-pstpn\" (UID: \"e5a3f50d-829c-415e-8f5c-e231b16b429f\") " 
pod="cert-manager/cert-manager-cainjector-8966b78d4-pstpn" Apr 20 14:34:27.092605 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:27.092573 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-pstpn" Apr 20 14:34:27.227248 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:27.227174 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-pstpn"] Apr 20 14:34:27.231079 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:34:27.231038 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5a3f50d_829c_415e_8f5c_e231b16b429f.slice/crio-53eb5bf37b1b31b565f96a37a10a92fcdc584c8d8d6c83b37b9149d044dbb5c8 WatchSource:0}: Error finding container 53eb5bf37b1b31b565f96a37a10a92fcdc584c8d8d6c83b37b9149d044dbb5c8: Status 404 returned error can't find the container with id 53eb5bf37b1b31b565f96a37a10a92fcdc584c8d8d6c83b37b9149d044dbb5c8 Apr 20 14:34:27.232959 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:27.232942 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 14:34:27.431383 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:27.431297 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-pstpn" event={"ID":"e5a3f50d-829c-415e-8f5c-e231b16b429f","Type":"ContainerStarted","Data":"53eb5bf37b1b31b565f96a37a10a92fcdc584c8d8d6c83b37b9149d044dbb5c8"} Apr 20 14:34:31.446812 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:31.446781 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-pstpn" event={"ID":"e5a3f50d-829c-415e-8f5c-e231b16b429f","Type":"ContainerStarted","Data":"3adee07d0f7043ff28c37c9eb33aa47ff57476fd11c1ba742358349be05264b8"} Apr 20 14:34:31.462415 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:31.462354 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-pstpn" podStartSLOduration=2.305263503 podStartE2EDuration="5.462331557s" podCreationTimestamp="2026-04-20 14:34:26 +0000 UTC" firstStartedPulling="2026-04-20 14:34:27.233074339 +0000 UTC m=+453.953375749" lastFinishedPulling="2026-04-20 14:34:30.390142379 +0000 UTC m=+457.110443803" observedRunningTime="2026-04-20 14:34:31.460783518 +0000 UTC m=+458.181084961" watchObservedRunningTime="2026-04-20 14:34:31.462331557 +0000 UTC m=+458.182632990" Apr 20 14:34:55.855528 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:55.855463 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-65c545df94-xlml9"] Apr 20 14:34:55.861341 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:55.861317 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-xlml9" Apr 20 14:34:55.863824 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:55.863798 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 14:34:55.864216 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:55.864197 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 14:34:55.864337 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:55.864225 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-wjn8h\"" Apr 20 14:34:55.864385 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:55.864370 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 14:34:55.864419 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:55.864403 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 14:34:55.877235 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:55.877209 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-65c545df94-xlml9"] Apr 20 14:34:55.900510 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:55.900453 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfcrf\" (UniqueName: \"kubernetes.io/projected/c265fc19-3a91-4dc4-9f05-7d671aed51f8-kube-api-access-sfcrf\") pod \"opendatahub-operator-controller-manager-65c545df94-xlml9\" (UID: \"c265fc19-3a91-4dc4-9f05-7d671aed51f8\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-xlml9" Apr 20 14:34:55.900691 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:55.900531 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c265fc19-3a91-4dc4-9f05-7d671aed51f8-webhook-cert\") pod \"opendatahub-operator-controller-manager-65c545df94-xlml9\" (UID: \"c265fc19-3a91-4dc4-9f05-7d671aed51f8\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-xlml9" Apr 20 14:34:55.900691 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:55.900601 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c265fc19-3a91-4dc4-9f05-7d671aed51f8-apiservice-cert\") pod \"opendatahub-operator-controller-manager-65c545df94-xlml9\" (UID: \"c265fc19-3a91-4dc4-9f05-7d671aed51f8\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-xlml9" Apr 20 14:34:56.001803 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:56.001765 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfcrf\" (UniqueName: \"kubernetes.io/projected/c265fc19-3a91-4dc4-9f05-7d671aed51f8-kube-api-access-sfcrf\") pod \"opendatahub-operator-controller-manager-65c545df94-xlml9\" (UID: \"c265fc19-3a91-4dc4-9f05-7d671aed51f8\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-xlml9" Apr 20 14:34:56.001983 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:56.001813 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/c265fc19-3a91-4dc4-9f05-7d671aed51f8-webhook-cert\") pod \"opendatahub-operator-controller-manager-65c545df94-xlml9\" (UID: \"c265fc19-3a91-4dc4-9f05-7d671aed51f8\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-xlml9" Apr 20 14:34:56.001983 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:56.001846 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c265fc19-3a91-4dc4-9f05-7d671aed51f8-apiservice-cert\") pod \"opendatahub-operator-controller-manager-65c545df94-xlml9\" (UID: \"c265fc19-3a91-4dc4-9f05-7d671aed51f8\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-xlml9" Apr 20 14:34:56.004479 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:56.004452 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c265fc19-3a91-4dc4-9f05-7d671aed51f8-apiservice-cert\") pod \"opendatahub-operator-controller-manager-65c545df94-xlml9\" (UID: \"c265fc19-3a91-4dc4-9f05-7d671aed51f8\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-xlml9" Apr 20 14:34:56.004479 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:56.004470 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c265fc19-3a91-4dc4-9f05-7d671aed51f8-webhook-cert\") pod \"opendatahub-operator-controller-manager-65c545df94-xlml9\" (UID: \"c265fc19-3a91-4dc4-9f05-7d671aed51f8\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-xlml9" Apr 20 14:34:56.010845 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:56.010817 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfcrf\" (UniqueName: \"kubernetes.io/projected/c265fc19-3a91-4dc4-9f05-7d671aed51f8-kube-api-access-sfcrf\") pod \"opendatahub-operator-controller-manager-65c545df94-xlml9\" (UID: \"c265fc19-3a91-4dc4-9f05-7d671aed51f8\") " pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-xlml9" Apr 20 14:34:56.173359 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:56.173260 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-xlml9" Apr 20 14:34:56.308126 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:56.308098 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-65c545df94-xlml9"] Apr 20 14:34:56.311087 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:34:56.311051 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc265fc19_3a91_4dc4_9f05_7d671aed51f8.slice/crio-5a965cdf961d4edc4f025fe34f70318f8094eac406c75f89811fa9bdacb8aa88 WatchSource:0}: Error finding container 5a965cdf961d4edc4f025fe34f70318f8094eac406c75f89811fa9bdacb8aa88: Status 404 returned error can't find the container with id 5a965cdf961d4edc4f025fe34f70318f8094eac406c75f89811fa9bdacb8aa88 Apr 20 14:34:56.523973 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:56.523885 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-xlml9" event={"ID":"c265fc19-3a91-4dc4-9f05-7d671aed51f8","Type":"ContainerStarted","Data":"5a965cdf961d4edc4f025fe34f70318f8094eac406c75f89811fa9bdacb8aa88"} Apr 20 14:34:59.535613 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:59.535580 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-xlml9" event={"ID":"c265fc19-3a91-4dc4-9f05-7d671aed51f8","Type":"ContainerStarted","Data":"4152be0f44a82c31e46e5793a0c91c35ddf0c19df625db20d1b9e34d55fef03d"} Apr 20 14:34:59.536037 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:59.535635 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-xlml9" Apr 20 14:34:59.559566 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:34:59.559485 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-xlml9" podStartSLOduration=2.100867744 podStartE2EDuration="4.559469624s" podCreationTimestamp="2026-04-20 14:34:55 +0000 UTC" firstStartedPulling="2026-04-20 14:34:56.312995919 +0000 UTC m=+483.033297332" lastFinishedPulling="2026-04-20 14:34:58.77159779 +0000 UTC m=+485.491899212" observedRunningTime="2026-04-20 14:34:59.55705166 +0000 UTC m=+486.277353089" watchObservedRunningTime="2026-04-20 14:34:59.559469624 +0000 UTC m=+486.279771055" Apr 20 14:35:05.830907 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.830872 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd"] Apr 20 14:35:05.834706 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.834663 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd" Apr 20 14:35:05.836905 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.836884 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:35:05.837035 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.837014 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 14:35:05.837094 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.837024 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 14:35:05.837760 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.837744 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-gcnhf\"" Apr 20 14:35:05.837854 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.837839 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 14:35:05.837916 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.837847 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 14:35:05.844289 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.844264 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd"] Apr 20 14:35:05.874441 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.874413 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b13fc4d6-1b50-4552-abc0-b185e44ca8c7-metrics-cert\") pod \"lws-controller-manager-bc7d4767f-58dhd\" (UID: \"b13fc4d6-1b50-4552-abc0-b185e44ca8c7\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd" Apr 20 14:35:05.874633 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.874483 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdp4s\" (UniqueName: \"kubernetes.io/projected/b13fc4d6-1b50-4552-abc0-b185e44ca8c7-kube-api-access-jdp4s\") pod \"lws-controller-manager-bc7d4767f-58dhd\" (UID: \"b13fc4d6-1b50-4552-abc0-b185e44ca8c7\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd" Apr 20 14:35:05.874633 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.874577 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b13fc4d6-1b50-4552-abc0-b185e44ca8c7-manager-config\") pod \"lws-controller-manager-bc7d4767f-58dhd\" (UID: \"b13fc4d6-1b50-4552-abc0-b185e44ca8c7\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd" Apr 20 14:35:05.874725 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.874637 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b13fc4d6-1b50-4552-abc0-b185e44ca8c7-cert\") pod \"lws-controller-manager-bc7d4767f-58dhd\" (UID: \"b13fc4d6-1b50-4552-abc0-b185e44ca8c7\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd" Apr 20 14:35:05.975803 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.975767 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b13fc4d6-1b50-4552-abc0-b185e44ca8c7-metrics-cert\") pod \"lws-controller-manager-bc7d4767f-58dhd\" (UID: \"b13fc4d6-1b50-4552-abc0-b185e44ca8c7\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd" Apr 20 14:35:05.976017 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.975869 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdp4s\" (UniqueName: \"kubernetes.io/projected/b13fc4d6-1b50-4552-abc0-b185e44ca8c7-kube-api-access-jdp4s\") pod \"lws-controller-manager-bc7d4767f-58dhd\" (UID: \"b13fc4d6-1b50-4552-abc0-b185e44ca8c7\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd" Apr 20 14:35:05.976017 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.975895 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b13fc4d6-1b50-4552-abc0-b185e44ca8c7-manager-config\") pod \"lws-controller-manager-bc7d4767f-58dhd\" (UID: \"b13fc4d6-1b50-4552-abc0-b185e44ca8c7\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd" Apr 20 14:35:05.976017 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.976000 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b13fc4d6-1b50-4552-abc0-b185e44ca8c7-cert\") pod \"lws-controller-manager-bc7d4767f-58dhd\" (UID: \"b13fc4d6-1b50-4552-abc0-b185e44ca8c7\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd" Apr 20 14:35:05.976648 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.976615 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b13fc4d6-1b50-4552-abc0-b185e44ca8c7-manager-config\") pod \"lws-controller-manager-bc7d4767f-58dhd\" (UID: \"b13fc4d6-1b50-4552-abc0-b185e44ca8c7\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd" Apr 20 14:35:05.978862 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.978844 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b13fc4d6-1b50-4552-abc0-b185e44ca8c7-metrics-cert\") pod \"lws-controller-manager-bc7d4767f-58dhd\" (UID: \"b13fc4d6-1b50-4552-abc0-b185e44ca8c7\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd" Apr 20 14:35:05.978924 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.978890 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b13fc4d6-1b50-4552-abc0-b185e44ca8c7-cert\") pod \"lws-controller-manager-bc7d4767f-58dhd\" (UID: \"b13fc4d6-1b50-4552-abc0-b185e44ca8c7\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd" Apr 20 14:35:05.993010 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:05.992975 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdp4s\" (UniqueName: \"kubernetes.io/projected/b13fc4d6-1b50-4552-abc0-b185e44ca8c7-kube-api-access-jdp4s\") pod \"lws-controller-manager-bc7d4767f-58dhd\" (UID: \"b13fc4d6-1b50-4552-abc0-b185e44ca8c7\") " pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd" Apr 20 14:35:06.144873 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:06.144784 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd" Apr 20 14:35:06.280007 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:06.279984 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd"] Apr 20 14:35:06.282609 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:35:06.282581 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb13fc4d6_1b50_4552_abc0_b185e44ca8c7.slice/crio-8d83df7199cc5bab6e9711917833a101382363a02da1d530b168387f1cdf5942 WatchSource:0}: Error finding container 8d83df7199cc5bab6e9711917833a101382363a02da1d530b168387f1cdf5942: Status 404 returned error can't find the container with id 8d83df7199cc5bab6e9711917833a101382363a02da1d530b168387f1cdf5942 Apr 20 14:35:06.560557 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:06.560456 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd" event={"ID":"b13fc4d6-1b50-4552-abc0-b185e44ca8c7","Type":"ContainerStarted","Data":"8d83df7199cc5bab6e9711917833a101382363a02da1d530b168387f1cdf5942"} Apr 20 14:35:09.572614 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:09.572573 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd" event={"ID":"b13fc4d6-1b50-4552-abc0-b185e44ca8c7","Type":"ContainerStarted","Data":"d9d833dc772f3efa63fe66370374a37d68be0e5e6669e9d638d7b5a4c3136083"} Apr 20 14:35:09.573003 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:09.572702 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd" Apr 20 14:35:09.592013 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:09.591960 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd" podStartSLOduration=1.951757011 podStartE2EDuration="4.591945968s" podCreationTimestamp="2026-04-20 14:35:05 +0000 UTC" firstStartedPulling="2026-04-20 14:35:06.284382421 +0000 UTC m=+493.004683831" lastFinishedPulling="2026-04-20 14:35:08.924571379 +0000 UTC m=+495.644872788" observedRunningTime="2026-04-20 14:35:09.59110541 +0000 UTC m=+496.311406867" watchObservedRunningTime="2026-04-20 14:35:09.591945968 +0000 UTC m=+496.312247428" Apr 20 14:35:10.542957 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:10.542927 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-65c545df94-xlml9" Apr 20 14:35:14.609432 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:14.609395 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-87db58fcf-7gt29"] Apr 20 14:35:14.612735 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:14.612716 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-87db58fcf-7gt29" Apr 20 14:35:14.616049 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:14.616019 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-sq2fv\"" Apr 20 14:35:14.616205 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:14.616058 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 14:35:14.616205 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:14.616147 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 14:35:14.616205 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:14.616150 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 14:35:14.616356 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:14.616272 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 14:35:14.622558 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:14.622536 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-87db58fcf-7gt29"] Apr 20 14:35:14.654129 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:14.654095 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt7s4\" (UniqueName: \"kubernetes.io/projected/7f509089-8b77-4fa4-8afb-ed3227e0f2d2-kube-api-access-pt7s4\") pod \"kube-auth-proxy-87db58fcf-7gt29\" (UID: \"7f509089-8b77-4fa4-8afb-ed3227e0f2d2\") " pod="openshift-ingress/kube-auth-proxy-87db58fcf-7gt29" Apr 20 14:35:14.654307 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:14.654151 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f509089-8b77-4fa4-8afb-ed3227e0f2d2-tls-certs\") pod \"kube-auth-proxy-87db58fcf-7gt29\" (UID: \"7f509089-8b77-4fa4-8afb-ed3227e0f2d2\") " pod="openshift-ingress/kube-auth-proxy-87db58fcf-7gt29" Apr 20 14:35:14.654307 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:14.654173 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f509089-8b77-4fa4-8afb-ed3227e0f2d2-tmp\") pod \"kube-auth-proxy-87db58fcf-7gt29\" (UID: \"7f509089-8b77-4fa4-8afb-ed3227e0f2d2\") " pod="openshift-ingress/kube-auth-proxy-87db58fcf-7gt29" Apr 20 14:35:14.755301 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:14.755256 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f509089-8b77-4fa4-8afb-ed3227e0f2d2-tmp\") pod \"kube-auth-proxy-87db58fcf-7gt29\" (UID: \"7f509089-8b77-4fa4-8afb-ed3227e0f2d2\") " pod="openshift-ingress/kube-auth-proxy-87db58fcf-7gt29" Apr 20 14:35:14.755453 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:14.755379 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pt7s4\" (UniqueName: \"kubernetes.io/projected/7f509089-8b77-4fa4-8afb-ed3227e0f2d2-kube-api-access-pt7s4\") pod \"kube-auth-proxy-87db58fcf-7gt29\" (UID: \"7f509089-8b77-4fa4-8afb-ed3227e0f2d2\") " pod="openshift-ingress/kube-auth-proxy-87db58fcf-7gt29" Apr 20 14:35:14.755453 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:14.755440 
2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f509089-8b77-4fa4-8afb-ed3227e0f2d2-tls-certs\") pod \"kube-auth-proxy-87db58fcf-7gt29\" (UID: \"7f509089-8b77-4fa4-8afb-ed3227e0f2d2\") " pod="openshift-ingress/kube-auth-proxy-87db58fcf-7gt29" Apr 20 14:35:14.757787 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:14.757758 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f509089-8b77-4fa4-8afb-ed3227e0f2d2-tmp\") pod \"kube-auth-proxy-87db58fcf-7gt29\" (UID: \"7f509089-8b77-4fa4-8afb-ed3227e0f2d2\") " pod="openshift-ingress/kube-auth-proxy-87db58fcf-7gt29" Apr 20 14:35:14.757915 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:14.757878 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7f509089-8b77-4fa4-8afb-ed3227e0f2d2-tls-certs\") pod \"kube-auth-proxy-87db58fcf-7gt29\" (UID: \"7f509089-8b77-4fa4-8afb-ed3227e0f2d2\") " pod="openshift-ingress/kube-auth-proxy-87db58fcf-7gt29" Apr 20 14:35:14.766608 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:14.766573 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt7s4\" (UniqueName: \"kubernetes.io/projected/7f509089-8b77-4fa4-8afb-ed3227e0f2d2-kube-api-access-pt7s4\") pod \"kube-auth-proxy-87db58fcf-7gt29\" (UID: \"7f509089-8b77-4fa4-8afb-ed3227e0f2d2\") " pod="openshift-ingress/kube-auth-proxy-87db58fcf-7gt29" Apr 20 14:35:14.924346 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:14.924252 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-87db58fcf-7gt29" Apr 20 14:35:15.048627 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:15.048600 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-87db58fcf-7gt29"] Apr 20 14:35:15.051262 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:35:15.051236 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f509089_8b77_4fa4_8afb_ed3227e0f2d2.slice/crio-08404496eb782c6bbc8680919cfe62b6a334bdf5d73739458c058eb194bfd48e WatchSource:0}: Error finding container 08404496eb782c6bbc8680919cfe62b6a334bdf5d73739458c058eb194bfd48e: Status 404 returned error can't find the container with id 08404496eb782c6bbc8680919cfe62b6a334bdf5d73739458c058eb194bfd48e Apr 20 14:35:15.594412 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:15.594376 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-87db58fcf-7gt29" event={"ID":"7f509089-8b77-4fa4-8afb-ed3227e0f2d2","Type":"ContainerStarted","Data":"08404496eb782c6bbc8680919cfe62b6a334bdf5d73739458c058eb194bfd48e"} Apr 20 14:35:19.613482 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:19.613440 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-87db58fcf-7gt29" event={"ID":"7f509089-8b77-4fa4-8afb-ed3227e0f2d2","Type":"ContainerStarted","Data":"cc716b016ab65535699dfcef482f0d4ef7ce1bf528a246915a5065cce8a86e90"} Apr 20 14:35:19.630340 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:19.630289 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-87db58fcf-7gt29" podStartSLOduration=1.574240572 podStartE2EDuration="5.630276613s" podCreationTimestamp="2026-04-20 14:35:14 +0000 UTC" 
firstStartedPulling="2026-04-20 14:35:15.052871127 +0000 UTC m=+501.773172536" lastFinishedPulling="2026-04-20 14:35:19.108907164 +0000 UTC m=+505.829208577" observedRunningTime="2026-04-20 14:35:19.628049345 +0000 UTC m=+506.348350777" watchObservedRunningTime="2026-04-20 14:35:19.630276613 +0000 UTC m=+506.350578091" Apr 20 14:35:20.578474 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:35:20.578444 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-bc7d4767f-58dhd" Apr 20 14:36:53.848459 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:53.848427 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s4vzx_b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad/ovn-acl-logging/0.log" Apr 20 14:36:53.849351 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:53.849328 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s4vzx_b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad/ovn-acl-logging/0.log" Apr 20 14:36:59.025093 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:59.025060 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tzcgz"] Apr 20 14:36:59.028382 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:59.028362 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tzcgz" Apr 20 14:36:59.030735 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:59.030711 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 20 14:36:59.030850 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:59.030752 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 14:36:59.030850 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:59.030781 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 14:36:59.031465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:59.031450 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 20 14:36:59.031541 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:59.031453 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-jksts\"" Apr 20 14:36:59.036760 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:59.036737 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tzcgz"] Apr 20 14:36:59.122908 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:59.122868 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkrp4\" (UniqueName: \"kubernetes.io/projected/7d0159b6-53a2-408e-b7e6-777d32d1af26-kube-api-access-zkrp4\") pod \"kuadrant-console-plugin-6cb54b5c86-tzcgz\" (UID: \"7d0159b6-53a2-408e-b7e6-777d32d1af26\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tzcgz" Apr 20 14:36:59.123083 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:59.122930 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7d0159b6-53a2-408e-b7e6-777d32d1af26-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-tzcgz\" (UID: 
\"7d0159b6-53a2-408e-b7e6-777d32d1af26\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tzcgz" Apr 20 14:36:59.123083 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:59.122990 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d0159b6-53a2-408e-b7e6-777d32d1af26-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-tzcgz\" (UID: \"7d0159b6-53a2-408e-b7e6-777d32d1af26\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tzcgz" Apr 20 14:36:59.224014 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:59.223973 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d0159b6-53a2-408e-b7e6-777d32d1af26-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-tzcgz\" (UID: \"7d0159b6-53a2-408e-b7e6-777d32d1af26\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tzcgz" Apr 20 14:36:59.224189 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:59.224038 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zkrp4\" (UniqueName: \"kubernetes.io/projected/7d0159b6-53a2-408e-b7e6-777d32d1af26-kube-api-access-zkrp4\") pod \"kuadrant-console-plugin-6cb54b5c86-tzcgz\" (UID: \"7d0159b6-53a2-408e-b7e6-777d32d1af26\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tzcgz" Apr 20 14:36:59.224189 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:59.224070 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7d0159b6-53a2-408e-b7e6-777d32d1af26-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-tzcgz\" (UID: \"7d0159b6-53a2-408e-b7e6-777d32d1af26\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tzcgz" Apr 20 14:36:59.224189 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:36:59.224126 2580 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 20 14:36:59.224305 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:36:59.224210 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d0159b6-53a2-408e-b7e6-777d32d1af26-plugin-serving-cert podName:7d0159b6-53a2-408e-b7e6-777d32d1af26 nodeName:}" failed. No retries permitted until 2026-04-20 14:36:59.724187492 +0000 UTC m=+606.444488902 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/7d0159b6-53a2-408e-b7e6-777d32d1af26-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-tzcgz" (UID: "7d0159b6-53a2-408e-b7e6-777d32d1af26") : secret "plugin-serving-cert" not found Apr 20 14:36:59.224631 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:59.224613 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/7d0159b6-53a2-408e-b7e6-777d32d1af26-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-tzcgz\" (UID: \"7d0159b6-53a2-408e-b7e6-777d32d1af26\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tzcgz" Apr 20 14:36:59.253454 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:59.253417 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkrp4\" (UniqueName: \"kubernetes.io/projected/7d0159b6-53a2-408e-b7e6-777d32d1af26-kube-api-access-zkrp4\") pod \"kuadrant-console-plugin-6cb54b5c86-tzcgz\" (UID: \"7d0159b6-53a2-408e-b7e6-777d32d1af26\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tzcgz" Apr 20 14:36:59.729477 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:59.729427 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d0159b6-53a2-408e-b7e6-777d32d1af26-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-tzcgz\" (UID: \"7d0159b6-53a2-408e-b7e6-777d32d1af26\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tzcgz" Apr 20 14:36:59.731960 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:59.731928 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d0159b6-53a2-408e-b7e6-777d32d1af26-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-tzcgz\" (UID: \"7d0159b6-53a2-408e-b7e6-777d32d1af26\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tzcgz" Apr 20 14:36:59.938782 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:36:59.938745 2580 util.go:30] "No sandbox for pod can be found. 
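[annotation] The plugin-serving-cert failure above is self-healing: the mount failed at 14:36:59.224 because the secret did not exist yet, the kubelet scheduled a retry no earlier than 14:36:59.724 (durationBeforeRetry 500ms), retried at 14:36:59.729, and succeeded at 14:36:59.731 — about half a second end to end, consistent with the serving-cert secret being created almost simultaneously with the pod. A sketch (same hypothetical "kubelet.log") that pairs each 'MountVolume.SetUp failed' error with the next success for the same volume and reports the recovery time:

    import re
    from datetime import datetime

    TS = re.compile(r"^(\w+ +\d+ \d+:\d+:\d+\.\d+)")
    # The E-level error text uses plain quotes; the I-level success uses \"...\".
    FAILED = re.compile(r'MountVolume\.SetUp failed for volume "([^"]+)"')
    OK = re.compile(r'MountVolume\.SetUp succeeded for volume \\?"([^"\\]+)\\?"')

    def stamp(line):
        m = TS.match(line)
        # The journal prefix omits the year; 2026 is taken from the message bodies.
        return datetime.strptime("2026 " + m.group(1), "%Y %b %d %H:%M:%S.%f")

    pending = {}
    with open("kubelet.log") as f:  # hypothetical file name
        for line in f:
            if m := FAILED.search(line):
                pending.setdefault(m.group(1), stamp(line))
            elif (m := OK.search(line)) and m.group(1) in pending:
                dt = stamp(line) - pending.pop(m.group(1))
                print(f"{m.group(1)}: recovered after {dt.total_seconds():.3f}s")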
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tzcgz" Apr 20 14:37:00.062142 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:00.062113 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tzcgz"] Apr 20 14:37:00.064960 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:37:00.064932 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d0159b6_53a2_408e_b7e6_777d32d1af26.slice/crio-9164d044462b2e5ff1b319347ee245323dad0e1a4c951dc13b76b221c432a091 WatchSource:0}: Error finding container 9164d044462b2e5ff1b319347ee245323dad0e1a4c951dc13b76b221c432a091: Status 404 returned error can't find the container with id 9164d044462b2e5ff1b319347ee245323dad0e1a4c951dc13b76b221c432a091 Apr 20 14:37:00.973728 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:00.973689 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tzcgz" event={"ID":"7d0159b6-53a2-408e-b7e6-777d32d1af26","Type":"ContainerStarted","Data":"9164d044462b2e5ff1b319347ee245323dad0e1a4c951dc13b76b221c432a091"} Apr 20 14:37:08.655760 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:08.655724 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h"] Apr 20 14:37:08.659529 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:08.659491 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h" Apr 20 14:37:08.663439 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:08.663418 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-dx4gf\"" Apr 20 14:37:08.673705 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:08.673673 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h"] Apr 20 14:37:08.816326 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:08.816271 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2608242f-1c1d-415e-9615-b7dabf901d74-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-sqc8h\" (UID: \"2608242f-1c1d-415e-9615-b7dabf901d74\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h" Apr 20 14:37:08.816548 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:08.816433 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wx2w\" (UniqueName: \"kubernetes.io/projected/2608242f-1c1d-415e-9615-b7dabf901d74-kube-api-access-7wx2w\") pod \"kuadrant-operator-controller-manager-84b657d985-sqc8h\" (UID: \"2608242f-1c1d-415e-9615-b7dabf901d74\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h" Apr 20 14:37:08.917429 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:08.917332 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2608242f-1c1d-415e-9615-b7dabf901d74-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-sqc8h\" (UID: \"2608242f-1c1d-415e-9615-b7dabf901d74\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h" Apr 20 14:37:08.917620 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:08.917485 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wx2w\" (UniqueName: \"kubernetes.io/projected/2608242f-1c1d-415e-9615-b7dabf901d74-kube-api-access-7wx2w\") pod \"kuadrant-operator-controller-manager-84b657d985-sqc8h\" (UID: \"2608242f-1c1d-415e-9615-b7dabf901d74\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h" Apr 20 14:37:08.917996 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:08.917965 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2608242f-1c1d-415e-9615-b7dabf901d74-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-sqc8h\" (UID: \"2608242f-1c1d-415e-9615-b7dabf901d74\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h" Apr 20 14:37:08.929737 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:08.929700 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wx2w\" (UniqueName: \"kubernetes.io/projected/2608242f-1c1d-415e-9615-b7dabf901d74-kube-api-access-7wx2w\") pod \"kuadrant-operator-controller-manager-84b657d985-sqc8h\" (UID: \"2608242f-1c1d-415e-9615-b7dabf901d74\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h" Apr 20 14:37:08.970259 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:08.970213 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h" Apr 20 14:37:09.100185 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:09.100061 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h"] Apr 20 14:37:09.104269 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:37:09.104233 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2608242f_1c1d_415e_9615_b7dabf901d74.slice/crio-02b5596e34525ec88327b276622afd22158e6c94f133a7b0f618c5ea76336098 WatchSource:0}: Error finding container 02b5596e34525ec88327b276622afd22158e6c94f133a7b0f618c5ea76336098: Status 404 returned error can't find the container with id 02b5596e34525ec88327b276622afd22158e6c94f133a7b0f618c5ea76336098 Apr 20 14:37:09.251166 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:09.251083 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h"] Apr 20 14:37:09.268059 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:09.268029 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h"] Apr 20 14:37:29.089868 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:29.089828 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h" podUID="2608242f-1c1d-415e-9615-b7dabf901d74" containerName="manager" containerID="cri-o://0becff61efb104741e445c4e494aae44ca71dd76ecc8ab57c68c7ba677107633" gracePeriod=2 Apr 20 14:37:29.091232 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:29.091201 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tzcgz" 
event={"ID":"7d0159b6-53a2-408e-b7e6-777d32d1af26","Type":"ContainerStarted","Data":"44dd900f79b9c35daf69c463e56001bc3826174a1fba02117f51e8ffd760ad1a"} Apr 20 14:37:29.092273 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:29.092249 2580 status_manager.go:895] "Failed to get status for pod" podUID="2608242f-1c1d-415e-9615-b7dabf901d74" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h" err="pods \"kuadrant-operator-controller-manager-84b657d985-sqc8h\" is forbidden: User \"system:node:ip-10-0-142-166.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-166.ec2.internal' and this object" Apr 20 14:37:29.094034 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:29.094008 2580 status_manager.go:895] "Failed to get status for pod" podUID="2608242f-1c1d-415e-9615-b7dabf901d74" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h" err="pods \"kuadrant-operator-controller-manager-84b657d985-sqc8h\" is forbidden: User \"system:node:ip-10-0-142-166.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-166.ec2.internal' and this object" Apr 20 14:37:29.110816 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:29.110762 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-tzcgz" podStartSLOduration=1.585963731 podStartE2EDuration="30.110746095s" podCreationTimestamp="2026-04-20 14:36:59 +0000 UTC" firstStartedPulling="2026-04-20 14:37:00.06667462 +0000 UTC m=+606.786976033" lastFinishedPulling="2026-04-20 14:37:28.591456985 +0000 UTC m=+635.311758397" observedRunningTime="2026-04-20 14:37:29.108629414 +0000 UTC m=+635.828930846" watchObservedRunningTime="2026-04-20 14:37:29.110746095 +0000 UTC m=+635.831047526" Apr 20 14:37:29.343895 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:29.343825 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h" Apr 20 14:37:29.346030 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:29.345998 2580 status_manager.go:895] "Failed to get status for pod" podUID="2608242f-1c1d-415e-9615-b7dabf901d74" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h" err="pods \"kuadrant-operator-controller-manager-84b657d985-sqc8h\" is forbidden: User \"system:node:ip-10-0-142-166.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-166.ec2.internal' and this object" Apr 20 14:37:29.423132 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:29.423094 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2608242f-1c1d-415e-9615-b7dabf901d74-extensions-socket-volume\") pod \"2608242f-1c1d-415e-9615-b7dabf901d74\" (UID: \"2608242f-1c1d-415e-9615-b7dabf901d74\") " Apr 20 14:37:29.423310 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:29.423217 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wx2w\" (UniqueName: \"kubernetes.io/projected/2608242f-1c1d-415e-9615-b7dabf901d74-kube-api-access-7wx2w\") pod \"2608242f-1c1d-415e-9615-b7dabf901d74\" (UID: \"2608242f-1c1d-415e-9615-b7dabf901d74\") " Apr 20 14:37:29.423420 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:29.423389 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2608242f-1c1d-415e-9615-b7dabf901d74-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "2608242f-1c1d-415e-9615-b7dabf901d74" (UID: "2608242f-1c1d-415e-9615-b7dabf901d74"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:37:29.425349 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:29.425321 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2608242f-1c1d-415e-9615-b7dabf901d74-kube-api-access-7wx2w" (OuterVolumeSpecName: "kube-api-access-7wx2w") pod "2608242f-1c1d-415e-9615-b7dabf901d74" (UID: "2608242f-1c1d-415e-9615-b7dabf901d74"). InnerVolumeSpecName "kube-api-access-7wx2w". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:37:29.524119 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:29.524084 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7wx2w\" (UniqueName: \"kubernetes.io/projected/2608242f-1c1d-415e-9615-b7dabf901d74-kube-api-access-7wx2w\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:37:29.524119 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:29.524118 2580 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/2608242f-1c1d-415e-9615-b7dabf901d74-extensions-socket-volume\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:37:29.932596 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:29.932515 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2608242f-1c1d-415e-9615-b7dabf901d74" path="/var/lib/kubelet/pods/2608242f-1c1d-415e-9615-b7dabf901d74/volumes" Apr 20 14:37:30.095974 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:30.095940 2580 generic.go:358] "Generic (PLEG): container finished" podID="2608242f-1c1d-415e-9615-b7dabf901d74" containerID="0becff61efb104741e445c4e494aae44ca71dd76ecc8ab57c68c7ba677107633" exitCode=2 Apr 20 14:37:30.096428 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:30.095988 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h" Apr 20 14:37:30.096428 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:30.096039 2580 scope.go:117] "RemoveContainer" containerID="0becff61efb104741e445c4e494aae44ca71dd76ecc8ab57c68c7ba677107633" Apr 20 14:37:30.098065 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:30.098018 2580 status_manager.go:895] "Failed to get status for pod" podUID="2608242f-1c1d-415e-9615-b7dabf901d74" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h" err="pods \"kuadrant-operator-controller-manager-84b657d985-sqc8h\" is forbidden: User \"system:node:ip-10-0-142-166.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-166.ec2.internal' and this object" Apr 20 14:37:30.100338 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:30.100313 2580 status_manager.go:895] "Failed to get status for pod" podUID="2608242f-1c1d-415e-9615-b7dabf901d74" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sqc8h" err="pods \"kuadrant-operator-controller-manager-84b657d985-sqc8h\" is forbidden: User \"system:node:ip-10-0-142-166.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-142-166.ec2.internal' and this object" Apr 20 14:37:30.104513 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:30.104478 2580 scope.go:117] "RemoveContainer" containerID="0becff61efb104741e445c4e494aae44ca71dd76ecc8ab57c68c7ba677107633" Apr 20 14:37:30.104810 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:37:30.104791 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0becff61efb104741e445c4e494aae44ca71dd76ecc8ab57c68c7ba677107633\": container with ID starting with 0becff61efb104741e445c4e494aae44ca71dd76ecc8ab57c68c7ba677107633 not found: ID does not exist" containerID="0becff61efb104741e445c4e494aae44ca71dd76ecc8ab57c68c7ba677107633" Apr 20 14:37:30.104876 ip-10-0-142-166 
kubenswrapper[2580]: I0420 14:37:30.104820 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0becff61efb104741e445c4e494aae44ca71dd76ecc8ab57c68c7ba677107633"} err="failed to get container status \"0becff61efb104741e445c4e494aae44ca71dd76ecc8ab57c68c7ba677107633\": rpc error: code = NotFound desc = could not find container \"0becff61efb104741e445c4e494aae44ca71dd76ecc8ab57c68c7ba677107633\": container with ID starting with 0becff61efb104741e445c4e494aae44ca71dd76ecc8ab57c68c7ba677107633 not found: ID does not exist" Apr 20 14:37:51.961649 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:51.961614 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:37:51.962059 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:51.962033 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2608242f-1c1d-415e-9615-b7dabf901d74" containerName="manager" Apr 20 14:37:51.962059 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:51.962045 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="2608242f-1c1d-415e-9615-b7dabf901d74" containerName="manager" Apr 20 14:37:51.962162 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:51.962115 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="2608242f-1c1d-415e-9615-b7dabf901d74" containerName="manager" Apr 20 14:37:51.965084 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:51.965064 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-7bndp" Apr 20 14:37:51.967074 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:51.967056 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 14:37:51.974783 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:51.974756 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:37:52.000456 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:52.000414 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:37:52.027486 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:52.027450 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n96p\" (UniqueName: \"kubernetes.io/projected/e44a62ce-e4b4-4f76-a56e-e45b26bd9b13-kube-api-access-6n96p\") pod \"limitador-limitador-78c99df468-7bndp\" (UID: \"e44a62ce-e4b4-4f76-a56e-e45b26bd9b13\") " pod="kuadrant-system/limitador-limitador-78c99df468-7bndp" Apr 20 14:37:52.027679 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:52.027551 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e44a62ce-e4b4-4f76-a56e-e45b26bd9b13-config-file\") pod \"limitador-limitador-78c99df468-7bndp\" (UID: \"e44a62ce-e4b4-4f76-a56e-e45b26bd9b13\") " pod="kuadrant-system/limitador-limitador-78c99df468-7bndp" Apr 20 14:37:52.128076 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:52.128033 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6n96p\" (UniqueName: \"kubernetes.io/projected/e44a62ce-e4b4-4f76-a56e-e45b26bd9b13-kube-api-access-6n96p\") pod \"limitador-limitador-78c99df468-7bndp\" (UID: \"e44a62ce-e4b4-4f76-a56e-e45b26bd9b13\") " 
pod="kuadrant-system/limitador-limitador-78c99df468-7bndp" Apr 20 14:37:52.128258 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:52.128125 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e44a62ce-e4b4-4f76-a56e-e45b26bd9b13-config-file\") pod \"limitador-limitador-78c99df468-7bndp\" (UID: \"e44a62ce-e4b4-4f76-a56e-e45b26bd9b13\") " pod="kuadrant-system/limitador-limitador-78c99df468-7bndp" Apr 20 14:37:52.128778 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:52.128753 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/e44a62ce-e4b4-4f76-a56e-e45b26bd9b13-config-file\") pod \"limitador-limitador-78c99df468-7bndp\" (UID: \"e44a62ce-e4b4-4f76-a56e-e45b26bd9b13\") " pod="kuadrant-system/limitador-limitador-78c99df468-7bndp" Apr 20 14:37:52.136317 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:52.136296 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n96p\" (UniqueName: \"kubernetes.io/projected/e44a62ce-e4b4-4f76-a56e-e45b26bd9b13-kube-api-access-6n96p\") pod \"limitador-limitador-78c99df468-7bndp\" (UID: \"e44a62ce-e4b4-4f76-a56e-e45b26bd9b13\") " pod="kuadrant-system/limitador-limitador-78c99df468-7bndp" Apr 20 14:37:52.275488 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:52.275387 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-7bndp" Apr 20 14:37:52.402221 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:52.402198 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:37:52.404917 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:37:52.404889 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode44a62ce_e4b4_4f76_a56e_e45b26bd9b13.slice/crio-8314d848d17cdd351dbae990f335cd9a966b7b2199be2c355179e16af2e0206b WatchSource:0}: Error finding container 8314d848d17cdd351dbae990f335cd9a966b7b2199be2c355179e16af2e0206b: Status 404 returned error can't find the container with id 8314d848d17cdd351dbae990f335cd9a966b7b2199be2c355179e16af2e0206b Apr 20 14:37:53.177482 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:53.177433 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-7bndp" event={"ID":"e44a62ce-e4b4-4f76-a56e-e45b26bd9b13","Type":"ContainerStarted","Data":"8314d848d17cdd351dbae990f335cd9a966b7b2199be2c355179e16af2e0206b"} Apr 20 14:37:55.186920 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:55.186882 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-7bndp" event={"ID":"e44a62ce-e4b4-4f76-a56e-e45b26bd9b13","Type":"ContainerStarted","Data":"2d83fa2aff92fc9e6a3c99f779c646a2914c1f40c9c9c158d9a9a61efc15dd31"} Apr 20 14:37:55.187304 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:55.187049 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-7bndp" Apr 20 14:37:55.202080 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:37:55.202029 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-7bndp" podStartSLOduration=1.541900911 podStartE2EDuration="4.202013728s" podCreationTimestamp="2026-04-20 14:37:51 +0000 
UTC" firstStartedPulling="2026-04-20 14:37:52.407296654 +0000 UTC m=+659.127598070" lastFinishedPulling="2026-04-20 14:37:55.067409466 +0000 UTC m=+661.787710887" observedRunningTime="2026-04-20 14:37:55.200766726 +0000 UTC m=+661.921068159" watchObservedRunningTime="2026-04-20 14:37:55.202013728 +0000 UTC m=+661.922315161" Apr 20 14:38:06.192073 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:38:06.192032 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-7bndp" Apr 20 14:38:30.621166 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:38:30.621124 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:39:21.026562 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:39:21.026525 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:39:24.934942 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:39:24.934902 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:39:28.523026 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:39:28.522983 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:39:32.431696 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:39:32.431660 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:39:58.732824 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:39:58.732785 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:40:04.219973 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:40:04.219936 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:40:46.125308 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:40:46.125271 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:40:50.815890 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:40:50.815856 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:40:58.126263 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:40:58.126228 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:41:08.539039 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:08.539000 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:41:16.516662 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:16.516621 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:41:27.924347 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:27.924310 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:41:35.923390 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:35.923356 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:41:46.820992 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:46.820911 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:41:53.873901 ip-10-0-142-166 
kubenswrapper[2580]: I0420 14:41:53.873873 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s4vzx_b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad/ovn-acl-logging/0.log" Apr 20 14:41:53.876763 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:53.876743 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s4vzx_b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad/ovn-acl-logging/0.log" Apr 20 14:41:55.411314 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:55.411281 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-79cb8b9576-xzhb8"] Apr 20 14:41:55.414402 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:55.414385 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-79cb8b9576-xzhb8" Apr 20 14:41:55.416596 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:55.416577 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-sc5bg\"" Apr 20 14:41:55.422400 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:55.422378 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-79cb8b9576-xzhb8"] Apr 20 14:41:55.527877 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:55.527842 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psp2j\" (UniqueName: \"kubernetes.io/projected/9094e779-d79f-4892-95c8-8233b6c66b58-kube-api-access-psp2j\") pod \"maas-controller-79cb8b9576-xzhb8\" (UID: \"9094e779-d79f-4892-95c8-8233b6c66b58\") " pod="opendatahub/maas-controller-79cb8b9576-xzhb8" Apr 20 14:41:55.628606 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:55.628568 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-psp2j\" (UniqueName: \"kubernetes.io/projected/9094e779-d79f-4892-95c8-8233b6c66b58-kube-api-access-psp2j\") pod \"maas-controller-79cb8b9576-xzhb8\" (UID: \"9094e779-d79f-4892-95c8-8233b6c66b58\") " pod="opendatahub/maas-controller-79cb8b9576-xzhb8" Apr 20 14:41:55.636379 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:55.636349 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-psp2j\" (UniqueName: \"kubernetes.io/projected/9094e779-d79f-4892-95c8-8233b6c66b58-kube-api-access-psp2j\") pod \"maas-controller-79cb8b9576-xzhb8\" (UID: \"9094e779-d79f-4892-95c8-8233b6c66b58\") " pod="opendatahub/maas-controller-79cb8b9576-xzhb8" Apr 20 14:41:55.725394 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:55.725298 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-79cb8b9576-xzhb8" Apr 20 14:41:56.052896 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:56.052871 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-79cb8b9576-xzhb8"] Apr 20 14:41:56.055241 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:41:56.055216 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9094e779_d79f_4892_95c8_8233b6c66b58.slice/crio-25afc247d39cd260962ba27ce797930c48be62d83e526d2b80833b2645752b54 WatchSource:0}: Error finding container 25afc247d39cd260962ba27ce797930c48be62d83e526d2b80833b2645752b54: Status 404 returned error can't find the container with id 25afc247d39cd260962ba27ce797930c48be62d83e526d2b80833b2645752b54 Apr 20 14:41:56.056465 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:56.056444 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 14:41:57.044936 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:57.044893 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-79cb8b9576-xzhb8" event={"ID":"9094e779-d79f-4892-95c8-8233b6c66b58","Type":"ContainerStarted","Data":"25afc247d39cd260962ba27ce797930c48be62d83e526d2b80833b2645752b54"} Apr 20 14:41:59.053931 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:59.053896 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-79cb8b9576-xzhb8" event={"ID":"9094e779-d79f-4892-95c8-8233b6c66b58","Type":"ContainerStarted","Data":"b961dabf68da9235a837eff030ada09559725c2c6b25e112faea676609eb64f5"} Apr 20 14:41:59.054303 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:59.053958 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-79cb8b9576-xzhb8" Apr 20 14:41:59.068519 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:41:59.068451 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-79cb8b9576-xzhb8" podStartSLOduration=1.782316771 podStartE2EDuration="4.068439393s" podCreationTimestamp="2026-04-20 14:41:55 +0000 UTC" firstStartedPulling="2026-04-20 14:41:56.056626049 +0000 UTC m=+902.776927460" lastFinishedPulling="2026-04-20 14:41:58.342748669 +0000 UTC m=+905.063050082" observedRunningTime="2026-04-20 14:41:59.068067317 +0000 UTC m=+905.788368774" watchObservedRunningTime="2026-04-20 14:41:59.068439393 +0000 UTC m=+905.788740825" Apr 20 14:42:10.065668 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:42:10.065634 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-79cb8b9576-xzhb8" Apr 20 14:42:51.122063 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:42:51.122026 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:43:07.268157 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:43:07.268118 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:43:45.021456 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:43:45.021375 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:44:02.618880 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:44:02.618847 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 
14:44:16.737708 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:44:16.737667 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:44:32.720039 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:44:32.720002 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:45:00.144439 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:00.144358 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29611605-jmc2d"] Apr 20 14:45:00.147344 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:00.147327 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611605-jmc2d" Apr 20 14:45:00.149540 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:00.149521 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-5vm9x\"" Apr 20 14:45:00.160872 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:00.160836 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611605-jmc2d"] Apr 20 14:45:00.193051 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:00.193006 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t98b\" (UniqueName: \"kubernetes.io/projected/02647406-b8c2-441a-8dac-c1104385e9c7-kube-api-access-5t98b\") pod \"maas-api-key-cleanup-29611605-jmc2d\" (UID: \"02647406-b8c2-441a-8dac-c1104385e9c7\") " pod="opendatahub/maas-api-key-cleanup-29611605-jmc2d" Apr 20 14:45:00.294359 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:00.294317 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5t98b\" (UniqueName: \"kubernetes.io/projected/02647406-b8c2-441a-8dac-c1104385e9c7-kube-api-access-5t98b\") pod \"maas-api-key-cleanup-29611605-jmc2d\" (UID: \"02647406-b8c2-441a-8dac-c1104385e9c7\") " pod="opendatahub/maas-api-key-cleanup-29611605-jmc2d" Apr 20 14:45:00.303169 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:00.303131 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t98b\" (UniqueName: \"kubernetes.io/projected/02647406-b8c2-441a-8dac-c1104385e9c7-kube-api-access-5t98b\") pod \"maas-api-key-cleanup-29611605-jmc2d\" (UID: \"02647406-b8c2-441a-8dac-c1104385e9c7\") " pod="opendatahub/maas-api-key-cleanup-29611605-jmc2d" Apr 20 14:45:00.457602 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:00.457494 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611605-jmc2d" Apr 20 14:45:00.583001 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:00.582974 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611605-jmc2d"] Apr 20 14:45:00.585296 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:45:00.585265 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02647406_b8c2_441a_8dac_c1104385e9c7.slice/crio-6440d5bed3c269428ffb3c4d4ea66a1e1f884dc59b7f821a0875555de5d11243 WatchSource:0}: Error finding container 6440d5bed3c269428ffb3c4d4ea66a1e1f884dc59b7f821a0875555de5d11243: Status 404 returned error can't find the container with id 6440d5bed3c269428ffb3c4d4ea66a1e1f884dc59b7f821a0875555de5d11243 Apr 20 14:45:00.687715 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:00.687673 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611605-jmc2d" event={"ID":"02647406-b8c2-441a-8dac-c1104385e9c7","Type":"ContainerStarted","Data":"6440d5bed3c269428ffb3c4d4ea66a1e1f884dc59b7f821a0875555de5d11243"} Apr 20 14:45:01.692454 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:01.692415 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611605-jmc2d" event={"ID":"02647406-b8c2-441a-8dac-c1104385e9c7","Type":"ContainerStarted","Data":"6f7423da727622099f21f1ed275f18d48c1fc83cbc07b2660d4fa1a5d9dd6129"} Apr 20 14:45:01.706631 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:01.706573 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29611605-jmc2d" podStartSLOduration=1.023736734 podStartE2EDuration="1.706557313s" podCreationTimestamp="2026-04-20 14:45:00 +0000 UTC" firstStartedPulling="2026-04-20 14:45:00.587292794 +0000 UTC m=+1087.307594218" lastFinishedPulling="2026-04-20 14:45:01.270113383 +0000 UTC m=+1087.990414797" observedRunningTime="2026-04-20 14:45:01.706053299 +0000 UTC m=+1088.426354756" watchObservedRunningTime="2026-04-20 14:45:01.706557313 +0000 UTC m=+1088.426858744" Apr 20 14:45:22.767268 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:22.767231 2580 generic.go:358] "Generic (PLEG): container finished" podID="02647406-b8c2-441a-8dac-c1104385e9c7" containerID="6f7423da727622099f21f1ed275f18d48c1fc83cbc07b2660d4fa1a5d9dd6129" exitCode=6 Apr 20 14:45:22.767696 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:22.767306 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611605-jmc2d" event={"ID":"02647406-b8c2-441a-8dac-c1104385e9c7","Type":"ContainerDied","Data":"6f7423da727622099f21f1ed275f18d48c1fc83cbc07b2660d4fa1a5d9dd6129"} Apr 20 14:45:22.767746 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:22.767695 2580 scope.go:117] "RemoveContainer" containerID="6f7423da727622099f21f1ed275f18d48c1fc83cbc07b2660d4fa1a5d9dd6129" Apr 20 14:45:23.772520 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:23.772456 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611605-jmc2d" event={"ID":"02647406-b8c2-441a-8dac-c1104385e9c7","Type":"ContainerStarted","Data":"ab5032f030db4f5615f6f97b7ecd8e6374431bb1778bd6f9590e4481bdf0a0d5"} Apr 20 14:45:27.019737 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:27.019688 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:45:35.826807 
ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:35.826770 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:45:43.842566 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:43.842531 2580 generic.go:358] "Generic (PLEG): container finished" podID="02647406-b8c2-441a-8dac-c1104385e9c7" containerID="ab5032f030db4f5615f6f97b7ecd8e6374431bb1778bd6f9590e4481bdf0a0d5" exitCode=6 Apr 20 14:45:43.842977 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:43.842601 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611605-jmc2d" event={"ID":"02647406-b8c2-441a-8dac-c1104385e9c7","Type":"ContainerDied","Data":"ab5032f030db4f5615f6f97b7ecd8e6374431bb1778bd6f9590e4481bdf0a0d5"} Apr 20 14:45:43.842977 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:43.842642 2580 scope.go:117] "RemoveContainer" containerID="6f7423da727622099f21f1ed275f18d48c1fc83cbc07b2660d4fa1a5d9dd6129" Apr 20 14:45:43.843062 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:43.842982 2580 scope.go:117] "RemoveContainer" containerID="ab5032f030db4f5615f6f97b7ecd8e6374431bb1778bd6f9590e4481bdf0a0d5" Apr 20 14:45:43.843234 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:45:43.843216 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29611605-jmc2d_opendatahub(02647406-b8c2-441a-8dac-c1104385e9c7)\"" pod="opendatahub/maas-api-key-cleanup-29611605-jmc2d" podUID="02647406-b8c2-441a-8dac-c1104385e9c7" Apr 20 14:45:51.926148 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:51.926115 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:45:56.927673 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:56.927643 2580 scope.go:117] "RemoveContainer" containerID="ab5032f030db4f5615f6f97b7ecd8e6374431bb1778bd6f9590e4481bdf0a0d5" Apr 20 14:45:57.899115 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:57.899079 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611605-jmc2d" event={"ID":"02647406-b8c2-441a-8dac-c1104385e9c7","Type":"ContainerStarted","Data":"7a8fff37b72ee337aa2b925217f03dbb09cd337ea014f25ee3f3a171869daebf"} Apr 20 14:45:57.952820 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:57.952789 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611605-jmc2d"] Apr 20 14:45:58.902698 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:45:58.902658 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29611605-jmc2d" podUID="02647406-b8c2-441a-8dac-c1104385e9c7" containerName="cleanup" containerID="cri-o://7a8fff37b72ee337aa2b925217f03dbb09cd337ea014f25ee3f3a171869daebf" gracePeriod=30 Apr 20 14:46:00.320966 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:00.320930 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:46:17.422701 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:17.422568 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:46:17.653576 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:17.653547 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611605-jmc2d" Apr 20 14:46:17.695473 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:17.695392 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t98b\" (UniqueName: \"kubernetes.io/projected/02647406-b8c2-441a-8dac-c1104385e9c7-kube-api-access-5t98b\") pod \"02647406-b8c2-441a-8dac-c1104385e9c7\" (UID: \"02647406-b8c2-441a-8dac-c1104385e9c7\") " Apr 20 14:46:17.697677 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:17.697641 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02647406-b8c2-441a-8dac-c1104385e9c7-kube-api-access-5t98b" (OuterVolumeSpecName: "kube-api-access-5t98b") pod "02647406-b8c2-441a-8dac-c1104385e9c7" (UID: "02647406-b8c2-441a-8dac-c1104385e9c7"). InnerVolumeSpecName "kube-api-access-5t98b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:46:17.796613 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:17.796567 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5t98b\" (UniqueName: \"kubernetes.io/projected/02647406-b8c2-441a-8dac-c1104385e9c7-kube-api-access-5t98b\") on node \"ip-10-0-142-166.ec2.internal\" DevicePath \"\"" Apr 20 14:46:17.968037 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:17.967946 2580 generic.go:358] "Generic (PLEG): container finished" podID="02647406-b8c2-441a-8dac-c1104385e9c7" containerID="7a8fff37b72ee337aa2b925217f03dbb09cd337ea014f25ee3f3a171869daebf" exitCode=6 Apr 20 14:46:17.968037 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:17.968022 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611605-jmc2d" Apr 20 14:46:17.968220 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:17.968041 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611605-jmc2d" event={"ID":"02647406-b8c2-441a-8dac-c1104385e9c7","Type":"ContainerDied","Data":"7a8fff37b72ee337aa2b925217f03dbb09cd337ea014f25ee3f3a171869daebf"} Apr 20 14:46:17.968220 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:17.968093 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611605-jmc2d" event={"ID":"02647406-b8c2-441a-8dac-c1104385e9c7","Type":"ContainerDied","Data":"6440d5bed3c269428ffb3c4d4ea66a1e1f884dc59b7f821a0875555de5d11243"} Apr 20 14:46:17.968220 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:17.968114 2580 scope.go:117] "RemoveContainer" containerID="7a8fff37b72ee337aa2b925217f03dbb09cd337ea014f25ee3f3a171869daebf" Apr 20 14:46:17.977277 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:17.977255 2580 scope.go:117] "RemoveContainer" containerID="ab5032f030db4f5615f6f97b7ecd8e6374431bb1778bd6f9590e4481bdf0a0d5" Apr 20 14:46:17.983778 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:17.983748 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611605-jmc2d"] Apr 20 14:46:17.986415 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:17.986393 2580 scope.go:117] "RemoveContainer" containerID="7a8fff37b72ee337aa2b925217f03dbb09cd337ea014f25ee3f3a171869daebf" Apr 20 14:46:17.986709 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:17.986691 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611605-jmc2d"] Apr 20 14:46:17.986768 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:46:17.986702 2580 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"7a8fff37b72ee337aa2b925217f03dbb09cd337ea014f25ee3f3a171869daebf\": container with ID starting with 7a8fff37b72ee337aa2b925217f03dbb09cd337ea014f25ee3f3a171869daebf not found: ID does not exist" containerID="7a8fff37b72ee337aa2b925217f03dbb09cd337ea014f25ee3f3a171869daebf" Apr 20 14:46:17.986768 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:17.986727 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a8fff37b72ee337aa2b925217f03dbb09cd337ea014f25ee3f3a171869daebf"} err="failed to get container status \"7a8fff37b72ee337aa2b925217f03dbb09cd337ea014f25ee3f3a171869daebf\": rpc error: code = NotFound desc = could not find container \"7a8fff37b72ee337aa2b925217f03dbb09cd337ea014f25ee3f3a171869daebf\": container with ID starting with 7a8fff37b72ee337aa2b925217f03dbb09cd337ea014f25ee3f3a171869daebf not found: ID does not exist" Apr 20 14:46:17.986768 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:17.986747 2580 scope.go:117] "RemoveContainer" containerID="ab5032f030db4f5615f6f97b7ecd8e6374431bb1778bd6f9590e4481bdf0a0d5" Apr 20 14:46:17.986994 ip-10-0-142-166 kubenswrapper[2580]: E0420 14:46:17.986978 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab5032f030db4f5615f6f97b7ecd8e6374431bb1778bd6f9590e4481bdf0a0d5\": container with ID starting with ab5032f030db4f5615f6f97b7ecd8e6374431bb1778bd6f9590e4481bdf0a0d5 not found: ID does not exist" containerID="ab5032f030db4f5615f6f97b7ecd8e6374431bb1778bd6f9590e4481bdf0a0d5" Apr 20 14:46:17.987039 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:17.987001 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab5032f030db4f5615f6f97b7ecd8e6374431bb1778bd6f9590e4481bdf0a0d5"} err="failed to get container status \"ab5032f030db4f5615f6f97b7ecd8e6374431bb1778bd6f9590e4481bdf0a0d5\": rpc error: code = NotFound desc = could not find container \"ab5032f030db4f5615f6f97b7ecd8e6374431bb1778bd6f9590e4481bdf0a0d5\": container with ID starting with ab5032f030db4f5615f6f97b7ecd8e6374431bb1778bd6f9590e4481bdf0a0d5 not found: ID does not exist" Apr 20 14:46:19.932141 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:19.932107 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02647406-b8c2-441a-8dac-c1104385e9c7" path="/var/lib/kubelet/pods/02647406-b8c2-441a-8dac-c1104385e9c7/volumes" Apr 20 14:46:25.924060 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:25.924021 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:46:53.899190 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:53.899157 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s4vzx_b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad/ovn-acl-logging/0.log" Apr 20 14:46:53.903907 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:53.903881 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s4vzx_b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad/ovn-acl-logging/0.log" Apr 20 14:46:59.121877 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:46:59.121840 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:47:07.015770 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:47:07.015731 2580 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:47:15.420481 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:47:15.420443 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:47:23.915338 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:47:23.915282 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:47:32.622248 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:47:32.622210 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:47:50.118604 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:47:50.118519 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:48:03.326596 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:48:03.326556 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:48:49.221853 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:48:49.221815 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:48:57.616191 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:48:57.616144 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:49:06.830083 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:49:06.830042 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:49:15.626705 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:49:15.626619 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:49:24.924563 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:49:24.924524 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:49:32.923615 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:49:32.923577 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:49:41.723704 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:49:41.723667 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:49:50.935059 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:49:50.935025 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:49:59.625313 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:49:59.625272 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:50:08.024722 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:50:08.024682 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:50:17.021104 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:50:17.021061 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:50:25.516229 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:50:25.516186 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:50:34.429538 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:50:34.429487 2580 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:50:42.517645 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:50:42.517548 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:50:51.534483 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:50:51.534440 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:50:59.817933 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:50:59.817895 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:51:08.521556 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:51:08.521520 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:51:17.833795 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:51:17.833756 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:51:53.926186 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:51:53.926156 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s4vzx_b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad/ovn-acl-logging/0.log" Apr 20 14:51:53.937053 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:51:53.937023 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s4vzx_b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad/ovn-acl-logging/0.log" Apr 20 14:53:15.454306 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:15.454271 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2j7nn"] Apr 20 14:53:15.455998 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:15.454662 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02647406-b8c2-441a-8dac-c1104385e9c7" containerName="cleanup" Apr 20 14:53:15.455998 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:15.454674 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="02647406-b8c2-441a-8dac-c1104385e9c7" containerName="cleanup" Apr 20 14:53:15.455998 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:15.454692 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02647406-b8c2-441a-8dac-c1104385e9c7" containerName="cleanup" Apr 20 14:53:15.455998 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:15.454700 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="02647406-b8c2-441a-8dac-c1104385e9c7" containerName="cleanup" Apr 20 14:53:15.455998 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:15.454779 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="02647406-b8c2-441a-8dac-c1104385e9c7" containerName="cleanup" Apr 20 14:53:15.455998 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:15.454788 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="02647406-b8c2-441a-8dac-c1104385e9c7" containerName="cleanup" Apr 20 14:53:15.457068 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:15.457051 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2j7nn" Apr 20 14:53:15.459514 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:15.459477 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-dx4gf\"" Apr 20 14:53:15.470550 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:15.470530 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2j7nn"] Apr 20 14:53:15.546418 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:15.546368 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h2wv\" (UniqueName: \"kubernetes.io/projected/41a79f78-54ae-45c3-a390-9ca85366514b-kube-api-access-9h2wv\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2j7nn\" (UID: \"41a79f78-54ae-45c3-a390-9ca85366514b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2j7nn" Apr 20 14:53:15.546634 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:15.546431 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/41a79f78-54ae-45c3-a390-9ca85366514b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2j7nn\" (UID: \"41a79f78-54ae-45c3-a390-9ca85366514b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2j7nn" Apr 20 14:53:15.646981 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:15.646926 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9h2wv\" (UniqueName: \"kubernetes.io/projected/41a79f78-54ae-45c3-a390-9ca85366514b-kube-api-access-9h2wv\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2j7nn\" (UID: \"41a79f78-54ae-45c3-a390-9ca85366514b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2j7nn" Apr 20 14:53:15.646981 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:15.646991 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/41a79f78-54ae-45c3-a390-9ca85366514b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2j7nn\" (UID: \"41a79f78-54ae-45c3-a390-9ca85366514b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2j7nn" Apr 20 14:53:15.647376 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:15.647352 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/41a79f78-54ae-45c3-a390-9ca85366514b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2j7nn\" (UID: \"41a79f78-54ae-45c3-a390-9ca85366514b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2j7nn" Apr 20 14:53:15.658355 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:15.658326 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h2wv\" (UniqueName: \"kubernetes.io/projected/41a79f78-54ae-45c3-a390-9ca85366514b-kube-api-access-9h2wv\") pod \"kuadrant-operator-controller-manager-55c7f4c975-2j7nn\" (UID: \"41a79f78-54ae-45c3-a390-9ca85366514b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2j7nn" Apr 20 14:53:15.767018 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:15.766907 2580 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2j7nn" Apr 20 14:53:15.900766 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:15.900726 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2j7nn"] Apr 20 14:53:15.904538 ip-10-0-142-166 kubenswrapper[2580]: W0420 14:53:15.904492 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41a79f78_54ae_45c3_a390_9ca85366514b.slice/crio-2ca2f08f1bf6e66576ab49a3729f6ec2c4f6148af2c7d18b2413b31913f634aa WatchSource:0}: Error finding container 2ca2f08f1bf6e66576ab49a3729f6ec2c4f6148af2c7d18b2413b31913f634aa: Status 404 returned error can't find the container with id 2ca2f08f1bf6e66576ab49a3729f6ec2c4f6148af2c7d18b2413b31913f634aa Apr 20 14:53:15.906965 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:15.906948 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 14:53:16.431440 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:16.431397 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2j7nn" event={"ID":"41a79f78-54ae-45c3-a390-9ca85366514b","Type":"ContainerStarted","Data":"16d066552f2bd7fbb0252023860858d31493b39a86359a3fb261ddfb4d0d7000"} Apr 20 14:53:16.431440 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:16.431433 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2j7nn" event={"ID":"41a79f78-54ae-45c3-a390-9ca85366514b","Type":"ContainerStarted","Data":"2ca2f08f1bf6e66576ab49a3729f6ec2c4f6148af2c7d18b2413b31913f634aa"} Apr 20 14:53:16.431744 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:16.431525 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2j7nn" Apr 20 14:53:16.450026 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:16.449852 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2j7nn" podStartSLOduration=1.449836752 podStartE2EDuration="1.449836752s" podCreationTimestamp="2026-04-20 14:53:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:53:16.448898459 +0000 UTC m=+1583.169199892" watchObservedRunningTime="2026-04-20 14:53:16.449836752 +0000 UTC m=+1583.170138183" Apr 20 14:53:27.437844 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:27.437809 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-2j7nn" Apr 20 14:53:35.627158 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:35.627107 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:53:40.822985 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:53:40.822948 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:54:05.427245 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:54:05.427209 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:54:09.925378 ip-10-0-142-166 
kubenswrapper[2580]: I0420 14:54:09.925337 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:54:20.325365 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:54:20.325323 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:54:31.224895 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:54:31.224851 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:54:39.125746 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:54:39.125699 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:54:50.223878 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:54:50.223836 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:54:59.226317 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:54:59.226283 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:55:09.023631 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:55:09.023596 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:55:18.419312 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:55:18.419223 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:55:28.425642 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:55:28.425606 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:55:37.129605 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:55:37.129556 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:56:11.229113 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:56:11.229077 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:56:53.227205 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:56:53.227162 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:56:53.959775 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:56:53.959736 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s4vzx_b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad/ovn-acl-logging/0.log" Apr 20 14:56:53.965888 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:56:53.965863 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s4vzx_b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad/ovn-acl-logging/0.log" Apr 20 14:57:02.224130 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:57:02.224085 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:57:10.126120 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:57:10.126081 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:57:18.621058 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:57:18.621005 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:57:27.522059 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:57:27.522026 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:57:40.634489 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:57:40.634438 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:57:49.729951 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:57:49.729912 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:57:55.919771 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:57:55.919724 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:58:06.340639 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:58:06.340606 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:58:14.523708 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:58:14.523627 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:58:21.821245 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:58:21.821208 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:58:33.327381 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:58:33.327341 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:58:50.326680 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:58:50.326642 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:58:59.747632 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:58:59.747597 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:59:07.431917 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:59:07.431880 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:59:15.730565 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:59:15.730532 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:59:33.248770 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:59:33.248732 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:59:41.332792 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:59:41.332748 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:59:50.826004 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:59:50.825920 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 14:59:59.120713 ip-10-0-142-166 kubenswrapper[2580]: I0420 14:59:59.120673 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 15:00:08.125885 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:00:08.125846 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 15:00:16.526918 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:00:16.526882 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 15:00:24.991300 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:00:24.991263 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 15:00:36.131819 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:00:36.131781 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 15:00:45.126532 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:00:45.126478 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 15:00:58.513331 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:00:58.513291 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 15:01:07.426701 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:01:07.426660 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 15:01:15.824587 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:01:15.824510 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 15:01:23.132090 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:01:23.132049 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 15:01:30.427777 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:01:30.427734 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 15:01:48.518438 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:01:48.518398 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 15:01:53.994307 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:01:53.994281 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s4vzx_b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad/ovn-acl-logging/0.log" Apr 20 15:01:53.996391 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:01:53.996369 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-s4vzx_b206eb53-cb8b-4d6e-801e-c9b9fe10e3ad/ovn-acl-logging/0.log" Apr 20 15:01:56.429108 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:01:56.429074 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 15:02:05.133723 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:02:05.133682 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 15:02:13.139686 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:02:13.139645 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 15:02:37.831032 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:02:37.830990 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 15:02:50.328898 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:02:50.328815 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-7bndp"] Apr 20 15:02:55.871762 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:02:55.871729 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-79cb8b9576-xzhb8_9094e779-d79f-4892-95c8-8233b6c66b58/manager/0.log" Apr 20 15:02:56.336025 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:02:56.335999 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-65c545df94-xlml9_c265fc19-3a91-4dc4-9f05-7d671aed51f8/manager/0.log" Apr 20 15:02:58.157105 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:02:58.157073 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-tzcgz_7d0159b6-53a2-408e-b7e6-777d32d1af26/kuadrant-console-plugin/0.log" Apr 20 15:02:58.412150 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:02:58.412070 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-2j7nn_41a79f78-54ae-45c3-a390-9ca85366514b/manager/0.log" Apr 20 15:02:58.517849 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:02:58.517820 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-7bndp_e44a62ce-e4b4-4f76-a56e-e45b26bd9b13/limitador/0.log" Apr 20 15:02:59.272036 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:02:59.272004 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-87db58fcf-7gt29_7f509089-8b77-4fa4-8afb-ed3227e0f2d2/kube-auth-proxy/0.log" Apr 20 15:03:07.881830 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:07.881799 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-plgss_fd645412-a5e8-4419-9535-27ac65a5ee65/global-pull-secret-syncer/0.log" Apr 20 15:03:08.091828 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:08.091794 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xhmdf_c2fe8871-85d4-447d-a6ca-09895bb1faae/konnectivity-agent/0.log" Apr 20 15:03:08.182447 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:08.182362 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-166.ec2.internal_b77815b0bb4108444a9fb656311ac214/haproxy/0.log" Apr 20 15:03:12.782053 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:12.782025 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-tzcgz_7d0159b6-53a2-408e-b7e6-777d32d1af26/kuadrant-console-plugin/0.log" Apr 20 15:03:12.881020 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:12.880984 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-2j7nn_41a79f78-54ae-45c3-a390-9ca85366514b/manager/0.log" Apr 20 15:03:12.900593 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:12.900566 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-7bndp_e44a62ce-e4b4-4f76-a56e-e45b26bd9b13/limitador/0.log" Apr 20 15:03:14.397929 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:14.397900 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5361d7ff-428a-4fe9-a55f-a6e058beb6b9/alertmanager/0.log" Apr 20 15:03:14.421714 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:14.421690 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5361d7ff-428a-4fe9-a55f-a6e058beb6b9/config-reloader/0.log" Apr 20 15:03:14.447540 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:14.447509 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5361d7ff-428a-4fe9-a55f-a6e058beb6b9/kube-rbac-proxy-web/0.log" Apr 20 15:03:14.468636 ip-10-0-142-166 kubenswrapper[2580]: I0420 
15:03:14.468611 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5361d7ff-428a-4fe9-a55f-a6e058beb6b9/kube-rbac-proxy/0.log" Apr 20 15:03:14.489607 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:14.489584 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5361d7ff-428a-4fe9-a55f-a6e058beb6b9/kube-rbac-proxy-metric/0.log" Apr 20 15:03:14.513136 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:14.513114 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5361d7ff-428a-4fe9-a55f-a6e058beb6b9/prom-label-proxy/0.log" Apr 20 15:03:14.535008 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:14.534985 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5361d7ff-428a-4fe9-a55f-a6e058beb6b9/init-config-reloader/0.log" Apr 20 15:03:14.578707 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:14.578676 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-lcqfj_a04aab45-434d-4c28-a9f8-631e889ece22/cluster-monitoring-operator/0.log" Apr 20 15:03:14.619768 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:14.619737 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9fd44_04817841-de4d-4aa5-b903-08642105cdfb/kube-state-metrics/0.log" Apr 20 15:03:14.643732 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:14.643708 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9fd44_04817841-de4d-4aa5-b903-08642105cdfb/kube-rbac-proxy-main/0.log" Apr 20 15:03:14.663850 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:14.663792 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-9fd44_04817841-de4d-4aa5-b903-08642105cdfb/kube-rbac-proxy-self/0.log" Apr 20 15:03:14.849603 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:14.849574 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v9pc4_ccedf4b7-a764-4c69-ae30-16d122f1bddc/node-exporter/0.log" Apr 20 15:03:14.891976 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:14.891950 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v9pc4_ccedf4b7-a764-4c69-ae30-16d122f1bddc/kube-rbac-proxy/0.log" Apr 20 15:03:14.946343 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:14.946265 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-v9pc4_ccedf4b7-a764-4c69-ae30-16d122f1bddc/init-textfile/0.log" Apr 20 15:03:15.094609 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:15.094577 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-jjdk7_66a3170e-bdf6-4b55-a50b-8c583852bce8/kube-rbac-proxy-main/0.log" Apr 20 15:03:15.118080 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:15.118050 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-jjdk7_66a3170e-bdf6-4b55-a50b-8c583852bce8/kube-rbac-proxy-self/0.log" Apr 20 15:03:15.142264 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:15.142238 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-jjdk7_66a3170e-bdf6-4b55-a50b-8c583852bce8/openshift-state-metrics/0.log" Apr 20 15:03:15.180760 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:15.180732 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5db69b60-e33a-4cee-8179-2f467fcf7536/prometheus/0.log" Apr 20 15:03:15.202518 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:15.202430 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5db69b60-e33a-4cee-8179-2f467fcf7536/config-reloader/0.log" Apr 20 15:03:15.228530 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:15.228488 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5db69b60-e33a-4cee-8179-2f467fcf7536/thanos-sidecar/0.log" Apr 20 15:03:15.250436 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:15.250409 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5db69b60-e33a-4cee-8179-2f467fcf7536/kube-rbac-proxy-web/0.log" Apr 20 15:03:15.272142 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:15.272119 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5db69b60-e33a-4cee-8179-2f467fcf7536/kube-rbac-proxy/0.log" Apr 20 15:03:15.298041 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:15.298011 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5db69b60-e33a-4cee-8179-2f467fcf7536/kube-rbac-proxy-thanos/0.log" Apr 20 15:03:15.320351 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:15.320325 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_5db69b60-e33a-4cee-8179-2f467fcf7536/init-config-reloader/0.log" Apr 20 15:03:15.353654 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:15.353623 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jl55v_5fcc0fbb-05e8-4606-85c3-84cc15be4e7c/prometheus-operator/0.log" Apr 20 15:03:15.372745 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:15.372721 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jl55v_5fcc0fbb-05e8-4606-85c3-84cc15be4e7c/kube-rbac-proxy/0.log" Apr 20 15:03:15.399544 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:15.399520 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-2pt82_08e3cb9f-b15e-4569-a3b6-0c1123318898/prometheus-operator-admission-webhook/0.log" Apr 20 15:03:15.430314 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:15.430286 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-864bbc56b7-v5shp_6e7a48f3-a413-4589-ab1d-85e3cf805195/telemeter-client/0.log" Apr 20 15:03:15.455266 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:15.455194 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-864bbc56b7-v5shp_6e7a48f3-a413-4589-ab1d-85e3cf805195/reload/0.log" Apr 20 15:03:15.478154 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:15.478130 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-864bbc56b7-v5shp_6e7a48f3-a413-4589-ab1d-85e3cf805195/kube-rbac-proxy/0.log" Apr 20 15:03:16.651308 ip-10-0-142-166 kubenswrapper[2580]: I0420 
Apr 20 15:03:16.789493 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.789460 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"]
Apr 20 15:03:16.789889 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.789875 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02647406-b8c2-441a-8dac-c1104385e9c7" containerName="cleanup"
Apr 20 15:03:16.789937 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.789891 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="02647406-b8c2-441a-8dac-c1104385e9c7" containerName="cleanup"
Apr 20 15:03:16.789972 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.789952 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="02647406-b8c2-441a-8dac-c1104385e9c7" containerName="cleanup"
Apr 20 15:03:16.793198 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.793177 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"
Apr 20 15:03:16.795622 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.795602 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xvzk6\"/\"kube-root-ca.crt\""
Apr 20 15:03:16.796689 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.796670 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xvzk6\"/\"openshift-service-ca.crt\""
Apr 20 15:03:16.796689 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.796680 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xvzk6\"/\"default-dockercfg-5f9zf\""
Apr 20 15:03:16.803153 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.803131 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"]
Apr 20 15:03:16.843010 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.842983 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cf788cc6-f96d-4922-8035-b9c273d736de-podres\") pod \"perf-node-gather-daemonset-cmdcb\" (UID: \"cf788cc6-f96d-4922-8035-b9c273d736de\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"
Apr 20 15:03:16.843151 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.843033 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf788cc6-f96d-4922-8035-b9c273d736de-sys\") pod \"perf-node-gather-daemonset-cmdcb\" (UID: \"cf788cc6-f96d-4922-8035-b9c273d736de\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"
Apr 20 15:03:16.843151 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.843109 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cf788cc6-f96d-4922-8035-b9c273d736de-proc\") pod \"perf-node-gather-daemonset-cmdcb\" (UID: \"cf788cc6-f96d-4922-8035-b9c273d736de\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"
Apr 20 15:03:16.843235 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.843155 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf788cc6-f96d-4922-8035-b9c273d736de-lib-modules\") pod \"perf-node-gather-daemonset-cmdcb\" (UID: \"cf788cc6-f96d-4922-8035-b9c273d736de\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"
Apr 20 15:03:16.843235 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.843219 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b945k\" (UniqueName: \"kubernetes.io/projected/cf788cc6-f96d-4922-8035-b9c273d736de-kube-api-access-b945k\") pod \"perf-node-gather-daemonset-cmdcb\" (UID: \"cf788cc6-f96d-4922-8035-b9c273d736de\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"
Apr 20 15:03:16.944651 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.944571 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b945k\" (UniqueName: \"kubernetes.io/projected/cf788cc6-f96d-4922-8035-b9c273d736de-kube-api-access-b945k\") pod \"perf-node-gather-daemonset-cmdcb\" (UID: \"cf788cc6-f96d-4922-8035-b9c273d736de\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"
Apr 20 15:03:16.944651 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.944618 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cf788cc6-f96d-4922-8035-b9c273d736de-podres\") pod \"perf-node-gather-daemonset-cmdcb\" (UID: \"cf788cc6-f96d-4922-8035-b9c273d736de\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"
Apr 20 15:03:16.944835 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.944671 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf788cc6-f96d-4922-8035-b9c273d736de-sys\") pod \"perf-node-gather-daemonset-cmdcb\" (UID: \"cf788cc6-f96d-4922-8035-b9c273d736de\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"
Apr 20 15:03:16.944835 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.944710 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cf788cc6-f96d-4922-8035-b9c273d736de-proc\") pod \"perf-node-gather-daemonset-cmdcb\" (UID: \"cf788cc6-f96d-4922-8035-b9c273d736de\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"
Apr 20 15:03:16.944835 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.944757 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf788cc6-f96d-4922-8035-b9c273d736de-lib-modules\") pod \"perf-node-gather-daemonset-cmdcb\" (UID: \"cf788cc6-f96d-4922-8035-b9c273d736de\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"
Apr 20 15:03:16.944835 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.944780 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cf788cc6-f96d-4922-8035-b9c273d736de-sys\") pod \"perf-node-gather-daemonset-cmdcb\" (UID: \"cf788cc6-f96d-4922-8035-b9c273d736de\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"
Apr 20 15:03:16.944835 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.944824 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cf788cc6-f96d-4922-8035-b9c273d736de-podres\") pod \"perf-node-gather-daemonset-cmdcb\" (UID: \"cf788cc6-f96d-4922-8035-b9c273d736de\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"
Apr 20 15:03:16.944996 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.944854 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cf788cc6-f96d-4922-8035-b9c273d736de-proc\") pod \"perf-node-gather-daemonset-cmdcb\" (UID: \"cf788cc6-f96d-4922-8035-b9c273d736de\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"
Apr 20 15:03:16.944996 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.944901 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cf788cc6-f96d-4922-8035-b9c273d736de-lib-modules\") pod \"perf-node-gather-daemonset-cmdcb\" (UID: \"cf788cc6-f96d-4922-8035-b9c273d736de\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"
Apr 20 15:03:16.955720 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:16.955688 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b945k\" (UniqueName: \"kubernetes.io/projected/cf788cc6-f96d-4922-8035-b9c273d736de-kube-api-access-b945k\") pod \"perf-node-gather-daemonset-cmdcb\" (UID: \"cf788cc6-f96d-4922-8035-b9c273d736de\") " pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"
Apr 20 15:03:17.104681 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:17.104643 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"
Apr 20 15:03:17.233618 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:17.233540 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"]
Apr 20 15:03:17.236375 ip-10-0-142-166 kubenswrapper[2580]: W0420 15:03:17.236348 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcf788cc6_f96d_4922_8035_b9c273d736de.slice/crio-629e4eea0fda93e989e7124c4599c85c32bdc2ed55aa59c53e88bcb2a4ddab52 WatchSource:0}: Error finding container 629e4eea0fda93e989e7124c4599c85c32bdc2ed55aa59c53e88bcb2a4ddab52: Status 404 returned error can't find the container with id 629e4eea0fda93e989e7124c4599c85c32bdc2ed55aa59c53e88bcb2a4ddab52
Apr 20 15:03:17.237963 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:17.237945 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 15:03:17.585918 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:17.585883 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb" event={"ID":"cf788cc6-f96d-4922-8035-b9c273d736de","Type":"ContainerStarted","Data":"862d485accd3dc6c695b3003944996248aa8fd552e5512cc39cde2ead6412214"}
Apr 20 15:03:17.585918 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:17.585922 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb" event={"ID":"cf788cc6-f96d-4922-8035-b9c273d736de","Type":"ContainerStarted","Data":"629e4eea0fda93e989e7124c4599c85c32bdc2ed55aa59c53e88bcb2a4ddab52"}
Apr 20 15:03:17.586185 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:17.585982 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"
Apr 20 15:03:17.601958 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:17.601902 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb" podStartSLOduration=1.601885383 podStartE2EDuration="1.601885383s" podCreationTimestamp="2026-04-20 15:03:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:03:17.600143242 +0000 UTC m=+2184.320444717" watchObservedRunningTime="2026-04-20 15:03:17.601885383 +0000 UTC m=+2184.322186815"
Apr 20 15:03:18.250783 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:18.250756 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-zjdjg_7dfb4e40-e179-469e-adae-c6d6a0f119db/volume-data-source-validator/0.log"
Apr 20 15:03:19.098347 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:19.098319 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vlqz9_26882095-42a8-4889-9983-45d2dc2d0fc6/dns/0.log"
Apr 20 15:03:19.127586 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:19.127553 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vlqz9_26882095-42a8-4889-9983-45d2dc2d0fc6/kube-rbac-proxy/0.log"
Apr 20 15:03:19.190917 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:19.190884 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mw9b9_236542ea-fda7-4b96-ae9e-dd685e15e5ef/dns-node-resolver/0.log"
Apr 20 15:03:19.762981 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:19.762956 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2rnmj_175a3efc-4aa3-4f7d-ac63-bb40b7cd457b/node-ca/0.log"
Apr 20 15:03:20.876208 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:20.876181 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-87db58fcf-7gt29_7f509089-8b77-4fa4-8afb-ed3227e0f2d2/kube-auth-proxy/0.log"
Apr 20 15:03:21.554561 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:21.554535 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8g5vx_d6c411bc-80c8-4d9c-993c-cb6aeb232750/serve-healthcheck-canary/0.log"
Apr 20 15:03:22.231338 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:22.231308 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q2tq5_6e2b9449-8bdb-4411-bc74-b657c116838a/kube-rbac-proxy/0.log"
Apr 20 15:03:22.262049 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:22.262029 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q2tq5_6e2b9449-8bdb-4411-bc74-b657c116838a/exporter/0.log"
Apr 20 15:03:22.313662 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:22.313636 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-q2tq5_6e2b9449-8bdb-4411-bc74-b657c116838a/extractor/0.log"
Apr 20 15:03:23.599147 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:23.599120 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xvzk6/perf-node-gather-daemonset-cmdcb"
Apr 20 15:03:24.491286 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:24.491244 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-79cb8b9576-xzhb8_9094e779-d79f-4892-95c8-8233b6c66b58/manager/0.log"
path="/var/log/pods/opendatahub_maas-controller-79cb8b9576-xzhb8_9094e779-d79f-4892-95c8-8233b6c66b58/manager/0.log" Apr 20 15:03:24.682944 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:24.682902 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-65c545df94-xlml9_c265fc19-3a91-4dc4-9f05-7d671aed51f8/manager/0.log" Apr 20 15:03:26.073643 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:26.073611 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-bc7d4767f-58dhd_b13fc4d6-1b50-4552-abc0-b185e44ca8c7/manager/0.log" Apr 20 15:03:30.904746 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:30.904710 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gc5dt_898273ba-9057-4ee0-9211-0db4c4234ca3/kube-storage-version-migrator-operator/1.log" Apr 20 15:03:30.905626 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:30.905607 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gc5dt_898273ba-9057-4ee0-9211-0db4c4234ca3/kube-storage-version-migrator-operator/0.log" Apr 20 15:03:32.337821 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:32.337794 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dmfdx_3c30eab0-ce99-4717-9b34-99ab3a10543c/kube-multus-additional-cni-plugins/0.log" Apr 20 15:03:32.360811 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:32.360788 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dmfdx_3c30eab0-ce99-4717-9b34-99ab3a10543c/egress-router-binary-copy/0.log" Apr 20 15:03:32.382849 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:32.382827 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dmfdx_3c30eab0-ce99-4717-9b34-99ab3a10543c/cni-plugins/0.log" Apr 20 15:03:32.406794 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:32.406774 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dmfdx_3c30eab0-ce99-4717-9b34-99ab3a10543c/bond-cni-plugin/0.log" Apr 20 15:03:32.429159 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:32.429133 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dmfdx_3c30eab0-ce99-4717-9b34-99ab3a10543c/routeoverride-cni/0.log" Apr 20 15:03:32.449914 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:32.449894 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dmfdx_3c30eab0-ce99-4717-9b34-99ab3a10543c/whereabouts-cni-bincopy/0.log" Apr 20 15:03:32.471400 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:32.471367 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-dmfdx_3c30eab0-ce99-4717-9b34-99ab3a10543c/whereabouts-cni/0.log" Apr 20 15:03:32.501739 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:32.501721 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ct824_5ab1d81b-18bf-4cdd-80f4-67fa99ae9490/kube-multus/0.log" Apr 20 15:03:32.606014 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:32.605952 2580 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-qsqks_db79a290-5377-45f9-bb87-89588231d8a7/network-metrics-daemon/0.log" Apr 20 15:03:32.625704 ip-10-0-142-166 kubenswrapper[2580]: I0420 15:03:32.625686 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-qsqks_db79a290-5377-45f9-bb87-89588231d8a7/kube-rbac-proxy/0.log"