Apr 23 17:53:23.060407 ip-10-0-142-63 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 17:53:23.510317 ip-10-0-142-63 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:53:23.510317 ip-10-0-142-63 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 17:53:23.510317 ip-10-0-142-63 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 17:53:23.510317 ip-10-0-142-63 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 17:53:23.510317 ip-10-0-142-63 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
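The deprecation warnings above all point the same way: set these options in the file passed via the Kubelet's --config flag instead of on the command line. A minimal sketch of that migration is below; it assumes the /etc/kubernetes/kubelet.conf path that this log reports for --config, and the KubeletConfiguration field names from the kubelet.config.k8s.io/v1beta1 schema. The volume plugin path and systemReserved amounts are illustrative placeholders, not values taken from this node.

```yaml
# Hypothetical config-file equivalents for the deprecated CLI flags.
# (--minimum-container-ttl-duration has no direct field; the log itself
# says to use eviction settings instead.)
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
systemReserved:
  cpu: 500m
  memory: 1Gi
```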
Apr 23 17:53:23.512959 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.512867 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 17:53:23.521052 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521033 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:53:23.521052 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521049 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:53:23.521052 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521053 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:53:23.521052 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521056 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:53:23.521052 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521060 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:53:23.521052 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521063 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521066 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521069 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521072 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521075 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521078 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521081 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521098 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521101 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521104 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521106 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521109 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521112 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521114 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521117 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521120 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521122 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521125 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521128 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521131 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:53:23.521277 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521133 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:53:23.521794 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521136 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:53:23.521794 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521138 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:53:23.521794 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521141 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:53:23.521794 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521144 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:53:23.521794 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521146 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:53:23.521794 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521149 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:53:23.521794 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521152 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:53:23.521794 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521156 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:53:23.521794 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521159 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:53:23.521794 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521162 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:53:23.521794 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521165 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:53:23.521794 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521168 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:53:23.521794 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521170 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:53:23.521794 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521174 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:53:23.521794 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521178 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:53:23.521794 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521182 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:53:23.521794 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521184 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:53:23.521794 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521187 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:53:23.521794 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521190 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521193 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521196 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521199 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521201 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521204 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521207 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521209 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521212 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521214 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521217 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521219 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521222 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521224 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521227 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521229 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521232 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521235 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521237 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521240 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:53:23.522282 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521243 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521245 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521247 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521250 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521252 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521255 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521257 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521261 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521263 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521266 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521269 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521271 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521274 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521278 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521280 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521283 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521286 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521289 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521292 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521295 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:53:23.522764 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521298 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521300 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522369 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522378 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522382 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522385 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522388 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522391 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522394 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522397 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522400 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522402 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522405 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522413 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522416 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522420 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522423 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522425 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522428 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522431 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:53:23.523262 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522434 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522437 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522439 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522442 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522445 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522448 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522452 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522454 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522457 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522460 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522462 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522465 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522467 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522470 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522472 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522475 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522477 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522480 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522482 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522486 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:53:23.523749 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522489 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522492 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522495 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522497 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522499 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522502 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522504 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522507 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522510 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522512 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522514 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522517 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522520 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522523 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522525 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522528 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522531 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522533 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522536 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522538 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522541 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522544 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522546 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522549 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522551 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522553 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522556 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522559 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522561 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522564 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522566 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522569 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522572 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522574 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522576 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522579 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522582 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522585 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522588 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522590 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:53:23.524792 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522593 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522595 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522598 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522601 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522603 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522607 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522611 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
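The wall of "unrecognized feature gate" warnings above is noisy but mostly repetitive: the same gate names (they look like OpenShift cluster-side gates this kubelet build does not register) are reported twice with different timestamps. A quick way to reduce a journal dump like this to a unique gate list is a small parser; the sketch below is an assumption-free regex over the warning text only, not an official tool, and the function name `unknown_gates` is mine.

```python
import re

# Matches the klog line emitted by feature_gate.go:328 for gates the
# kubelet does not recognize, capturing only the gate name, e.g.:
#   ... feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
GATE_RE = re.compile(r"unrecognized feature gate: (\S+)")

def unknown_gates(journal_text: str) -> list[str]:
    """Return the sorted, de-duplicated gate names found in a journal dump."""
    return sorted(set(GATE_RE.findall(journal_text)))

if __name__ == "__main__":
    sample = (
        "Apr 23 17:53:23.521052 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521033 2578 "
        "feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification\n"
        "Apr 23 17:53:23.521052 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.521049 2578 "
        "feature_gate.go:328] unrecognized feature gate: MachineAPIMigration\n"
        "Apr 23 17:53:23.524296 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522492 2578 "
        "feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification\n"
    )
    # The duplicated SigstoreImageVerification warning collapses to one entry.
    print(unknown_gates(sample))
```

Feeding it the full journal (e.g. the output of `journalctl -u kubelet`) would yield one line per distinct gate instead of two interleaved warning runs.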
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.522615 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522682 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522689 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522696 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522701 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522706 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522710 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522714 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522719 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522722 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522725 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522728 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522731 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522735 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522738 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 23 17:53:23.525297 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522740 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522743 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522746 2578 flags.go:64] FLAG: --cloud-config=""
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522749 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522752 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522756 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522759 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522762 2578 flags.go:64] FLAG: --config-dir=""
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522765 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522769 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522773 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522776 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522779 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522783 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522786 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522789 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522792 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522795 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522798 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522802 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522805 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522808 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522810 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522814 2578 flags.go:64] FLAG: --enable-server="true"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522817 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 17:53:23.525827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522822 2578 flags.go:64] FLAG: --event-burst="100"
Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522825 2578 flags.go:64] FLAG: --event-qps="50"
Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522828 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522831 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522834 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522838 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522841 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522844 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522847 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522850 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522853 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522856 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522859 2578 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522862 2578 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522865 2578 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522868 2578 flags.go:64] FLAG: --feature-gates=""
Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522872 2578 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522875 2578 flags.go:64] FLAG:
--global-housekeeping-interval="1m0s" Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522877 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522881 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522883 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522886 2578 flags.go:64] FLAG: --help="false" Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522889 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-142-63.ec2.internal" Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522892 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 17:53:23.526458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522895 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522897 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522901 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522904 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522907 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522909 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522912 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 17:53:23.527036 ip-10-0-142-63 
kubenswrapper[2578]: I0423 17:53:23.522915 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522918 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522922 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522925 2578 flags.go:64] FLAG: --kube-reserved="" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522928 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522930 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522933 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522936 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522939 2578 flags.go:64] FLAG: --lock-file="" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522942 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522945 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522948 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522953 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522956 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522960 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 
17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522962 2578 flags.go:64] FLAG: --logging-format="text" Apr 23 17:53:23.527036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522965 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522968 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522971 2578 flags.go:64] FLAG: --manifest-url="" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522974 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522978 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522981 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522985 2578 flags.go:64] FLAG: --max-pods="110" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522988 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522991 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522995 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.522998 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523001 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523004 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523007 2578 
flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523014 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523017 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523020 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523023 2578 flags.go:64] FLAG: --pod-cidr="" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523027 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523032 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523035 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523038 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523041 2578 flags.go:64] FLAG: --port="10250" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523044 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 17:53:23.527608 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523047 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-006c1047552982a1f" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523050 2578 flags.go:64] FLAG: --qos-reserved="" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523053 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 
17:53:23.523056 2578 flags.go:64] FLAG: --register-node="true" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523059 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523061 2578 flags.go:64] FLAG: --register-with-taints="" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523065 2578 flags.go:64] FLAG: --registry-burst="10" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523068 2578 flags.go:64] FLAG: --registry-qps="5" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523071 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523074 2578 flags.go:64] FLAG: --reserved-memory="" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523078 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523081 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523096 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523099 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523102 2578 flags.go:64] FLAG: --runonce="false" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523105 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523108 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523111 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 
17:53:23.523114 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523117 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523121 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523124 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523127 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523130 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523133 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523135 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 17:53:23.528203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523139 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523142 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523145 2578 flags.go:64] FLAG: --system-cgroups="" Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523148 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523153 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523157 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523159 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" 
Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523163 2578 flags.go:64] FLAG: --tls-min-version=""
Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523166 2578 flags.go:64] FLAG: --tls-private-key-file=""
Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523169 2578 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523172 2578 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523175 2578 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523178 2578 flags.go:64] FLAG: --v="2"
Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523182 2578 flags.go:64] FLAG: --version="false"
Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523186 2578 flags.go:64] FLAG: --vmodule=""
Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523190 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.523193 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523285 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523288 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523292 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523294 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523298 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523303 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523306 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:53:23.528827 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523308 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 17:53:23.529468 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523311 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 17:53:23.529468 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523314 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 17:53:23.529468 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523317 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 17:53:23.529468 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523319 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 17:53:23.529468 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523322 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:53:23.529468 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523324 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 17:53:23.529468 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523327 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 17:53:23.529468 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523330 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:53:23.529468 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523333 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 17:53:23.529468 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523335 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:53:23.529468 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523339 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 17:53:23.529468 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523343 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 17:53:23.529468 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523346 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:53:23.529468 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523349 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:53:23.529468 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523352 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 17:53:23.529468 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523354 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 17:53:23.529468 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523357 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 17:53:23.529468 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523361 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 17:53:23.529468 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523364 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523366 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523369 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523371 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523374 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523376 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523379 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523381 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523384 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523387 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523389 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523393 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523396 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523398 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523401 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523404 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523406 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523409 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523412 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523414 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 17:53:23.530005 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523417 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 17:53:23.530609 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523419 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 17:53:23.530609 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523422 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 17:53:23.530609 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523425 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 17:53:23.530609 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523428 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 17:53:23.530609 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523430 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:53:23.530609 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523433 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 17:53:23.530609 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523435 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:53:23.530609 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523438 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 17:53:23.530609 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523440 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 17:53:23.530609 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523443 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 17:53:23.530609 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523446 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:53:23.530609 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523449 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 17:53:23.530609 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523452 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 17:53:23.530609 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523454 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 17:53:23.530609 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523457 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:53:23.530609 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523461 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 17:53:23.530609 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523464 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 17:53:23.530609 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523467 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:53:23.530609 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523470 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523473 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523475 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523478 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523482 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523485 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523488 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523490 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523493 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523495 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523498 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523500 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523503 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523505 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523508 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523510 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523513 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523516 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523518 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523521 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 17:53:23.531103 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.523523 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 17:53:23.531598 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.524397 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 17:53:23.531598 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.530941 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 17:53:23.531598 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.530957 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 17:53:23.531598 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531009 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 17:53:23.531598 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531014 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 17:53:23.531598 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531018 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 17:53:23.531598 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531021 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 17:53:23.531598 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531024 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 17:53:23.531598 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531027 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 17:53:23.531598 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531029 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 17:53:23.531598 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531032 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 17:53:23.531598 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531035 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 17:53:23.531598 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531038 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 17:53:23.531598 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531040 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 17:53:23.531598 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531043 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 17:53:23.531598 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531045 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531048 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531050 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531053 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531055 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531058 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531061 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531064 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531067 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531069 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531072 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531074 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531077 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531081 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531096 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531099 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531102 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531106 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531109 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531113 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 17:53:23.532002 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531116 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 17:53:23.532501 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531119 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 17:53:23.532501 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531121 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 17:53:23.532501 ip-10-0-142-63
kubenswrapper[2578]: W0423 17:53:23.531124 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:53:23.532501 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531127 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:53:23.532501 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531130 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:53:23.532501 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531132 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:53:23.532501 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531135 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:53:23.532501 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531137 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:53:23.532501 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531141 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:53:23.532501 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531143 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:53:23.532501 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531146 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:53:23.532501 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531149 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:53:23.532501 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531151 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:53:23.532501 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531154 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:53:23.532501 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531156 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:53:23.532501 
ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531159 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:53:23.532501 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531161 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:53:23.532501 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531164 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:53:23.532501 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531168 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 17:53:23.532501 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531172 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:53:23.532974 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531175 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:53:23.532974 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531178 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:53:23.532974 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531180 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:53:23.532974 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531183 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 17:53:23.532974 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531186 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:53:23.532974 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531196 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:53:23.532974 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531199 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:53:23.532974 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531201 
2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:53:23.532974 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531204 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 17:53:23.532974 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531207 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:53:23.532974 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531209 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:53:23.532974 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531212 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:53:23.532974 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531215 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:53:23.532974 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531218 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:53:23.532974 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531221 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:53:23.532974 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531223 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:53:23.532974 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531226 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:53:23.532974 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531228 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:53:23.532974 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531231 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:53:23.533487 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531233 2578 feature_gate.go:328] unrecognized feature gate: 
InsightsOnDemandDataGather Apr 23 17:53:23.533487 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531236 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:53:23.533487 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531238 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 17:53:23.533487 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531241 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:53:23.533487 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531243 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:53:23.533487 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531246 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:53:23.533487 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531248 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:53:23.533487 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531251 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:53:23.533487 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531254 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:53:23.533487 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531256 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:53:23.533487 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531259 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:53:23.533487 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531263 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 17:53:23.533487 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531267 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:53:23.533487 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531270 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:53:23.533487 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.531276 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531392 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531397 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531400 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531403 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531406 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531409 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531412 2578 feature_gate.go:328] unrecognized feature gate: 
ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531415 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531418 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531420 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531423 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531426 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531428 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531431 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531433 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531436 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531438 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531441 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531443 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 17:53:23.533861 ip-10-0-142-63 kubenswrapper[2578]: W0423 
17:53:23.531446 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 17:53:23.534361 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531448 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 17:53:23.534361 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531451 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 17:53:23.534361 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531454 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 17:53:23.534361 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531456 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 17:53:23.534361 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531459 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 17:53:23.534361 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531462 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 17:53:23.534361 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531464 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 17:53:23.534361 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531468 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 17:53:23.534361 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531471 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 17:53:23.534361 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531474 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 17:53:23.534361 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531477 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 17:53:23.534361 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531480 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 17:53:23.534361 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531483 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 17:53:23.534361 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531486 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 17:53:23.534361 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531488 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 17:53:23.534361 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531491 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 17:53:23.534361 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531494 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 17:53:23.534361 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531498 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 17:53:23.534361 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531501 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 17:53:23.534936 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531504 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 17:53:23.534936 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531506 2578 
feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 17:53:23.534936 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531509 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 17:53:23.534936 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531511 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 17:53:23.534936 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531514 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 17:53:23.534936 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531517 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 17:53:23.534936 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531519 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 17:53:23.534936 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531521 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 17:53:23.534936 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531524 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 17:53:23.534936 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531526 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 17:53:23.534936 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531529 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 17:53:23.534936 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531531 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 17:53:23.534936 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531534 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 17:53:23.534936 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531536 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 17:53:23.534936 ip-10-0-142-63 
kubenswrapper[2578]: W0423 17:53:23.531539 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 17:53:23.534936 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531542 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 17:53:23.534936 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531545 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 17:53:23.534936 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531548 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 17:53:23.534936 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531551 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 17:53:23.534936 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531553 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531556 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531558 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531560 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531563 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531565 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531568 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531571 2578 feature_gate.go:328] unrecognized feature 
gate: MachineConfigNodes Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531573 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531575 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531578 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531580 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531584 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531587 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531589 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531592 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531594 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531597 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531599 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531602 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 17:53:23.535566 ip-10-0-142-63 kubenswrapper[2578]: 
W0423 17:53:23.531604 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 17:53:23.536182 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531607 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 17:53:23.536182 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531610 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 17:53:23.536182 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531612 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 17:53:23.536182 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531615 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 17:53:23.536182 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531617 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 17:53:23.536182 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531620 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 17:53:23.536182 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:23.531622 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 17:53:23.536182 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.531627 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 17:53:23.536182 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.531756 2578 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 17:53:23.536182 ip-10-0-142-63 
kubenswrapper[2578]: I0423 17:53:23.534605 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 17:53:23.536182 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.535622 2578 server.go:1019] "Starting client certificate rotation" Apr 23 17:53:23.536182 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.535735 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 17:53:23.536182 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.535792 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 17:53:23.562680 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.562654 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 17:53:23.566038 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.566015 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 17:53:23.582238 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.582213 2578 log.go:25] "Validated CRI v1 runtime API" Apr 23 17:53:23.587630 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.587611 2578 log.go:25] "Validated CRI v1 image API" Apr 23 17:53:23.588754 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.588738 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 23 17:53:23.591935 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.591913 2578 fs.go:135] Filesystem UUIDs: map[5a3b650c-fa1b-4060-83d7-c38193da2989:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 8870055e-c516-4efc-9ec8-bb0be5a642c8:/dev/nvme0n1p4] Apr 23 17:53:23.591999 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.591935 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} 
/dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 23 17:53:23.593146 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.593130 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 17:53:23.596864 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.596762 2578 manager.go:217] Machine: {Timestamp:2026-04-23 17:53:23.595592001 +0000 UTC m=+0.418059727 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101210 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b0b567e036a06b5a0c3d1e7df050a SystemUUID:ec2b0b56-7e03-6a06-b5a0-c3d1e7df050a BootID:5eb1d916-168a-49c2-a2c1-e4e95df02a9a Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] 
NetworkDevices:[{Name:br-ex MacAddress:02:59:6b:54:c4:8b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:59:6b:54:c4:8b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6a:59:b3:b3:aa:f8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 23 17:53:23.596864 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.596858 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 23 17:53:23.596971 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.596932 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 23 17:53:23.598648 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.598622 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 23 17:53:23.598778 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.598650 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-142-63.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 17:53:23.598823 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.598790 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 17:53:23.598823 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.598797 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 17:53:23.598823 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.598811 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 17:53:23.599752 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.599741 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 17:53:23.601106 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.601081 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 23 17:53:23.601372 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.601362 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 17:53:23.603743 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.603732 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 23 17:53:23.603795 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.603751 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 17:53:23.603795 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.603763 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 17:53:23.603795 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.603773 2578 kubelet.go:397] "Adding apiserver pod source" Apr 23 17:53:23.603795 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.603781 2578 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 23 17:53:23.604823 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.604811 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 17:53:23.604872 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.604830 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 17:53:23.607638 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.607624 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 17:53:23.608964 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.608951 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 17:53:23.610730 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.610720 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 17:53:23.610774 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.610736 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 17:53:23.610774 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.610742 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 17:53:23.610774 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.610748 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 17:53:23.610774 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.610761 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 17:53:23.610774 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.610769 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 17:53:23.610902 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.610785 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 
17:53:23.610902 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.610791 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 17:53:23.610902 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.610798 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 17:53:23.610902 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.610803 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 17:53:23.610902 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.610817 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 17:53:23.610902 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.610827 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 17:53:23.611661 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.611643 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 17:53:23.611661 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.611664 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 17:53:23.615325 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.615301 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 17:53:23.615426 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.615362 2578 server.go:1295] "Started kubelet" Apr 23 17:53:23.616416 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.616374 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 17:53:23.616416 ip-10-0-142-63 systemd[1]: Started Kubernetes Kubelet. 
Apr 23 17:53:23.616637 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.616586 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 17:53:23.616743 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.616652 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 17:53:23.617795 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.617777 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 17:53:23.618299 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.618281 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-142-63.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 17:53:23.618299 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:23.618281 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-142-63.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 17:53:23.618452 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:23.618359 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 17:53:23.619519 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.619505 2578 server.go:317] "Adding debug handlers to kubelet server" Apr 23 17:53:23.624722 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:23.623638 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-142-63.ec2.internal.18a90de4a32710e5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-142-63.ec2.internal,UID:ip-10-0-142-63.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-142-63.ec2.internal,},FirstTimestamp:2026-04-23 17:53:23.615322341 +0000 UTC m=+0.437790068,LastTimestamp:2026-04-23 17:53:23.615322341 +0000 UTC m=+0.437790068,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-142-63.ec2.internal,}" Apr 23 17:53:23.625349 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.625332 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 17:53:23.625896 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.625883 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 17:53:23.626683 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.626660 2578 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 17:53:23.626683 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.626684 2578 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 17:53:23.626840 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.626828 2578 reconstruct.go:97] "Volume reconstruction finished" Apr 23 17:53:23.626888 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.626842 2578 reconciler.go:26] "Reconciler: start to sync state" Apr 23 17:53:23.626981 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:23.626966 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-63.ec2.internal\" not found" Apr 23 17:53:23.627128 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.627109 2578 
factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 17:53:23.627128 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.627126 2578 factory.go:55] Registering systemd factory Apr 23 17:53:23.627235 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.627135 2578 factory.go:223] Registration of the systemd container factory successfully Apr 23 17:53:23.627376 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.627360 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 17:53:23.628311 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.628294 2578 factory.go:153] Registering CRI-O factory Apr 23 17:53:23.628437 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.628427 2578 factory.go:223] Registration of the crio container factory successfully Apr 23 17:53:23.628545 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.628536 2578 factory.go:103] Registering Raw factory Apr 23 17:53:23.628646 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.628638 2578 manager.go:1196] Started watching for new ooms in manager Apr 23 17:53:23.629277 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.629265 2578 manager.go:319] Starting recovery of all containers Apr 23 17:53:23.639319 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.638976 2578 manager.go:324] Recovery completed Apr 23 17:53:23.639452 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:23.639427 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 23 17:53:23.639511 ip-10-0-142-63 kubenswrapper[2578]: E0423 
17:53:23.639485 2578 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-142-63.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 23 17:53:23.639740 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.639720 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-n775j" Apr 23 17:53:23.643583 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.643571 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:23.645846 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.645829 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-63.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:23.645903 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.645859 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:23.645903 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.645872 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-63.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:23.646337 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.646321 2578 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 17:53:23.646337 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.646337 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 17:53:23.646437 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.646352 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 23 17:53:23.649267 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.649255 2578 policy_none.go:49] "None policy: Start" Apr 23 17:53:23.649330 ip-10-0-142-63 kubenswrapper[2578]: I0423 
17:53:23.649271 2578 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 17:53:23.649330 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.649281 2578 state_mem.go:35] "Initializing new in-memory state store" Apr 23 17:53:23.649642 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.649626 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-n775j" Apr 23 17:53:23.698279 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.697630 2578 manager.go:341] "Starting Device Plugin manager" Apr 23 17:53:23.698279 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:23.697660 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 17:53:23.698279 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.697672 2578 server.go:85] "Starting device plugin registration server" Apr 23 17:53:23.698279 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.697905 2578 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 17:53:23.698279 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.697918 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 17:53:23.698279 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.698002 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 17:53:23.698279 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.698100 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 17:53:23.698279 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.698110 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 17:53:23.698704 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:23.698542 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 23 17:53:23.698704 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:23.698576 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-142-63.ec2.internal\" not found" Apr 23 17:53:23.747477 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.747451 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 17:53:23.748644 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.748629 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 17:53:23.748717 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.748655 2578 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 17:53:23.748717 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.748676 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 23 17:53:23.748717 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.748686 2578 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 17:53:23.748832 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:23.748725 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 17:53:23.750491 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.750471 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 17:53:23.798567 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.798511 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:23.799518 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.799504 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-63.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:23.799577 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.799534 2578 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:23.799577 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.799544 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-63.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:23.799577 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.799566 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-142-63.ec2.internal" Apr 23 17:53:23.808891 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.808877 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-142-63.ec2.internal" Apr 23 17:53:23.808955 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:23.808898 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-142-63.ec2.internal\": node \"ip-10-0-142-63.ec2.internal\" not found" Apr 23 17:53:23.834772 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:23.834751 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-63.ec2.internal\" not found" Apr 23 17:53:23.849358 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.849338 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-142-63.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal"] Apr 23 17:53:23.849437 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.849394 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:23.850755 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.850741 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-63.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:23.850825 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.850768 2578 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-142-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:23.850825 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.850786 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-63.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:23.852045 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.852033 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:23.852219 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.852204 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-63.ec2.internal" Apr 23 17:53:23.852270 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.852233 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:23.852721 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.852705 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-63.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:23.852785 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.852723 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-63.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:23.852785 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.852731 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:23.852785 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.852740 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-63.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:23.852785 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.852744 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:23.852785 ip-10-0-142-63 
kubenswrapper[2578]: I0423 17:53:23.852757 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-63.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:23.853942 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.853924 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal" Apr 23 17:53:23.854007 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.853949 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 17:53:23.854555 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.854538 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-63.ec2.internal" event="NodeHasSufficientMemory" Apr 23 17:53:23.854555 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.854558 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-63.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 17:53:23.854684 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.854571 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-63.ec2.internal" event="NodeHasSufficientPID" Apr 23 17:53:23.874735 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:23.874715 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-63.ec2.internal\" not found" node="ip-10-0-142-63.ec2.internal" Apr 23 17:53:23.879111 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:23.879096 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-142-63.ec2.internal\" not found" node="ip-10-0-142-63.ec2.internal" Apr 23 17:53:23.928827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.928806 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/7e0da96a27e0b9059a934c06d8e50b1f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal\" (UID: \"7e0da96a27e0b9059a934c06d8e50b1f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal" Apr 23 17:53:23.928904 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.928831 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e0da96a27e0b9059a934c06d8e50b1f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal\" (UID: \"7e0da96a27e0b9059a934c06d8e50b1f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal" Apr 23 17:53:23.928904 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:23.928848 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fb1b5e25e4662b1f0eb7136557c5a4df-config\") pod \"kube-apiserver-proxy-ip-10-0-142-63.ec2.internal\" (UID: \"fb1b5e25e4662b1f0eb7136557c5a4df\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-63.ec2.internal" Apr 23 17:53:23.934861 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:23.934845 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-63.ec2.internal\" not found" Apr 23 17:53:24.029803 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.029781 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e0da96a27e0b9059a934c06d8e50b1f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal\" (UID: \"7e0da96a27e0b9059a934c06d8e50b1f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal" Apr 23 17:53:24.029877 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.029811 2578 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e0da96a27e0b9059a934c06d8e50b1f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal\" (UID: \"7e0da96a27e0b9059a934c06d8e50b1f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal" Apr 23 17:53:24.029877 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.029828 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fb1b5e25e4662b1f0eb7136557c5a4df-config\") pod \"kube-apiserver-proxy-ip-10-0-142-63.ec2.internal\" (UID: \"fb1b5e25e4662b1f0eb7136557c5a4df\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-142-63.ec2.internal" Apr 23 17:53:24.029877 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.029860 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e0da96a27e0b9059a934c06d8e50b1f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal\" (UID: \"7e0da96a27e0b9059a934c06d8e50b1f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal" Apr 23 17:53:24.029877 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.029868 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7e0da96a27e0b9059a934c06d8e50b1f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal\" (UID: \"7e0da96a27e0b9059a934c06d8e50b1f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal" Apr 23 17:53:24.030009 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.029906 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/fb1b5e25e4662b1f0eb7136557c5a4df-config\") pod \"kube-apiserver-proxy-ip-10-0-142-63.ec2.internal\" (UID: \"fb1b5e25e4662b1f0eb7136557c5a4df\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-142-63.ec2.internal"
Apr 23 17:53:24.035900 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:24.035878 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-63.ec2.internal\" not found"
Apr 23 17:53:24.136744 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:24.136677 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-63.ec2.internal\" not found"
Apr 23 17:53:24.176889 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.176863 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-63.ec2.internal"
Apr 23 17:53:24.181418 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.181395 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal"
Apr 23 17:53:24.237305 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:24.237275 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-63.ec2.internal\" not found"
Apr 23 17:53:24.337839 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:24.337813 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-63.ec2.internal\" not found"
Apr 23 17:53:24.438432 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:24.438360 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-63.ec2.internal\" not found"
Apr 23 17:53:24.501507 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.501480 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:53:24.535447 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.535426 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 17:53:24.536074 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.535554 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 17:53:24.536074 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.535580 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 17:53:24.538572 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:24.538550 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-63.ec2.internal\" not found"
Apr 23 17:53:24.626053 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.626023 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 17:53:24.638346 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.638326 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 17:53:24.639315 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:24.639298 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-63.ec2.internal\" not found"
Apr 23 17:53:24.652458 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.652412 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 17:48:23 +0000 UTC" deadline="2027-10-10 08:31:59.769575021 +0000 UTC"
Apr 23 17:53:24.652552 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.652459 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12830h38m35.117121837s"
Apr 23 17:53:24.657306 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.657286 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-7lzt2"
Apr 23 17:53:24.665701 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.665682 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-7lzt2"
Apr 23 17:53:24.684498 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:24.684460 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb1b5e25e4662b1f0eb7136557c5a4df.slice/crio-db8669722601cc92b941a50e35ccd2f9a28496220eaf13116e346e0d266efbf3 WatchSource:0}: Error finding container db8669722601cc92b941a50e35ccd2f9a28496220eaf13116e346e0d266efbf3: Status 404 returned error can't find the container with id db8669722601cc92b941a50e35ccd2f9a28496220eaf13116e346e0d266efbf3
Apr 23 17:53:24.684769 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:24.684754 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e0da96a27e0b9059a934c06d8e50b1f.slice/crio-963a15068b2c98c29b1207f0abb0f835b3e1f1d54a2a62066d5ca96687e634c9 WatchSource:0}: Error finding container 963a15068b2c98c29b1207f0abb0f835b3e1f1d54a2a62066d5ca96687e634c9: Status 404 returned error can't find the container with id 963a15068b2c98c29b1207f0abb0f835b3e1f1d54a2a62066d5ca96687e634c9
Apr 23 17:53:24.688413 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.688399 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:53:24.739659 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:24.739634 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-63.ec2.internal\" not found"
Apr 23 17:53:24.751373 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.751329 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal" event={"ID":"7e0da96a27e0b9059a934c06d8e50b1f","Type":"ContainerStarted","Data":"963a15068b2c98c29b1207f0abb0f835b3e1f1d54a2a62066d5ca96687e634c9"}
Apr 23 17:53:24.752261 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:24.752242 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-63.ec2.internal" event={"ID":"fb1b5e25e4662b1f0eb7136557c5a4df","Type":"ContainerStarted","Data":"db8669722601cc92b941a50e35ccd2f9a28496220eaf13116e346e0d266efbf3"}
Apr 23 17:53:24.839742 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:24.839722 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-63.ec2.internal\" not found"
Apr 23 17:53:24.940342 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:24.940270 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-63.ec2.internal\" not found"
Apr 23 17:53:25.040913 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:25.040868 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-63.ec2.internal\" not found"
Apr 23 17:53:25.118844 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.118821 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:53:25.141156 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:25.141129 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-142-63.ec2.internal\" not found"
Apr 23 17:53:25.221204 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.221130 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:53:25.226439 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.226414 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-63.ec2.internal"
Apr 23 17:53:25.235165 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.235140 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 17:53:25.236277 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.236257 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal"
Apr 23 17:53:25.248983 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.248937 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 17:53:25.605743 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.605673 2578 apiserver.go:52] "Watching apiserver"
Apr 23 17:53:25.611871 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.611846 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:53:25.613205 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.613185 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 17:53:25.613556 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.613533 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-142-63.ec2.internal","openshift-dns/node-resolver-mn726","openshift-image-registry/node-ca-qjrbv","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal","openshift-multus/multus-48vg7","openshift-multus/multus-additional-cni-plugins-2krs5","openshift-multus/network-metrics-daemon-mqfsb","openshift-network-diagnostics/network-check-target-bztd4","kube-system/konnectivity-agent-442hb","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78","openshift-cluster-node-tuning-operator/tuned-t84lj","openshift-network-operator/iptables-alerter-xvbsl","openshift-ovn-kubernetes/ovnkube-node-x2gvq"]
Apr 23 17:53:25.615708 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.615688 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2krs5"
Apr 23 17:53:25.616885 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.616861 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bztd4"
Apr 23 17:53:25.616981 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:25.616943 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bztd4" podUID="0c49641e-88eb-49d0-b1e0-5408152b701d"
Apr 23 17:53:25.618394 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.618189 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 17:53:25.618394 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.618235 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 17:53:25.618394 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.618236 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 17:53:25.618394 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.618240 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-l2cdq\""
Apr 23 17:53:25.618394 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.618287 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 17:53:25.618394 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.618397 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 17:53:25.619479 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.619458 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qjrbv"
Apr 23 17:53:25.620545 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.620514 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78"
Apr 23 17:53:25.620856 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.620788 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-48vg7"
Apr 23 17:53:25.621842 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.621826 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 17:53:25.621931 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.621871 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 17:53:25.621931 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.621871 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-p7l7k\""
Apr 23 17:53:25.622212 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.622194 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqfsb"
Apr 23 17:53:25.622212 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.622203 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 17:53:25.622335 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:25.622293 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqfsb" podUID="e70550da-839d-4462-b368-c0139f793c15"
Apr 23 17:53:25.622905 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.622883 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 17:53:25.622993 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.622904 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 17:53:25.622993 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.622936 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 17:53:25.622993 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.622964 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8s92q\""
Apr 23 17:53:25.623386 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.623365 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 17:53:25.623489 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.623427 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jl6x2\""
Apr 23 17:53:25.624480 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.624372 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mn726"
Apr 23 17:53:25.627594 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.627296 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-28r6m\""
Apr 23 17:53:25.627594 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.627415 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 17:53:25.627594 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.627300 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 17:53:25.628648 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.628630 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-442hb"
Apr 23 17:53:25.628743 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.628729 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-t84lj"
Apr 23 17:53:25.630128 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.630111 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xvbsl"
Apr 23 17:53:25.631073 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.631055 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 17:53:25.631177 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.631110 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-bvtlq\""
Apr 23 17:53:25.631177 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.631140 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 17:53:25.631282 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.631199 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 17:53:25.631282 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.631061 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-mszg9\""
Apr 23 17:53:25.631474 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.631381 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 17:53:25.631950 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.631930 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq"
Apr 23 17:53:25.632658 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.632630 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-qg2c5\""
Apr 23 17:53:25.632754 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.632735 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 17:53:25.632885 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.632866 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 17:53:25.633000 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.632982 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 17:53:25.634286 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.634267 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 17:53:25.634475 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.634462 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 17:53:25.634597 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.634578 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 17:53:25.634698 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.634681 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 17:53:25.634834 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.634815 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sqdmk\""
Apr 23 17:53:25.634911 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.634902 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 17:53:25.634969 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.634718 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 17:53:25.638066 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638044 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-run\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj"
Apr 23 17:53:25.638175 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638096 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8c323db9-9645-43cb-b997-4e141600d264-registration-dir\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78"
Apr 23 17:53:25.638175 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638122 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-modprobe-d\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj"
Apr 23 17:53:25.638175 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638145 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5a131586-128b-4207-ac02-4240d9075bc2-system-cni-dir\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5"
Apr 23 17:53:25.638175 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638169 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x88qv\" (UniqueName: \"kubernetes.io/projected/8c323db9-9645-43cb-b997-4e141600d264-kube-api-access-x88qv\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78"
Apr 23 17:53:25.638378 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638220 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-etc-kubernetes\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7"
Apr 23 17:53:25.638378 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638252 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8c323db9-9645-43cb-b997-4e141600d264-etc-selinux\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78"
Apr 23 17:53:25.638378 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638293 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-os-release\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7"
Apr 23 17:53:25.638378 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638335 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-sysctl-d\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj"
Apr 23 17:53:25.638551 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638383 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtn2s\" (UniqueName: \"kubernetes.io/projected/5a131586-128b-4207-ac02-4240d9075bc2-kube-api-access-rtn2s\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5"
Apr 23 17:53:25.638551 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638411 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-host-run-netns\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7"
Apr 23 17:53:25.638551 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638434 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-cnibin\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7"
Apr 23 17:53:25.638551 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638461 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs\") pod \"network-metrics-daemon-mqfsb\" (UID: \"e70550da-839d-4462-b368-c0139f793c15\") " pod="openshift-multus/network-metrics-daemon-mqfsb"
Apr 23 17:53:25.638551 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638507 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-host-var-lib-cni-multus\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7"
Apr 23 17:53:25.638551 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638535 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-host-var-lib-kubelet\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7"
Apr 23 17:53:25.638823 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638558 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-sys\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj"
Apr 23 17:53:25.638823 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638595 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8c323db9-9645-43cb-b997-4e141600d264-sys-fs\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78"
Apr 23 17:53:25.638823 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638645 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdfs7\" (UniqueName: \"kubernetes.io/projected/7560deb4-54dc-4f99-a04b-c7e973e8b201-kube-api-access-cdfs7\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj"
Apr 23 17:53:25.638823 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638679 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5a131586-128b-4207-ac02-4240d9075bc2-cni-binary-copy\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5"
Apr 23 17:53:25.638823 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638708 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/308874bf-36fb-4296-aa6f-8568677e83c4-serviceca\") pod \"node-ca-qjrbv\" (UID: \"308874bf-36fb-4296-aa6f-8568677e83c4\") " pod="openshift-image-registry/node-ca-qjrbv"
Apr 23 17:53:25.638823 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638743 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8c323db9-9645-43cb-b997-4e141600d264-device-dir\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78"
Apr 23 17:53:25.638823 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638768 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/97a978cb-3849-4c89-bce7-b7b3126e771f-agent-certs\") pod \"konnectivity-agent-442hb\" (UID: \"97a978cb-3849-4c89-bce7-b7b3126e771f\") " pod="kube-system/konnectivity-agent-442hb"
Apr 23 17:53:25.638823 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638799 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-cni-binary-copy\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7"
Apr 23 17:53:25.638823 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638821 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-hostroot\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7"
Apr 23 17:53:25.639249 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638844 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/97a978cb-3849-4c89-bce7-b7b3126e771f-konnectivity-ca\") pod \"konnectivity-agent-442hb\" (UID: \"97a978cb-3849-4c89-bce7-b7b3126e771f\") " pod="kube-system/konnectivity-agent-442hb"
Apr 23 17:53:25.639249 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638870 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nxzm\" (UniqueName: \"kubernetes.io/projected/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-kube-api-access-6nxzm\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7"
Apr 23 17:53:25.639249 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638895 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-var-lib-kubelet\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj"
Apr 23 17:53:25.639249 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638934 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5a131586-128b-4207-ac02-4240d9075bc2-os-release\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5"
Apr 23 17:53:25.639249 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638958 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/890e39b8-16d9-4ffa-9934-ca657c99daf2-tmp-dir\") pod \"node-resolver-mn726\" (UID: \"890e39b8-16d9-4ffa-9934-ca657c99daf2\") " pod="openshift-dns/node-resolver-mn726"
Apr 23 17:53:25.639249 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.638984 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c64dq\" (UniqueName: \"kubernetes.io/projected/e70550da-839d-4462-b368-c0139f793c15-kube-api-access-c64dq\") pod \"network-metrics-daemon-mqfsb\" (UID: \"e70550da-839d-4462-b368-c0139f793c15\") " pod="openshift-multus/network-metrics-daemon-mqfsb"
Apr 23 17:53:25.639249 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639007 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-kubernetes\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj"
Apr 23 17:53:25.639249 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639031 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-host\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj"
Apr 23 17:53:25.639249 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639055 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5a131586-128b-4207-ac02-4240d9075bc2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5"
Apr 23 17:53:25.639249 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639079 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96bdh\" (UniqueName: \"kubernetes.io/projected/890e39b8-16d9-4ffa-9934-ca657c99daf2-kube-api-access-96bdh\") pod \"node-resolver-mn726\" (UID: \"890e39b8-16d9-4ffa-9934-ca657c99daf2\") " pod="openshift-dns/node-resolver-mn726"
Apr 23 17:53:25.639249 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639121 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-multus-conf-dir\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7"
Apr 23 17:53:25.639249 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639143 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-sysctl-conf\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj"
Apr 23 17:53:25.639249 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639178 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5a131586-128b-4207-ac02-4240d9075bc2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5"
Apr 23 17:53:25.639249 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639202 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49d2h\" (UniqueName: \"kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h\") pod \"network-check-target-bztd4\" (UID: \"0c49641e-88eb-49d0-b1e0-5408152b701d\") " pod="openshift-network-diagnostics/network-check-target-bztd4"
Apr 23 17:53:25.639249 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639223 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/308874bf-36fb-4296-aa6f-8568677e83c4-host\") pod \"node-ca-qjrbv\" (UID: \"308874bf-36fb-4296-aa6f-8568677e83c4\") " pod="openshift-image-registry/node-ca-qjrbv"
Apr 23 17:53:25.640463 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639256 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c323db9-9645-43cb-b997-4e141600d264-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78"
Apr 23 17:53:25.640463 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639295 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-host-run-multus-certs\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7"
Apr 23 17:53:25.640463 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639352 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5a131586-128b-4207-ac02-4240d9075bc2-cnibin\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5"
Apr 23 17:53:25.640463 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639376 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8c323db9-9645-43cb-b997-4e141600d264-socket-dir\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78"
Apr 23 17:53:25.640463 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639406 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/890e39b8-16d9-4ffa-9934-ca657c99daf2-hosts-file\") pod \"node-resolver-mn726\" (UID: \"890e39b8-16d9-4ffa-9934-ca657c99daf2\") " pod="openshift-dns/node-resolver-mn726"
Apr 23 17:53:25.640463 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639448 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-system-cni-dir\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7"
Apr 23 17:53:25.640463 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639480 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-host-var-lib-cni-bin\") pod \"multus-48vg7\"
(UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.640463 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639516 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-lib-modules\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.640463 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639608 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-tuned\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.640463 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639634 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5a131586-128b-4207-ac02-4240d9075bc2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5" Apr 23 17:53:25.640463 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639665 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-multus-cni-dir\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.640463 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639689 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-multus-daemon-config\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.640463 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639713 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7560deb4-54dc-4f99-a04b-c7e973e8b201-tmp\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.640463 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639735 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq9vf\" (UniqueName: \"kubernetes.io/projected/308874bf-36fb-4296-aa6f-8568677e83c4-kube-api-access-lq9vf\") pod \"node-ca-qjrbv\" (UID: \"308874bf-36fb-4296-aa6f-8568677e83c4\") " pod="openshift-image-registry/node-ca-qjrbv" Apr 23 17:53:25.640463 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639758 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-multus-socket-dir-parent\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.640463 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639820 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-host-run-k8s-cni-cncf-io\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.641152 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639855 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-sysconfig\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.641152 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.639878 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-systemd\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.667943 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.667870 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:48:24 +0000 UTC" deadline="2027-12-27 02:37:19.413015214 +0000 UTC" Apr 23 17:53:25.667943 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.667900 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14696h43m53.745118517s" Apr 23 17:53:25.728478 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.728445 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 17:53:25.740486 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.740453 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-host-run-netns\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.740618 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.740499 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/2467946b-effa-4a29-a822-8670defce032-host-slash\") pod \"iptables-alerter-xvbsl\" (UID: \"2467946b-effa-4a29-a822-8670defce032\") " pod="openshift-network-operator/iptables-alerter-xvbsl" Apr 23 17:53:25.740618 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.740545 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-cnibin\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.740618 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.740569 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs\") pod \"network-metrics-daemon-mqfsb\" (UID: \"e70550da-839d-4462-b368-c0139f793c15\") " pod="openshift-multus/network-metrics-daemon-mqfsb" Apr 23 17:53:25.740618 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.740572 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-host-run-netns\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.740618 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.740585 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-host-var-lib-cni-multus\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.740867 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.740641 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-host-var-lib-kubelet\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.740867 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.740657 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-cnibin\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.740867 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:25.740742 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:53:25.740867 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.740759 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-host-var-lib-cni-multus\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.740867 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.740785 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-sys\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.740867 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.740822 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-host-var-lib-kubelet\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.740867 ip-10-0-142-63 kubenswrapper[2578]: 
E0423 17:53:25.740836 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs podName:e70550da-839d-4462-b368-c0139f793c15 nodeName:}" failed. No retries permitted until 2026-04-23 17:53:26.240799734 +0000 UTC m=+3.063267464 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs") pod "network-metrics-daemon-mqfsb" (UID: "e70550da-839d-4462-b368-c0139f793c15") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:53:25.740867 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.740860 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8c323db9-9645-43cb-b997-4e141600d264-sys-fs\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" Apr 23 17:53:25.741257 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.740896 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-sys\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.741257 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.740933 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdfs7\" (UniqueName: \"kubernetes.io/projected/7560deb4-54dc-4f99-a04b-c7e973e8b201-kube-api-access-cdfs7\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.741257 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.740965 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5a131586-128b-4207-ac02-4240d9075bc2-cni-binary-copy\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5" Apr 23 17:53:25.741257 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.740972 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8c323db9-9645-43cb-b997-4e141600d264-sys-fs\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" Apr 23 17:53:25.741257 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.740991 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/308874bf-36fb-4296-aa6f-8568677e83c4-serviceca\") pod \"node-ca-qjrbv\" (UID: \"308874bf-36fb-4296-aa6f-8568677e83c4\") " pod="openshift-image-registry/node-ca-qjrbv" Apr 23 17:53:25.741257 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741017 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8c323db9-9645-43cb-b997-4e141600d264-device-dir\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" Apr 23 17:53:25.741257 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741045 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-log-socket\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.741257 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741069 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/97a978cb-3849-4c89-bce7-b7b3126e771f-agent-certs\") pod \"konnectivity-agent-442hb\" (UID: \"97a978cb-3849-4c89-bce7-b7b3126e771f\") " pod="kube-system/konnectivity-agent-442hb" Apr 23 17:53:25.741257 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741101 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-cni-binary-copy\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.741257 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741120 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-hostroot\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.741257 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741162 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/97a978cb-3849-4c89-bce7-b7b3126e771f-konnectivity-ca\") pod \"konnectivity-agent-442hb\" (UID: \"97a978cb-3849-4c89-bce7-b7b3126e771f\") " pod="kube-system/konnectivity-agent-442hb" Apr 23 17:53:25.741257 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741180 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8c323db9-9645-43cb-b997-4e141600d264-device-dir\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" Apr 23 17:53:25.741257 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741191 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vnkv\" (UniqueName: \"kubernetes.io/projected/2467946b-effa-4a29-a822-8670defce032-kube-api-access-6vnkv\") pod \"iptables-alerter-xvbsl\" (UID: \"2467946b-effa-4a29-a822-8670defce032\") " pod="openshift-network-operator/iptables-alerter-xvbsl" Apr 23 17:53:25.741257 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741202 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-hostroot\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.741257 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741218 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nxzm\" (UniqueName: \"kubernetes.io/projected/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-kube-api-access-6nxzm\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.741257 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741243 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-var-lib-kubelet\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.741257 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741267 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5a131586-128b-4207-ac02-4240d9075bc2-os-release\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5" Apr 23 17:53:25.742003 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741289 
2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/890e39b8-16d9-4ffa-9934-ca657c99daf2-tmp-dir\") pod \"node-resolver-mn726\" (UID: \"890e39b8-16d9-4ffa-9934-ca657c99daf2\") " pod="openshift-dns/node-resolver-mn726" Apr 23 17:53:25.742003 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741330 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-run-netns\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.742003 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741355 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-run-openvswitch\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.742003 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741385 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.742003 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741413 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-ovnkube-script-lib\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.742003 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741437 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c64dq\" (UniqueName: \"kubernetes.io/projected/e70550da-839d-4462-b368-c0139f793c15-kube-api-access-c64dq\") pod \"network-metrics-daemon-mqfsb\" (UID: \"e70550da-839d-4462-b368-c0139f793c15\") " pod="openshift-multus/network-metrics-daemon-mqfsb" Apr 23 17:53:25.742003 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741463 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-kubernetes\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.742003 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741482 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-host\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.742003 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741485 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/308874bf-36fb-4296-aa6f-8568677e83c4-serviceca\") pod \"node-ca-qjrbv\" (UID: \"308874bf-36fb-4296-aa6f-8568677e83c4\") " pod="openshift-image-registry/node-ca-qjrbv" Apr 23 17:53:25.742003 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741497 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5a131586-128b-4207-ac02-4240d9075bc2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2krs5\" (UID: 
\"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5" Apr 23 17:53:25.742003 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741513 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96bdh\" (UniqueName: \"kubernetes.io/projected/890e39b8-16d9-4ffa-9934-ca657c99daf2-kube-api-access-96bdh\") pod \"node-resolver-mn726\" (UID: \"890e39b8-16d9-4ffa-9934-ca657c99daf2\") " pod="openshift-dns/node-resolver-mn726" Apr 23 17:53:25.742003 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741536 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-kubelet\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.742003 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741576 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-run-systemd\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.742003 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741575 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 17:53:25.742003 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741593 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-cni-netd\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.742003 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741613 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-multus-conf-dir\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.742003 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741573 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5a131586-128b-4207-ac02-4240d9075bc2-cni-binary-copy\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5" Apr 23 17:53:25.742777 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741648 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-sysctl-conf\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.742777 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741668 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/5a131586-128b-4207-ac02-4240d9075bc2-os-release\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5" Apr 23 17:53:25.742777 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741356 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-var-lib-kubelet\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.742777 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741679 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5a131586-128b-4207-ac02-4240d9075bc2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5" Apr 23 17:53:25.742777 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741702 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-kubernetes\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.742777 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741705 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-host\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.742777 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741712 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-49d2h\" (UniqueName: \"kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h\") pod \"network-check-target-bztd4\" (UID: \"0c49641e-88eb-49d0-b1e0-5408152b701d\") " pod="openshift-network-diagnostics/network-check-target-bztd4" Apr 23 17:53:25.742777 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741743 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/97a978cb-3849-4c89-bce7-b7b3126e771f-konnectivity-ca\") pod \"konnectivity-agent-442hb\" (UID: \"97a978cb-3849-4c89-bce7-b7b3126e771f\") " pod="kube-system/konnectivity-agent-442hb" Apr 23 17:53:25.742777 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741747 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/308874bf-36fb-4296-aa6f-8568677e83c4-host\") pod \"node-ca-qjrbv\" (UID: \"308874bf-36fb-4296-aa6f-8568677e83c4\") " pod="openshift-image-registry/node-ca-qjrbv" Apr 23 17:53:25.742777 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741779 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5a131586-128b-4207-ac02-4240d9075bc2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5" Apr 23 17:53:25.742777 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741877 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c323db9-9645-43cb-b997-4e141600d264-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" Apr 23 17:53:25.742777 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741895 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/308874bf-36fb-4296-aa6f-8568677e83c4-host\") pod \"node-ca-qjrbv\" (UID: \"308874bf-36fb-4296-aa6f-8568677e83c4\") " pod="openshift-image-registry/node-ca-qjrbv" Apr 23 17:53:25.742777 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741915 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-run-ovn-kubernetes\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.742777 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.741947 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-cni-bin\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.742777 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742070 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c323db9-9645-43cb-b997-4e141600d264-kubelet-dir\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" Apr 23 17:53:25.742777 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742074 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-multus-conf-dir\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.742777 ip-10-0-142-63 kubenswrapper[2578]: I0423 
17:53:25.742100 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-host-run-multus-certs\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.743590 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742136 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5a131586-128b-4207-ac02-4240d9075bc2-cnibin\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5" Apr 23 17:53:25.743590 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742138 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-host-run-multus-certs\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.743590 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742163 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8c323db9-9645-43cb-b997-4e141600d264-socket-dir\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" Apr 23 17:53:25.743590 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742203 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5a131586-128b-4207-ac02-4240d9075bc2-cnibin\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5" Apr 23 17:53:25.743590 ip-10-0-142-63 
kubenswrapper[2578]: I0423 17:53:25.742219 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-sysctl-conf\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.743590 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742247 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/890e39b8-16d9-4ffa-9934-ca657c99daf2-hosts-file\") pod \"node-resolver-mn726\" (UID: \"890e39b8-16d9-4ffa-9934-ca657c99daf2\") " pod="openshift-dns/node-resolver-mn726" Apr 23 17:53:25.743590 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742297 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/890e39b8-16d9-4ffa-9934-ca657c99daf2-hosts-file\") pod \"node-resolver-mn726\" (UID: \"890e39b8-16d9-4ffa-9934-ca657c99daf2\") " pod="openshift-dns/node-resolver-mn726" Apr 23 17:53:25.743590 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742335 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-system-cni-dir\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.743590 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742334 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/890e39b8-16d9-4ffa-9934-ca657c99daf2-tmp-dir\") pod \"node-resolver-mn726\" (UID: \"890e39b8-16d9-4ffa-9934-ca657c99daf2\") " pod="openshift-dns/node-resolver-mn726" Apr 23 17:53:25.743590 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742370 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-host-var-lib-cni-bin\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.743590 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742396 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8c323db9-9645-43cb-b997-4e141600d264-socket-dir\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" Apr 23 17:53:25.743590 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742401 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-system-cni-dir\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.743590 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742426 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-lib-modules\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.743590 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742442 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-tuned\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.743590 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742448 2578 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-cni-binary-copy\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.743590 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742473 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5a131586-128b-4207-ac02-4240d9075bc2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5" Apr 23 17:53:25.743590 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742511 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-host-var-lib-cni-bin\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.744382 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742536 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2467946b-effa-4a29-a822-8670defce032-iptables-alerter-script\") pod \"iptables-alerter-xvbsl\" (UID: \"2467946b-effa-4a29-a822-8670defce032\") " pod="openshift-network-operator/iptables-alerter-xvbsl" Apr 23 17:53:25.744382 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742578 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-lib-modules\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.744382 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742625 
2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-systemd-units\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.744382 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742655 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-multus-cni-dir\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.744382 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742683 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-multus-daemon-config\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.744382 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742731 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7560deb4-54dc-4f99-a04b-c7e973e8b201-tmp\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.744382 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742781 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lq9vf\" (UniqueName: \"kubernetes.io/projected/308874bf-36fb-4296-aa6f-8568677e83c4-kube-api-access-lq9vf\") pod \"node-ca-qjrbv\" (UID: \"308874bf-36fb-4296-aa6f-8568677e83c4\") " pod="openshift-image-registry/node-ca-qjrbv" Apr 23 17:53:25.744382 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742815 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-slash\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.744382 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742841 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-multus-socket-dir-parent\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.744382 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742902 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-host-run-k8s-cni-cncf-io\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.744382 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742928 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-sysconfig\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.744382 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742940 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5a131586-128b-4207-ac02-4240d9075bc2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5" Apr 23 
17:53:25.744382 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742950 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-systemd\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.744382 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.742975 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-run\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.744382 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743018 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-multus-cni-dir\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.744382 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743058 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8c323db9-9645-43cb-b997-4e141600d264-registration-dir\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" Apr 23 17:53:25.744382 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743110 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-var-lib-openvswitch\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.745066 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743136 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-node-log\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.745066 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743194 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-systemd\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.745066 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743194 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-sysconfig\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.745066 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743217 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-modprobe-d\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.745066 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743248 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5a131586-128b-4207-ac02-4240d9075bc2-system-cni-dir\") pod \"multus-additional-cni-plugins-2krs5\" (UID: 
\"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5" Apr 23 17:53:25.745066 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743261 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8c323db9-9645-43cb-b997-4e141600d264-registration-dir\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" Apr 23 17:53:25.745066 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743276 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x88qv\" (UniqueName: \"kubernetes.io/projected/8c323db9-9645-43cb-b997-4e141600d264-kube-api-access-x88qv\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" Apr 23 17:53:25.745066 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743220 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-multus-daemon-config\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.745066 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743298 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-host-run-k8s-cni-cncf-io\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.745066 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743344 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/5a131586-128b-4207-ac02-4240d9075bc2-system-cni-dir\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5" Apr 23 17:53:25.745066 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743384 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-run\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.745066 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743414 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-modprobe-d\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.745066 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743423 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-multus-socket-dir-parent\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.745066 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743429 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-run-ovn\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.745066 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743539 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-env-overrides\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.745066 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743568 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-ovn-node-metrics-cert\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.745066 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743608 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb6hg\" (UniqueName: \"kubernetes.io/projected/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-kube-api-access-mb6hg\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.745738 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743633 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-etc-kubernetes\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.745738 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743668 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8c323db9-9645-43cb-b997-4e141600d264-etc-selinux\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" Apr 23 17:53:25.745738 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743693 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-ovnkube-config\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.745738 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743719 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-etc-kubernetes\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.745738 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743708 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5a131586-128b-4207-ac02-4240d9075bc2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5" Apr 23 17:53:25.745738 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743761 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-os-release\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.745738 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743781 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-sysctl-d\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.745738 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743793 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8c323db9-9645-43cb-b997-4e141600d264-etc-selinux\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" Apr 23 17:53:25.745738 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743797 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtn2s\" (UniqueName: \"kubernetes.io/projected/5a131586-128b-4207-ac02-4240d9075bc2-kube-api-access-rtn2s\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5" Apr 23 17:53:25.745738 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743835 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-etc-openvswitch\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.745738 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743862 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-os-release\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.745738 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.743964 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-sysctl-d\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.745738 ip-10-0-142-63 
kubenswrapper[2578]: I0423 17:53:25.745136 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7560deb4-54dc-4f99-a04b-c7e973e8b201-tmp\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.745738 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.745161 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7560deb4-54dc-4f99-a04b-c7e973e8b201-etc-tuned\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.745738 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.745442 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/97a978cb-3849-4c89-bce7-b7b3126e771f-agent-certs\") pod \"konnectivity-agent-442hb\" (UID: \"97a978cb-3849-4c89-bce7-b7b3126e771f\") " pod="kube-system/konnectivity-agent-442hb" Apr 23 17:53:25.751901 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:25.751863 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:53:25.751901 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:25.751883 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:53:25.751901 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:25.751896 2578 projected.go:194] Error preparing data for projected volume kube-api-access-49d2h for pod openshift-network-diagnostics/network-check-target-bztd4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Apr 23 17:53:25.752077 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:25.751949 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h podName:0c49641e-88eb-49d0-b1e0-5408152b701d nodeName:}" failed. No retries permitted until 2026-04-23 17:53:26.251933228 +0000 UTC m=+3.074400945 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-49d2h" (UniqueName: "kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h") pod "network-check-target-bztd4" (UID: "0c49641e-88eb-49d0-b1e0-5408152b701d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:53:25.752986 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.752966 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96bdh\" (UniqueName: \"kubernetes.io/projected/890e39b8-16d9-4ffa-9934-ca657c99daf2-kube-api-access-96bdh\") pod \"node-resolver-mn726\" (UID: \"890e39b8-16d9-4ffa-9934-ca657c99daf2\") " pod="openshift-dns/node-resolver-mn726" Apr 23 17:53:25.754022 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.753991 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdfs7\" (UniqueName: \"kubernetes.io/projected/7560deb4-54dc-4f99-a04b-c7e973e8b201-kube-api-access-cdfs7\") pod \"tuned-t84lj\" (UID: \"7560deb4-54dc-4f99-a04b-c7e973e8b201\") " pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.754465 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.754444 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x88qv\" (UniqueName: \"kubernetes.io/projected/8c323db9-9645-43cb-b997-4e141600d264-kube-api-access-x88qv\") pod \"aws-ebs-csi-driver-node-fvq78\" (UID: \"8c323db9-9645-43cb-b997-4e141600d264\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" Apr 23 17:53:25.755752 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.755709 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nxzm\" (UniqueName: \"kubernetes.io/projected/a6b6e3a3-edb0-41a4-877a-1eed7a82403d-kube-api-access-6nxzm\") pod \"multus-48vg7\" (UID: \"a6b6e3a3-edb0-41a4-877a-1eed7a82403d\") " pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.756103 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.756067 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtn2s\" (UniqueName: \"kubernetes.io/projected/5a131586-128b-4207-ac02-4240d9075bc2-kube-api-access-rtn2s\") pod \"multus-additional-cni-plugins-2krs5\" (UID: \"5a131586-128b-4207-ac02-4240d9075bc2\") " pod="openshift-multus/multus-additional-cni-plugins-2krs5" Apr 23 17:53:25.756591 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.756568 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c64dq\" (UniqueName: \"kubernetes.io/projected/e70550da-839d-4462-b368-c0139f793c15-kube-api-access-c64dq\") pod \"network-metrics-daemon-mqfsb\" (UID: \"e70550da-839d-4462-b368-c0139f793c15\") " pod="openshift-multus/network-metrics-daemon-mqfsb" Apr 23 17:53:25.756702 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.756621 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq9vf\" (UniqueName: \"kubernetes.io/projected/308874bf-36fb-4296-aa6f-8568677e83c4-kube-api-access-lq9vf\") pod \"node-ca-qjrbv\" (UID: \"308874bf-36fb-4296-aa6f-8568677e83c4\") " pod="openshift-image-registry/node-ca-qjrbv" Apr 23 17:53:25.844730 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.844685 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-etc-openvswitch\") pod 
\"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.844730 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.844723 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2467946b-effa-4a29-a822-8670defce032-host-slash\") pod \"iptables-alerter-xvbsl\" (UID: \"2467946b-effa-4a29-a822-8670defce032\") " pod="openshift-network-operator/iptables-alerter-xvbsl" Apr 23 17:53:25.844945 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.844764 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-log-socket\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.844945 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.844786 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vnkv\" (UniqueName: \"kubernetes.io/projected/2467946b-effa-4a29-a822-8670defce032-kube-api-access-6vnkv\") pod \"iptables-alerter-xvbsl\" (UID: \"2467946b-effa-4a29-a822-8670defce032\") " pod="openshift-network-operator/iptables-alerter-xvbsl" Apr 23 17:53:25.844945 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.844792 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-etc-openvswitch\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.844945 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.844804 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-run-netns\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.844945 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.844828 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-run-openvswitch\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.844945 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.844843 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2467946b-effa-4a29-a822-8670defce032-host-slash\") pod \"iptables-alerter-xvbsl\" (UID: \"2467946b-effa-4a29-a822-8670defce032\") " pod="openshift-network-operator/iptables-alerter-xvbsl" Apr 23 17:53:25.844945 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.844855 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.844945 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.844864 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-log-socket\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.844945 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.844883 2578 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-ovnkube-script-lib\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.844945 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.844896 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-run-netns\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.844945 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.844919 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-kubelet\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.845531 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.844953 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-kubelet\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.845531 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.844959 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-run-systemd\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.845531 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.844970 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.845531 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.844987 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-cni-netd\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.845531 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845012 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-run-openvswitch\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.845531 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845032 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-run-ovn-kubernetes\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.845531 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845051 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-run-systemd\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.845531 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845059 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-cni-bin\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.845531 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845072 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-cni-netd\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.845531 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845111 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-run-ovn-kubernetes\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.845531 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845123 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-cni-bin\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.845531 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845126 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2467946b-effa-4a29-a822-8670defce032-iptables-alerter-script\") pod \"iptables-alerter-xvbsl\" (UID: \"2467946b-effa-4a29-a822-8670defce032\") " pod="openshift-network-operator/iptables-alerter-xvbsl" Apr 23 17:53:25.845531 ip-10-0-142-63 kubenswrapper[2578]: 
I0423 17:53:25.845154 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-systemd-units\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.845531 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845185 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-slash\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.845531 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845216 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-var-lib-openvswitch\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.845531 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845240 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-node-log\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.845531 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845267 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-run-ovn\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.846284 ip-10-0-142-63 kubenswrapper[2578]: I0423 
17:53:25.845294 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-env-overrides\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.846284 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845318 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-ovn-node-metrics-cert\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.846284 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845323 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-host-slash\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.846284 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845265 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-systemd-units\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.846284 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845346 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mb6hg\" (UniqueName: \"kubernetes.io/projected/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-kube-api-access-mb6hg\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.846284 ip-10-0-142-63 
kubenswrapper[2578]: I0423 17:53:25.845384 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-ovnkube-config\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.846284 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845295 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-var-lib-openvswitch\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.846284 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845469 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-node-log\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.846284 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845513 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-run-ovn\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.846284 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845520 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-ovnkube-script-lib\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.846284 ip-10-0-142-63 kubenswrapper[2578]: I0423 
17:53:25.845687 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2467946b-effa-4a29-a822-8670defce032-iptables-alerter-script\") pod \"iptables-alerter-xvbsl\" (UID: \"2467946b-effa-4a29-a822-8670defce032\") " pod="openshift-network-operator/iptables-alerter-xvbsl" Apr 23 17:53:25.846284 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.845813 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-env-overrides\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.846284 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.846143 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-ovnkube-config\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.847872 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.847847 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-ovn-node-metrics-cert\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.853818 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.853790 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vnkv\" (UniqueName: \"kubernetes.io/projected/2467946b-effa-4a29-a822-8670defce032-kube-api-access-6vnkv\") pod \"iptables-alerter-xvbsl\" (UID: \"2467946b-effa-4a29-a822-8670defce032\") " pod="openshift-network-operator/iptables-alerter-xvbsl" Apr 23 17:53:25.854226 
ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.854203 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb6hg\" (UniqueName: \"kubernetes.io/projected/2fb5e8ca-0609-4dd5-ac79-69c12ad152a3-kube-api-access-mb6hg\") pod \"ovnkube-node-x2gvq\" (UID: \"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3\") " pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:25.928318 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.928238 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2krs5" Apr 23 17:53:25.936132 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.936106 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qjrbv" Apr 23 17:53:25.943850 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.943828 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" Apr 23 17:53:25.949625 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.949606 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-48vg7" Apr 23 17:53:25.956016 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.955996 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mn726" Apr 23 17:53:25.961585 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.961564 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-442hb" Apr 23 17:53:25.969185 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.969156 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-t84lj" Apr 23 17:53:25.976741 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.976714 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-xvbsl" Apr 23 17:53:25.983387 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:25.983368 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:26.249042 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:26.248844 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs\") pod \"network-metrics-daemon-mqfsb\" (UID: \"e70550da-839d-4462-b368-c0139f793c15\") " pod="openshift-multus/network-metrics-daemon-mqfsb" Apr 23 17:53:26.249042 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:26.248991 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:53:26.249222 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:26.249075 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs podName:e70550da-839d-4462-b368-c0139f793c15 nodeName:}" failed. No retries permitted until 2026-04-23 17:53:27.249048572 +0000 UTC m=+4.071516306 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs") pod "network-metrics-daemon-mqfsb" (UID: "e70550da-839d-4462-b368-c0139f793c15") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:53:26.252660 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:26.252639 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a131586_128b_4207_ac02_4240d9075bc2.slice/crio-55e1cee898414c33b52b7d62b88735c0427286aa8a07244c8c46001f5aa97c01 WatchSource:0}: Error finding container 55e1cee898414c33b52b7d62b88735c0427286aa8a07244c8c46001f5aa97c01: Status 404 returned error can't find the container with id 55e1cee898414c33b52b7d62b88735c0427286aa8a07244c8c46001f5aa97c01 Apr 23 17:53:26.254465 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:26.254322 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97a978cb_3849_4c89_bce7_b7b3126e771f.slice/crio-79081b05338f2d34e77872d98836bd0233a73c5aff1c6e19241f100319f98dcc WatchSource:0}: Error finding container 79081b05338f2d34e77872d98836bd0233a73c5aff1c6e19241f100319f98dcc: Status 404 returned error can't find the container with id 79081b05338f2d34e77872d98836bd0233a73c5aff1c6e19241f100319f98dcc Apr 23 17:53:26.256509 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:26.256481 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6b6e3a3_edb0_41a4_877a_1eed7a82403d.slice/crio-3d1fd90191ae3aad4ba6538c55c9fbfea940303296e2c799542a5b98e14dbb85 WatchSource:0}: Error finding container 3d1fd90191ae3aad4ba6538c55c9fbfea940303296e2c799542a5b98e14dbb85: Status 404 returned error can't find the container with id 3d1fd90191ae3aad4ba6538c55c9fbfea940303296e2c799542a5b98e14dbb85 Apr 23 17:53:26.258626 
ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:26.258603 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2467946b_effa_4a29_a822_8670defce032.slice/crio-b98b47be3d8be9453c65e6866a4f74c176ed2b06adecc4cb64bbb6a359b7b809 WatchSource:0}: Error finding container b98b47be3d8be9453c65e6866a4f74c176ed2b06adecc4cb64bbb6a359b7b809: Status 404 returned error can't find the container with id b98b47be3d8be9453c65e6866a4f74c176ed2b06adecc4cb64bbb6a359b7b809 Apr 23 17:53:26.259346 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:26.259284 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7560deb4_54dc_4f99_a04b_c7e973e8b201.slice/crio-41f18160b8886052a35c76dba2910e32e89ca5b051c0d28e1690f3fba1848de3 WatchSource:0}: Error finding container 41f18160b8886052a35c76dba2910e32e89ca5b051c0d28e1690f3fba1848de3: Status 404 returned error can't find the container with id 41f18160b8886052a35c76dba2910e32e89ca5b051c0d28e1690f3fba1848de3 Apr 23 17:53:26.260418 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:26.260393 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod308874bf_36fb_4296_aa6f_8568677e83c4.slice/crio-61750eb35221fa737ead13a9eb1f23e6c522dd567c5dc5d7955e537da52616f7 WatchSource:0}: Error finding container 61750eb35221fa737ead13a9eb1f23e6c522dd567c5dc5d7955e537da52616f7: Status 404 returned error can't find the container with id 61750eb35221fa737ead13a9eb1f23e6c522dd567c5dc5d7955e537da52616f7 Apr 23 17:53:26.261264 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:26.261244 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fb5e8ca_0609_4dd5_ac79_69c12ad152a3.slice/crio-080a6dd41d054ff67da86d5ecef203043f7a82133a751dc20dfbfd8329fe7c2f WatchSource:0}: Error 
finding container 080a6dd41d054ff67da86d5ecef203043f7a82133a751dc20dfbfd8329fe7c2f: Status 404 returned error can't find the container with id 080a6dd41d054ff67da86d5ecef203043f7a82133a751dc20dfbfd8329fe7c2f Apr 23 17:53:26.262635 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:26.262616 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c323db9_9645_43cb_b997_4e141600d264.slice/crio-b6cc2570a26d73fc33396e75182d8df2c76177bfe294aa7b988440e0df05bf14 WatchSource:0}: Error finding container b6cc2570a26d73fc33396e75182d8df2c76177bfe294aa7b988440e0df05bf14: Status 404 returned error can't find the container with id b6cc2570a26d73fc33396e75182d8df2c76177bfe294aa7b988440e0df05bf14 Apr 23 17:53:26.263325 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:26.263288 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod890e39b8_16d9_4ffa_9934_ca657c99daf2.slice/crio-6d2bc75c352b6908b03b7618e46d558e91ac5477110b33e1fa1664d4a718c97a WatchSource:0}: Error finding container 6d2bc75c352b6908b03b7618e46d558e91ac5477110b33e1fa1664d4a718c97a: Status 404 returned error can't find the container with id 6d2bc75c352b6908b03b7618e46d558e91ac5477110b33e1fa1664d4a718c97a Apr 23 17:53:26.349602 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:26.349575 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49d2h\" (UniqueName: \"kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h\") pod \"network-check-target-bztd4\" (UID: \"0c49641e-88eb-49d0-b1e0-5408152b701d\") " pod="openshift-network-diagnostics/network-check-target-bztd4" Apr 23 17:53:26.349699 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:26.349681 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered Apr 23 17:53:26.349755 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:26.349702 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:53:26.349755 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:26.349712 2578 projected.go:194] Error preparing data for projected volume kube-api-access-49d2h for pod openshift-network-diagnostics/network-check-target-bztd4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:53:26.349755 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:26.349753 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h podName:0c49641e-88eb-49d0-b1e0-5408152b701d nodeName:}" failed. No retries permitted until 2026-04-23 17:53:27.349739898 +0000 UTC m=+4.172207612 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-49d2h" (UniqueName: "kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h") pod "network-check-target-bztd4" (UID: "0c49641e-88eb-49d0-b1e0-5408152b701d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:53:26.668178 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:26.668076 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 17:48:24 +0000 UTC" deadline="2027-11-04 19:38:12.981010255 +0000 UTC"
Apr 23 17:53:26.668178 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:26.668133 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13441h44m46.312881631s"
Apr 23 17:53:26.764711 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:26.763976 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-63.ec2.internal" event={"ID":"fb1b5e25e4662b1f0eb7136557c5a4df","Type":"ContainerStarted","Data":"1d28ee53ea6c487763ed49ebd11e91e172595c75ffa0630ea855877958b66ef7"}
Apr 23 17:53:26.767331 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:26.767272 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mn726" event={"ID":"890e39b8-16d9-4ffa-9934-ca657c99daf2","Type":"ContainerStarted","Data":"6d2bc75c352b6908b03b7618e46d558e91ac5477110b33e1fa1664d4a718c97a"}
Apr 23 17:53:26.769725 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:26.769681 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" event={"ID":"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3","Type":"ContainerStarted","Data":"080a6dd41d054ff67da86d5ecef203043f7a82133a751dc20dfbfd8329fe7c2f"}
Apr 23 17:53:26.771682 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:26.771623 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qjrbv" event={"ID":"308874bf-36fb-4296-aa6f-8568677e83c4","Type":"ContainerStarted","Data":"61750eb35221fa737ead13a9eb1f23e6c522dd567c5dc5d7955e537da52616f7"}
Apr 23 17:53:26.774498 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:26.774449 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-442hb" event={"ID":"97a978cb-3849-4c89-bce7-b7b3126e771f","Type":"ContainerStarted","Data":"79081b05338f2d34e77872d98836bd0233a73c5aff1c6e19241f100319f98dcc"}
Apr 23 17:53:26.791045 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:26.791017 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-t84lj" event={"ID":"7560deb4-54dc-4f99-a04b-c7e973e8b201","Type":"ContainerStarted","Data":"41f18160b8886052a35c76dba2910e32e89ca5b051c0d28e1690f3fba1848de3"}
Apr 23 17:53:26.795013 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:26.794831 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" event={"ID":"8c323db9-9645-43cb-b997-4e141600d264","Type":"ContainerStarted","Data":"b6cc2570a26d73fc33396e75182d8df2c76177bfe294aa7b988440e0df05bf14"}
Apr 23 17:53:26.799928 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:26.799905 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xvbsl" event={"ID":"2467946b-effa-4a29-a822-8670defce032","Type":"ContainerStarted","Data":"b98b47be3d8be9453c65e6866a4f74c176ed2b06adecc4cb64bbb6a359b7b809"}
Apr 23 17:53:26.806813 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:26.806790 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-48vg7" event={"ID":"a6b6e3a3-edb0-41a4-877a-1eed7a82403d","Type":"ContainerStarted","Data":"3d1fd90191ae3aad4ba6538c55c9fbfea940303296e2c799542a5b98e14dbb85"}
Apr 23 17:53:26.808610 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:26.808576 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2krs5" event={"ID":"5a131586-128b-4207-ac02-4240d9075bc2","Type":"ContainerStarted","Data":"55e1cee898414c33b52b7d62b88735c0427286aa8a07244c8c46001f5aa97c01"}
Apr 23 17:53:27.268103 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:27.263104 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs\") pod \"network-metrics-daemon-mqfsb\" (UID: \"e70550da-839d-4462-b368-c0139f793c15\") " pod="openshift-multus/network-metrics-daemon-mqfsb"
Apr 23 17:53:27.268103 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:27.265362 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:53:27.268103 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:27.265453 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs podName:e70550da-839d-4462-b368-c0139f793c15 nodeName:}" failed. No retries permitted until 2026-04-23 17:53:29.265430116 +0000 UTC m=+6.087897830 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs") pod "network-metrics-daemon-mqfsb" (UID: "e70550da-839d-4462-b368-c0139f793c15") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:53:27.311788 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:27.311725 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 17:53:27.364560 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:27.364526 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49d2h\" (UniqueName: \"kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h\") pod \"network-check-target-bztd4\" (UID: \"0c49641e-88eb-49d0-b1e0-5408152b701d\") " pod="openshift-network-diagnostics/network-check-target-bztd4"
Apr 23 17:53:27.364723 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:27.364676 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:53:27.364723 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:27.364698 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:53:27.364723 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:27.364709 2578 projected.go:194] Error preparing data for projected volume kube-api-access-49d2h for pod openshift-network-diagnostics/network-check-target-bztd4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:53:27.364891 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:27.364765 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h podName:0c49641e-88eb-49d0-b1e0-5408152b701d nodeName:}" failed. No retries permitted until 2026-04-23 17:53:29.364748374 +0000 UTC m=+6.187216106 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-49d2h" (UniqueName: "kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h") pod "network-check-target-bztd4" (UID: "0c49641e-88eb-49d0-b1e0-5408152b701d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:53:27.752716 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:27.752222 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bztd4"
Apr 23 17:53:27.752716 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:27.752330 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqfsb"
Apr 23 17:53:27.752716 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:27.752333 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bztd4" podUID="0c49641e-88eb-49d0-b1e0-5408152b701d"
Apr 23 17:53:27.752716 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:27.752441 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqfsb" podUID="e70550da-839d-4462-b368-c0139f793c15"
Apr 23 17:53:27.818493 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:27.817299 2578 generic.go:358] "Generic (PLEG): container finished" podID="7e0da96a27e0b9059a934c06d8e50b1f" containerID="cf7450efea2583e3363e697e3b1f71c1039ef287be35768dc3d82cf01c665ef1" exitCode=0
Apr 23 17:53:27.818493 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:27.818211 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal" event={"ID":"7e0da96a27e0b9059a934c06d8e50b1f","Type":"ContainerDied","Data":"cf7450efea2583e3363e697e3b1f71c1039ef287be35768dc3d82cf01c665ef1"}
Apr 23 17:53:27.835828 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:27.835782 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-142-63.ec2.internal" podStartSLOduration=2.8357418 podStartE2EDuration="2.8357418s" podCreationTimestamp="2026-04-23 17:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:53:26.779013114 +0000 UTC m=+3.601480851" watchObservedRunningTime="2026-04-23 17:53:27.8357418 +0000 UTC m=+4.658209537"
Apr 23 17:53:28.826171 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:28.825520 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal" event={"ID":"7e0da96a27e0b9059a934c06d8e50b1f","Type":"ContainerStarted","Data":"9382a1748769f831161166c0bfd27099391f69404e31584a5e14269fbb76c20b"}
Apr 23 17:53:29.280128 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:29.280023 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs\") pod \"network-metrics-daemon-mqfsb\" (UID: \"e70550da-839d-4462-b368-c0139f793c15\") " pod="openshift-multus/network-metrics-daemon-mqfsb"
Apr 23 17:53:29.280297 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:29.280189 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:53:29.280297 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:29.280261 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs podName:e70550da-839d-4462-b368-c0139f793c15 nodeName:}" failed. No retries permitted until 2026-04-23 17:53:33.280242356 +0000 UTC m=+10.102710081 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs") pod "network-metrics-daemon-mqfsb" (UID: "e70550da-839d-4462-b368-c0139f793c15") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:53:29.380856 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:29.380807 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49d2h\" (UniqueName: \"kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h\") pod \"network-check-target-bztd4\" (UID: \"0c49641e-88eb-49d0-b1e0-5408152b701d\") " pod="openshift-network-diagnostics/network-check-target-bztd4"
Apr 23 17:53:29.381032 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:29.381012 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:53:29.381132 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:29.381033 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:53:29.381132 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:29.381046 2578 projected.go:194] Error preparing data for projected volume kube-api-access-49d2h for pod openshift-network-diagnostics/network-check-target-bztd4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:53:29.381132 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:29.381121 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h podName:0c49641e-88eb-49d0-b1e0-5408152b701d nodeName:}" failed. No retries permitted until 2026-04-23 17:53:33.381101454 +0000 UTC m=+10.203569186 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-49d2h" (UniqueName: "kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h") pod "network-check-target-bztd4" (UID: "0c49641e-88eb-49d0-b1e0-5408152b701d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:53:29.749555 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:29.749468 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bztd4"
Apr 23 17:53:29.749722 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:29.749475 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqfsb"
Apr 23 17:53:29.749722 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:29.749600 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bztd4" podUID="0c49641e-88eb-49d0-b1e0-5408152b701d"
Apr 23 17:53:29.749722 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:29.749678 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqfsb" podUID="e70550da-839d-4462-b368-c0139f793c15"
Apr 23 17:53:31.749410 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:31.749374 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bztd4"
Apr 23 17:53:31.749855 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:31.749420 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqfsb"
Apr 23 17:53:31.749855 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:31.749510 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bztd4" podUID="0c49641e-88eb-49d0-b1e0-5408152b701d"
Apr 23 17:53:31.749855 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:31.749634 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqfsb" podUID="e70550da-839d-4462-b368-c0139f793c15"
Apr 23 17:53:33.248003 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:33.247945 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-142-63.ec2.internal" podStartSLOduration=8.247928098 podStartE2EDuration="8.247928098s" podCreationTimestamp="2026-04-23 17:53:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:53:28.841869483 +0000 UTC m=+5.664337223" watchObservedRunningTime="2026-04-23 17:53:33.247928098 +0000 UTC m=+10.070395833"
Apr 23 17:53:33.248619 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:33.248590 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-dr7p4"]
Apr 23 17:53:33.251727 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:33.251604 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dr7p4"
Apr 23 17:53:33.251727 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:33.251697 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dr7p4" podUID="16856cbe-119d-49c7-aa8f-a7d0f4002555"
Apr 23 17:53:33.313018 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:33.312985 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs\") pod \"network-metrics-daemon-mqfsb\" (UID: \"e70550da-839d-4462-b368-c0139f793c15\") " pod="openshift-multus/network-metrics-daemon-mqfsb"
Apr 23 17:53:33.313208 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:33.313042 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/16856cbe-119d-49c7-aa8f-a7d0f4002555-kubelet-config\") pod \"global-pull-secret-syncer-dr7p4\" (UID: \"16856cbe-119d-49c7-aa8f-a7d0f4002555\") " pod="kube-system/global-pull-secret-syncer-dr7p4"
Apr 23 17:53:33.313208 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:33.313110 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret\") pod \"global-pull-secret-syncer-dr7p4\" (UID: \"16856cbe-119d-49c7-aa8f-a7d0f4002555\") " pod="kube-system/global-pull-secret-syncer-dr7p4"
Apr 23 17:53:33.313208 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:33.313151 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/16856cbe-119d-49c7-aa8f-a7d0f4002555-dbus\") pod \"global-pull-secret-syncer-dr7p4\" (UID: \"16856cbe-119d-49c7-aa8f-a7d0f4002555\") " pod="kube-system/global-pull-secret-syncer-dr7p4"
Apr 23 17:53:33.313366 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:33.313280 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:53:33.313366 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:33.313332 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs podName:e70550da-839d-4462-b368-c0139f793c15 nodeName:}" failed. No retries permitted until 2026-04-23 17:53:41.31331367 +0000 UTC m=+18.135781387 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs") pod "network-metrics-daemon-mqfsb" (UID: "e70550da-839d-4462-b368-c0139f793c15") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 17:53:33.413904 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:33.413630 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/16856cbe-119d-49c7-aa8f-a7d0f4002555-kubelet-config\") pod \"global-pull-secret-syncer-dr7p4\" (UID: \"16856cbe-119d-49c7-aa8f-a7d0f4002555\") " pod="kube-system/global-pull-secret-syncer-dr7p4"
Apr 23 17:53:33.413904 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:33.413689 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49d2h\" (UniqueName: \"kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h\") pod \"network-check-target-bztd4\" (UID: \"0c49641e-88eb-49d0-b1e0-5408152b701d\") " pod="openshift-network-diagnostics/network-check-target-bztd4"
Apr 23 17:53:33.413904 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:33.413720 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret\") pod \"global-pull-secret-syncer-dr7p4\" (UID: \"16856cbe-119d-49c7-aa8f-a7d0f4002555\") " pod="kube-system/global-pull-secret-syncer-dr7p4"
Apr 23 17:53:33.413904 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:33.413751 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/16856cbe-119d-49c7-aa8f-a7d0f4002555-dbus\") pod \"global-pull-secret-syncer-dr7p4\" (UID: \"16856cbe-119d-49c7-aa8f-a7d0f4002555\") " pod="kube-system/global-pull-secret-syncer-dr7p4"
Apr 23 17:53:33.413904 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:33.413875 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/16856cbe-119d-49c7-aa8f-a7d0f4002555-kubelet-config\") pod \"global-pull-secret-syncer-dr7p4\" (UID: \"16856cbe-119d-49c7-aa8f-a7d0f4002555\") " pod="kube-system/global-pull-secret-syncer-dr7p4"
Apr 23 17:53:33.414316 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:33.413926 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 17:53:33.414316 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:33.413953 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 17:53:33.414316 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:33.413967 2578 projected.go:194] Error preparing data for projected volume kube-api-access-49d2h for pod openshift-network-diagnostics/network-check-target-bztd4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:53:33.414316 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:33.414021 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h podName:0c49641e-88eb-49d0-b1e0-5408152b701d nodeName:}" failed. No retries permitted until 2026-04-23 17:53:41.414002441 +0000 UTC m=+18.236470157 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-49d2h" (UniqueName: "kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h") pod "network-check-target-bztd4" (UID: "0c49641e-88eb-49d0-b1e0-5408152b701d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 17:53:33.414316 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:33.414027 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/16856cbe-119d-49c7-aa8f-a7d0f4002555-dbus\") pod \"global-pull-secret-syncer-dr7p4\" (UID: \"16856cbe-119d-49c7-aa8f-a7d0f4002555\") " pod="kube-system/global-pull-secret-syncer-dr7p4"
Apr 23 17:53:33.414316 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:33.414150 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 17:53:33.414316 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:33.414201 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret podName:16856cbe-119d-49c7-aa8f-a7d0f4002555 nodeName:}" failed. No retries permitted until 2026-04-23 17:53:33.914183313 +0000 UTC m=+10.736651027 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret") pod "global-pull-secret-syncer-dr7p4" (UID: "16856cbe-119d-49c7-aa8f-a7d0f4002555") : object "kube-system"/"original-pull-secret" not registered
Apr 23 17:53:33.750507 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:33.750474 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bztd4"
Apr 23 17:53:33.750686 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:33.750579 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bztd4" podUID="0c49641e-88eb-49d0-b1e0-5408152b701d"
Apr 23 17:53:33.750830 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:33.750802 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqfsb"
Apr 23 17:53:33.750970 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:33.750909 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqfsb" podUID="e70550da-839d-4462-b368-c0139f793c15"
Apr 23 17:53:33.917431 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:33.917299 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret\") pod \"global-pull-secret-syncer-dr7p4\" (UID: \"16856cbe-119d-49c7-aa8f-a7d0f4002555\") " pod="kube-system/global-pull-secret-syncer-dr7p4"
Apr 23 17:53:33.917602 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:33.917469 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 17:53:33.917602 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:33.917545 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret podName:16856cbe-119d-49c7-aa8f-a7d0f4002555 nodeName:}" failed. No retries permitted until 2026-04-23 17:53:34.917524221 +0000 UTC m=+11.739991941 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret") pod "global-pull-secret-syncer-dr7p4" (UID: "16856cbe-119d-49c7-aa8f-a7d0f4002555") : object "kube-system"/"original-pull-secret" not registered
Apr 23 17:53:34.749934 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:34.749448 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dr7p4"
Apr 23 17:53:34.749934 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:34.749578 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dr7p4" podUID="16856cbe-119d-49c7-aa8f-a7d0f4002555"
Apr 23 17:53:34.924835 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:34.924800 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret\") pod \"global-pull-secret-syncer-dr7p4\" (UID: \"16856cbe-119d-49c7-aa8f-a7d0f4002555\") " pod="kube-system/global-pull-secret-syncer-dr7p4"
Apr 23 17:53:34.925072 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:34.924941 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 17:53:34.925072 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:34.925007 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret podName:16856cbe-119d-49c7-aa8f-a7d0f4002555 nodeName:}" failed. No retries permitted until 2026-04-23 17:53:36.924988942 +0000 UTC m=+13.747456673 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret") pod "global-pull-secret-syncer-dr7p4" (UID: "16856cbe-119d-49c7-aa8f-a7d0f4002555") : object "kube-system"/"original-pull-secret" not registered
Apr 23 17:53:35.749335 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:35.749299 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bztd4"
Apr 23 17:53:35.749511 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:35.749316 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqfsb"
Apr 23 17:53:35.749511 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:35.749411 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bztd4" podUID="0c49641e-88eb-49d0-b1e0-5408152b701d"
Apr 23 17:53:35.749511 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:35.749478 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqfsb" podUID="e70550da-839d-4462-b368-c0139f793c15"
Apr 23 17:53:36.749434 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:36.749398 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dr7p4"
Apr 23 17:53:36.749894 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:36.749527 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dr7p4" podUID="16856cbe-119d-49c7-aa8f-a7d0f4002555"
Apr 23 17:53:36.939543 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:36.939503 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret\") pod \"global-pull-secret-syncer-dr7p4\" (UID: \"16856cbe-119d-49c7-aa8f-a7d0f4002555\") " pod="kube-system/global-pull-secret-syncer-dr7p4"
Apr 23 17:53:36.939706 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:36.939658 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 17:53:36.939761 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:36.939724 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret podName:16856cbe-119d-49c7-aa8f-a7d0f4002555 nodeName:}" failed. No retries permitted until 2026-04-23 17:53:40.939710058 +0000 UTC m=+17.762177775 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret") pod "global-pull-secret-syncer-dr7p4" (UID: "16856cbe-119d-49c7-aa8f-a7d0f4002555") : object "kube-system"/"original-pull-secret" not registered
Apr 23 17:53:37.749813 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:37.749776 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqfsb"
Apr 23 17:53:37.750310 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:37.749776 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bztd4"
Apr 23 17:53:37.750310 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:37.749889 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqfsb" podUID="e70550da-839d-4462-b368-c0139f793c15"
Apr 23 17:53:37.750310 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:37.749981 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bztd4" podUID="0c49641e-88eb-49d0-b1e0-5408152b701d"
Apr 23 17:53:38.749595 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:38.749559 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dr7p4"
Apr 23 17:53:38.749773 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:38.749697 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dr7p4" podUID="16856cbe-119d-49c7-aa8f-a7d0f4002555"
Apr 23 17:53:39.749304 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:39.749211 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bztd4"
Apr 23 17:53:39.749304 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:39.749248 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqfsb"
Apr 23 17:53:39.749760 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:39.749345 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bztd4" podUID="0c49641e-88eb-49d0-b1e0-5408152b701d"
Apr 23 17:53:39.749760 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:39.749536 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqfsb" podUID="e70550da-839d-4462-b368-c0139f793c15"
Apr 23 17:53:40.749583 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:40.749549 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dr7p4"
Apr 23 17:53:40.750036 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:40.749672 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dr7p4" podUID="16856cbe-119d-49c7-aa8f-a7d0f4002555"
Apr 23 17:53:40.969081 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:40.969040 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret\") pod \"global-pull-secret-syncer-dr7p4\" (UID: \"16856cbe-119d-49c7-aa8f-a7d0f4002555\") " pod="kube-system/global-pull-secret-syncer-dr7p4"
Apr 23 17:53:40.969272 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:40.969188 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 17:53:40.969272 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:40.969264 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret podName:16856cbe-119d-49c7-aa8f-a7d0f4002555 nodeName:}" failed. No retries permitted until 2026-04-23 17:53:48.969241377 +0000 UTC m=+25.791709091 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret") pod "global-pull-secret-syncer-dr7p4" (UID: "16856cbe-119d-49c7-aa8f-a7d0f4002555") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:53:41.373017 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:41.372978 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs\") pod \"network-metrics-daemon-mqfsb\" (UID: \"e70550da-839d-4462-b368-c0139f793c15\") " pod="openshift-multus/network-metrics-daemon-mqfsb" Apr 23 17:53:41.373243 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:41.373125 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:53:41.373243 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:41.373196 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs podName:e70550da-839d-4462-b368-c0139f793c15 nodeName:}" failed. No retries permitted until 2026-04-23 17:53:57.373176953 +0000 UTC m=+34.195644670 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs") pod "network-metrics-daemon-mqfsb" (UID: "e70550da-839d-4462-b368-c0139f793c15") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 17:53:41.474167 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:41.474119 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49d2h\" (UniqueName: \"kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h\") pod \"network-check-target-bztd4\" (UID: \"0c49641e-88eb-49d0-b1e0-5408152b701d\") " pod="openshift-network-diagnostics/network-check-target-bztd4" Apr 23 17:53:41.474327 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:41.474292 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 17:53:41.474327 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:41.474319 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 17:53:41.474327 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:41.474330 2578 projected.go:194] Error preparing data for projected volume kube-api-access-49d2h for pod openshift-network-diagnostics/network-check-target-bztd4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:53:41.474480 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:41.474390 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h podName:0c49641e-88eb-49d0-b1e0-5408152b701d nodeName:}" failed. 
No retries permitted until 2026-04-23 17:53:57.474369881 +0000 UTC m=+34.296837600 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-49d2h" (UniqueName: "kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h") pod "network-check-target-bztd4" (UID: "0c49641e-88eb-49d0-b1e0-5408152b701d") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 17:53:41.749188 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:41.749080 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bztd4" Apr 23 17:53:41.749188 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:41.749141 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqfsb" Apr 23 17:53:41.749390 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:41.749232 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bztd4" podUID="0c49641e-88eb-49d0-b1e0-5408152b701d" Apr 23 17:53:41.749390 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:41.749370 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mqfsb" podUID="e70550da-839d-4462-b368-c0139f793c15" Apr 23 17:53:42.749066 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:42.749035 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dr7p4" Apr 23 17:53:42.749555 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:42.749169 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dr7p4" podUID="16856cbe-119d-49c7-aa8f-a7d0f4002555" Apr 23 17:53:43.749578 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:43.749421 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bztd4" Apr 23 17:53:43.750008 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:43.749486 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqfsb" Apr 23 17:53:43.750008 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:43.749636 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bztd4" podUID="0c49641e-88eb-49d0-b1e0-5408152b701d" Apr 23 17:53:43.750008 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:43.749722 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqfsb" podUID="e70550da-839d-4462-b368-c0139f793c15" Apr 23 17:53:43.852629 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:43.852606 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mn726" event={"ID":"890e39b8-16d9-4ffa-9934-ca657c99daf2","Type":"ContainerStarted","Data":"7a5cb84eafa23c5ef171b17a5eb644662ce598b08e0dc047e576e7db4be8a41e"} Apr 23 17:53:43.853925 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:43.853901 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" event={"ID":"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3","Type":"ContainerStarted","Data":"817bc7988bf72c72082cdc8ea4c1790e9425f5f6e74957b604b205c06ba74d10"} Apr 23 17:53:43.855144 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:43.855121 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qjrbv" event={"ID":"308874bf-36fb-4296-aa6f-8568677e83c4","Type":"ContainerStarted","Data":"ead808703786fbddd0841ac7e3a3f3b42b6a44fb323d63c590f04c868113629c"} Apr 23 17:53:43.856240 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:43.856212 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-442hb" event={"ID":"97a978cb-3849-4c89-bce7-b7b3126e771f","Type":"ContainerStarted","Data":"719fe3e5aef88830b38530156e663da97fb0b4eeee91b4a0c4061be534dbc9b7"} Apr 23 17:53:43.857351 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:43.857332 
2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-t84lj" event={"ID":"7560deb4-54dc-4f99-a04b-c7e973e8b201","Type":"ContainerStarted","Data":"ee686d13292ffc6365012d41f06500abbfb874f6b0a5d51332ebddca3109f774"} Apr 23 17:53:43.858484 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:43.858465 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" event={"ID":"8c323db9-9645-43cb-b997-4e141600d264","Type":"ContainerStarted","Data":"33de9dbe92bd7727f34632a77e0b26d8a7b86d2452b7c10e14d0fc29197b3632"} Apr 23 17:53:43.859502 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:43.859481 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-48vg7" event={"ID":"a6b6e3a3-edb0-41a4-877a-1eed7a82403d","Type":"ContainerStarted","Data":"59b14fa08df8f8f1f0987dbd54fbc174435665fc27f77618056312c401765443"} Apr 23 17:53:43.860741 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:43.860713 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2krs5" event={"ID":"5a131586-128b-4207-ac02-4240d9075bc2","Type":"ContainerStarted","Data":"641b3c5eb713425db247faf2a2cd5c561e232412183f2cc91f5656836c416936"} Apr 23 17:53:43.868177 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:43.868143 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mn726" podStartSLOduration=2.7451951660000002 podStartE2EDuration="19.868132681s" podCreationTimestamp="2026-04-23 17:53:24 +0000 UTC" firstStartedPulling="2026-04-23 17:53:26.265417387 +0000 UTC m=+3.087885101" lastFinishedPulling="2026-04-23 17:53:43.388354901 +0000 UTC m=+20.210822616" observedRunningTime="2026-04-23 17:53:43.867740597 +0000 UTC m=+20.690208345" watchObservedRunningTime="2026-04-23 17:53:43.868132681 +0000 UTC m=+20.690600412" Apr 23 17:53:43.888714 ip-10-0-142-63 kubenswrapper[2578]: I0423 
17:53:43.888678 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-t84lj" podStartSLOduration=2.759526879 podStartE2EDuration="19.888670524s" podCreationTimestamp="2026-04-23 17:53:24 +0000 UTC" firstStartedPulling="2026-04-23 17:53:26.261154714 +0000 UTC m=+3.083622435" lastFinishedPulling="2026-04-23 17:53:43.39029835 +0000 UTC m=+20.212766080" observedRunningTime="2026-04-23 17:53:43.888244055 +0000 UTC m=+20.710711800" watchObservedRunningTime="2026-04-23 17:53:43.888670524 +0000 UTC m=+20.711138258" Apr 23 17:53:43.922577 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:43.922534 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qjrbv" podStartSLOduration=8.631756505 podStartE2EDuration="20.922519869s" podCreationTimestamp="2026-04-23 17:53:23 +0000 UTC" firstStartedPulling="2026-04-23 17:53:26.262414656 +0000 UTC m=+3.084882369" lastFinishedPulling="2026-04-23 17:53:38.553178019 +0000 UTC m=+15.375645733" observedRunningTime="2026-04-23 17:53:43.921967784 +0000 UTC m=+20.744435520" watchObservedRunningTime="2026-04-23 17:53:43.922519869 +0000 UTC m=+20.744987607" Apr 23 17:53:43.969013 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:43.968974 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-48vg7" podStartSLOduration=2.819072107 podStartE2EDuration="19.968962408s" podCreationTimestamp="2026-04-23 17:53:24 +0000 UTC" firstStartedPulling="2026-04-23 17:53:26.258365914 +0000 UTC m=+3.080833627" lastFinishedPulling="2026-04-23 17:53:43.408256215 +0000 UTC m=+20.230723928" observedRunningTime="2026-04-23 17:53:43.951611972 +0000 UTC m=+20.774079708" watchObservedRunningTime="2026-04-23 17:53:43.968962408 +0000 UTC m=+20.791430143" Apr 23 17:53:43.969332 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:43.969293 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/konnectivity-agent-442hb" podStartSLOduration=2.837178588 podStartE2EDuration="19.969282336s" podCreationTimestamp="2026-04-23 17:53:24 +0000 UTC" firstStartedPulling="2026-04-23 17:53:26.256382487 +0000 UTC m=+3.078850202" lastFinishedPulling="2026-04-23 17:53:43.388486238 +0000 UTC m=+20.210953950" observedRunningTime="2026-04-23 17:53:43.968731931 +0000 UTC m=+20.791199666" watchObservedRunningTime="2026-04-23 17:53:43.969282336 +0000 UTC m=+20.791750072" Apr 23 17:53:44.749741 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:44.749558 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dr7p4" Apr 23 17:53:44.750299 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:44.749808 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dr7p4" podUID="16856cbe-119d-49c7-aa8f-a7d0f4002555" Apr 23 17:53:44.865451 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:44.865417 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xvbsl" event={"ID":"2467946b-effa-4a29-a822-8670defce032","Type":"ContainerStarted","Data":"75463e828e1c2ed8d1242fc16a04e157c1e32ff9b71128fc77ae98681ed9cbc3"} Apr 23 17:53:44.866656 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:44.866631 2578 generic.go:358] "Generic (PLEG): container finished" podID="5a131586-128b-4207-ac02-4240d9075bc2" containerID="641b3c5eb713425db247faf2a2cd5c561e232412183f2cc91f5656836c416936" exitCode=0 Apr 23 17:53:44.866772 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:44.866702 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2krs5" event={"ID":"5a131586-128b-4207-ac02-4240d9075bc2","Type":"ContainerDied","Data":"641b3c5eb713425db247faf2a2cd5c561e232412183f2cc91f5656836c416936"} Apr 23 17:53:44.869258 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:44.869241 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log" Apr 23 17:53:44.869561 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:44.869522 2578 generic.go:358] "Generic (PLEG): container finished" podID="2fb5e8ca-0609-4dd5-ac79-69c12ad152a3" containerID="1147830d1d7b8663f5a4db566b5988889cff5fd6559d9023c53b23fead399f3a" exitCode=1 Apr 23 17:53:44.869634 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:44.869609 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" event={"ID":"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3","Type":"ContainerStarted","Data":"eb973c93115db9dfab74fe5841093b3f014b3a23ab7f9f0dad46debb4003bbac"} Apr 23 17:53:44.869688 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:44.869639 
2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" event={"ID":"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3","Type":"ContainerStarted","Data":"51087604c2d6ad8975db501286ef05de76789a0db12315e27085e5a4bb141825"} Apr 23 17:53:44.869688 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:44.869653 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" event={"ID":"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3","Type":"ContainerStarted","Data":"377bb51d3d88e74a2ca31dcd19f68eb2c8fa7332967377f3df8404df436949c6"} Apr 23 17:53:44.869688 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:44.869666 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" event={"ID":"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3","Type":"ContainerStarted","Data":"1ff4933a20049a8feca2eab42f512fcdb21c63d6b7569effd7b3d71a6af8b06c"} Apr 23 17:53:44.869688 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:44.869680 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" event={"ID":"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3","Type":"ContainerDied","Data":"1147830d1d7b8663f5a4db566b5988889cff5fd6559d9023c53b23fead399f3a"} Apr 23 17:53:44.881323 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:44.881282 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xvbsl" podStartSLOduration=3.753323189 podStartE2EDuration="20.881270149s" podCreationTimestamp="2026-04-23 17:53:24 +0000 UTC" firstStartedPulling="2026-04-23 17:53:26.260626708 +0000 UTC m=+3.083094421" lastFinishedPulling="2026-04-23 17:53:43.388573665 +0000 UTC m=+20.211041381" observedRunningTime="2026-04-23 17:53:44.880593904 +0000 UTC m=+21.703061639" watchObservedRunningTime="2026-04-23 17:53:44.881270149 +0000 UTC m=+21.703737884" Apr 23 17:53:45.197121 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:45.197080 
2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 17:53:45.315962 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:45.315891 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-442hb" Apr 23 17:53:45.316684 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:45.316662 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-442hb" Apr 23 17:53:45.711368 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:45.711227 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T17:53:45.197115428Z","UUID":"8f1b5a04-723b-4720-b696-6bee82b40034","Handler":null,"Name":"","Endpoint":""} Apr 23 17:53:45.715737 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:45.715711 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 17:53:45.715737 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:45.715740 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 17:53:45.749227 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:45.749198 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bztd4" Apr 23 17:53:45.749388 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:45.749316 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bztd4" podUID="0c49641e-88eb-49d0-b1e0-5408152b701d" Apr 23 17:53:45.749454 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:45.749388 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqfsb" Apr 23 17:53:45.749534 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:45.749510 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqfsb" podUID="e70550da-839d-4462-b368-c0139f793c15" Apr 23 17:53:45.876839 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:45.876765 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" event={"ID":"8c323db9-9645-43cb-b997-4e141600d264","Type":"ContainerStarted","Data":"e14249dacc6ba70b0e564fc565f102497c30cc60080c54b4e8174ea115b435b8"} Apr 23 17:53:45.877365 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:45.877343 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-442hb" Apr 23 17:53:45.877995 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:45.877968 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-442hb" Apr 23 17:53:46.749893 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:46.749667 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dr7p4" Apr 23 17:53:46.750069 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:46.749921 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dr7p4" podUID="16856cbe-119d-49c7-aa8f-a7d0f4002555" Apr 23 17:53:46.880806 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:46.880780 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log" Apr 23 17:53:46.881259 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:46.881189 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" event={"ID":"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3","Type":"ContainerStarted","Data":"7158386a2f713edc0caa8187b611aeb0523c30816809e459933557f7bea12d78"} Apr 23 17:53:46.883358 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:46.883323 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" event={"ID":"8c323db9-9645-43cb-b997-4e141600d264","Type":"ContainerStarted","Data":"e36b590df7413b3f2add5d8376b41f221727b6ba1627551a557b9d492f1d75be"} Apr 23 17:53:46.907780 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:46.907731 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-fvq78" podStartSLOduration=2.830012801 podStartE2EDuration="22.907718012s" podCreationTimestamp="2026-04-23 17:53:24 +0000 UTC" firstStartedPulling="2026-04-23 17:53:26.264364199 +0000 UTC m=+3.086831913" lastFinishedPulling="2026-04-23 17:53:46.342069404 +0000 UTC m=+23.164537124" 
observedRunningTime="2026-04-23 17:53:46.907439594 +0000 UTC m=+23.729907330" watchObservedRunningTime="2026-04-23 17:53:46.907718012 +0000 UTC m=+23.730185757" Apr 23 17:53:47.749560 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:47.749528 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqfsb" Apr 23 17:53:47.749747 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:47.749641 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqfsb" podUID="e70550da-839d-4462-b368-c0139f793c15" Apr 23 17:53:47.749747 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:47.749702 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bztd4" Apr 23 17:53:47.749865 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:47.749801 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bztd4" podUID="0c49641e-88eb-49d0-b1e0-5408152b701d" Apr 23 17:53:48.749329 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:48.749297 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dr7p4" Apr 23 17:53:48.749985 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:48.749427 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dr7p4" podUID="16856cbe-119d-49c7-aa8f-a7d0f4002555" Apr 23 17:53:49.031021 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:49.030837 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret\") pod \"global-pull-secret-syncer-dr7p4\" (UID: \"16856cbe-119d-49c7-aa8f-a7d0f4002555\") " pod="kube-system/global-pull-secret-syncer-dr7p4" Apr 23 17:53:49.031135 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:49.030945 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 17:53:49.031188 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:49.031176 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret podName:16856cbe-119d-49c7-aa8f-a7d0f4002555 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:05.03115673 +0000 UTC m=+41.853624443 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret") pod "global-pull-secret-syncer-dr7p4" (UID: "16856cbe-119d-49c7-aa8f-a7d0f4002555") : object "kube-system"/"original-pull-secret" not registered Apr 23 17:53:49.748934 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:49.748899 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bztd4" Apr 23 17:53:49.749125 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:49.748906 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqfsb" Apr 23 17:53:49.749125 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:49.749027 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bztd4" podUID="0c49641e-88eb-49d0-b1e0-5408152b701d" Apr 23 17:53:49.749262 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:49.749132 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mqfsb" podUID="e70550da-839d-4462-b368-c0139f793c15" Apr 23 17:53:49.889691 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:49.889657 2578 generic.go:358] "Generic (PLEG): container finished" podID="5a131586-128b-4207-ac02-4240d9075bc2" containerID="6800f30b94d4843ce73e9ff9e8f152c4311232bcfb92e81cae1e5e738c7a8393" exitCode=0 Apr 23 17:53:49.890365 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:49.889727 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2krs5" event={"ID":"5a131586-128b-4207-ac02-4240d9075bc2","Type":"ContainerDied","Data":"6800f30b94d4843ce73e9ff9e8f152c4311232bcfb92e81cae1e5e738c7a8393"} Apr 23 17:53:49.892802 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:49.892784 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log" Apr 23 17:53:49.893050 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:49.893031 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" event={"ID":"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3","Type":"ContainerStarted","Data":"04062da1d06298dda8d56feb8ccbb5a7b3e791b6ff271a406580f8d637aad84a"} Apr 23 17:53:49.893310 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:49.893293 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:49.893357 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:49.893322 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:49.893501 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:49.893488 2578 scope.go:117] "RemoveContainer" containerID="1147830d1d7b8663f5a4db566b5988889cff5fd6559d9023c53b23fead399f3a" Apr 23 17:53:49.908207 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:49.908186 
2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:50.749566 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:50.749541 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dr7p4" Apr 23 17:53:50.749695 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:50.749672 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dr7p4" podUID="16856cbe-119d-49c7-aa8f-a7d0f4002555" Apr 23 17:53:50.860482 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:50.860457 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dr7p4"] Apr 23 17:53:50.863961 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:50.863933 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mqfsb"] Apr 23 17:53:50.864071 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:50.864056 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqfsb" Apr 23 17:53:50.864241 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:50.864194 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mqfsb" podUID="e70550da-839d-4462-b368-c0139f793c15" Apr 23 17:53:50.864672 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:50.864653 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bztd4"] Apr 23 17:53:50.864747 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:50.864735 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bztd4" Apr 23 17:53:50.864850 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:50.864827 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bztd4" podUID="0c49641e-88eb-49d0-b1e0-5408152b701d" Apr 23 17:53:50.898811 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:50.898658 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log" Apr 23 17:53:50.899267 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:50.899233 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" event={"ID":"2fb5e8ca-0609-4dd5-ac79-69c12ad152a3","Type":"ContainerStarted","Data":"2004284aa10ef0346cfef20507d5a75246d68987e23d6c6c79600d7edfee1600"} Apr 23 17:53:50.899605 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:50.899586 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:50.901518 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:50.901494 2578 generic.go:358] "Generic (PLEG): container finished" podID="5a131586-128b-4207-ac02-4240d9075bc2" 
containerID="bf15bbf9d623ba73f2744ce00a3b450013f1cdd186387a9e32eb7a78a1d3a34f" exitCode=0 Apr 23 17:53:50.901610 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:50.901558 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dr7p4" Apr 23 17:53:50.901610 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:50.901561 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2krs5" event={"ID":"5a131586-128b-4207-ac02-4240d9075bc2","Type":"ContainerDied","Data":"bf15bbf9d623ba73f2744ce00a3b450013f1cdd186387a9e32eb7a78a1d3a34f"} Apr 23 17:53:50.901701 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:50.901659 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-dr7p4" podUID="16856cbe-119d-49c7-aa8f-a7d0f4002555" Apr 23 17:53:50.915518 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:50.915500 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" Apr 23 17:53:50.941633 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:50.941594 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq" podStartSLOduration=9.772201673 podStartE2EDuration="26.941581673s" podCreationTimestamp="2026-04-23 17:53:24 +0000 UTC" firstStartedPulling="2026-04-23 17:53:26.264417179 +0000 UTC m=+3.086884891" lastFinishedPulling="2026-04-23 17:53:43.433797177 +0000 UTC m=+20.256264891" observedRunningTime="2026-04-23 17:53:50.941329909 +0000 UTC m=+27.763797644" watchObservedRunningTime="2026-04-23 17:53:50.941581673 +0000 UTC m=+27.764049407" Apr 23 17:53:51.905618 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:51.905539 2578 generic.go:358] "Generic (PLEG): container finished" podID="5a131586-128b-4207-ac02-4240d9075bc2" containerID="3f2e5caa97f6a9a7c623860700831d484b182257ee0aa6be2ebc7d0086660b80" exitCode=0 Apr 23 17:53:51.905991 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:51.905624 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2krs5" event={"ID":"5a131586-128b-4207-ac02-4240d9075bc2","Type":"ContainerDied","Data":"3f2e5caa97f6a9a7c623860700831d484b182257ee0aa6be2ebc7d0086660b80"} Apr 23 17:53:52.749889 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:52.749856 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dr7p4" Apr 23 17:53:52.750034 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:52.749856 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqfsb" Apr 23 17:53:52.750034 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:52.749967 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dr7p4" podUID="16856cbe-119d-49c7-aa8f-a7d0f4002555" Apr 23 17:53:52.750145 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:52.750075 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mqfsb" podUID="e70550da-839d-4462-b368-c0139f793c15" Apr 23 17:53:52.750145 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:52.749856 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bztd4" Apr 23 17:53:52.750259 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:52.750212 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bztd4" podUID="0c49641e-88eb-49d0-b1e0-5408152b701d" Apr 23 17:53:54.748932 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:54.748897 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dr7p4" Apr 23 17:53:54.749426 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:54.748897 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bztd4" Apr 23 17:53:54.749426 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:54.749017 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-dr7p4" podUID="16856cbe-119d-49c7-aa8f-a7d0f4002555" Apr 23 17:53:54.749426 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:54.748901 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqfsb" Apr 23 17:53:54.749426 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:54.749105 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bztd4" podUID="0c49641e-88eb-49d0-b1e0-5408152b701d" Apr 23 17:53:54.749426 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:54.749191 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mqfsb" podUID="e70550da-839d-4462-b368-c0139f793c15" Apr 23 17:53:56.555532 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.555497 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-142-63.ec2.internal" event="NodeReady" Apr 23 17:53:56.556267 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.555665 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 17:53:56.618496 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.618424 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-ddf697474-pclcg"] Apr 23 17:53:56.622638 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.622618 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.629431 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.629408 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 17:53:56.629775 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.629730 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 17:53:56.629888 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.629799 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 17:53:56.630290 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.630135 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ccrfl\"" Apr 23 17:53:56.635106 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.635072 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 17:53:56.648473 
ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.648448 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7t847"] Apr 23 17:53:56.651371 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.651355 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7t847" Apr 23 17:53:56.654014 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.653990 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-j7wvx"] Apr 23 17:53:56.654812 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.654788 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nr8tq\"" Apr 23 17:53:56.655078 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.655057 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 17:53:56.655078 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.655073 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 17:53:56.657513 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.657367 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-ddf697474-pclcg"] Apr 23 17:53:56.657513 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.657472 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-j7wvx" Apr 23 17:53:56.659845 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.659821 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jg9fn\"" Apr 23 17:53:56.659938 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.659833 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 17:53:56.660176 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.660151 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 17:53:56.660271 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.660200 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 17:53:56.666607 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.666589 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7t847"] Apr 23 17:53:56.668813 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.668793 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-j7wvx"] Apr 23 17:53:56.749273 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.749239 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bztd4" Apr 23 17:53:56.749438 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.749363 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-dr7p4" Apr 23 17:53:56.749554 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.749243 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqfsb" Apr 23 17:53:56.757585 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.757006 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 17:53:56.757585 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.757255 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-p6fds\"" Apr 23 17:53:56.757585 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.757317 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x582n\"" Apr 23 17:53:56.757585 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.757353 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 17:53:56.757585 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.757386 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 17:53:56.757585 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.757319 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 17:53:56.792960 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.792934 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.793100 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.792978 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-image-registry-private-configuration\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.793100 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.793018 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls\") pod \"dns-default-7t847\" (UID: \"17f79bf8-0319-418c-a852-2ce9a897e648\") " pod="openshift-dns/dns-default-7t847" Apr 23 17:53:56.793218 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.793124 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nlct\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-kube-api-access-8nlct\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.793218 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.793162 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/17f79bf8-0319-418c-a852-2ce9a897e648-tmp-dir\") pod \"dns-default-7t847\" (UID: \"17f79bf8-0319-418c-a852-2ce9a897e648\") " pod="openshift-dns/dns-default-7t847" Apr 23 17:53:56.793218 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.793187 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-certificates\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " 
pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.793370 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.793219 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flkww\" (UniqueName: \"kubernetes.io/projected/17f79bf8-0319-418c-a852-2ce9a897e648-kube-api-access-flkww\") pod \"dns-default-7t847\" (UID: \"17f79bf8-0319-418c-a852-2ce9a897e648\") " pod="openshift-dns/dns-default-7t847" Apr 23 17:53:56.793370 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.793236 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-installation-pull-secrets\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.793370 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.793263 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-ca-trust-extracted\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.793370 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.793286 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-trusted-ca\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.793370 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.793311 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-tmdpx\" (UniqueName: \"kubernetes.io/projected/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-kube-api-access-tmdpx\") pod \"ingress-canary-j7wvx\" (UID: \"f33cc58f-96ef-4c06-8b13-ae89d3b2c805\") " pod="openshift-ingress-canary/ingress-canary-j7wvx" Apr 23 17:53:56.793370 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.793338 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-bound-sa-token\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.793594 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.793395 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17f79bf8-0319-418c-a852-2ce9a897e648-config-volume\") pod \"dns-default-7t847\" (UID: \"17f79bf8-0319-418c-a852-2ce9a897e648\") " pod="openshift-dns/dns-default-7t847" Apr 23 17:53:56.793594 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.793410 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert\") pod \"ingress-canary-j7wvx\" (UID: \"f33cc58f-96ef-4c06-8b13-ae89d3b2c805\") " pod="openshift-ingress-canary/ingress-canary-j7wvx" Apr 23 17:53:56.893897 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.893821 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tmdpx\" (UniqueName: \"kubernetes.io/projected/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-kube-api-access-tmdpx\") pod \"ingress-canary-j7wvx\" (UID: \"f33cc58f-96ef-4c06-8b13-ae89d3b2c805\") " pod="openshift-ingress-canary/ingress-canary-j7wvx" Apr 23 17:53:56.893897 ip-10-0-142-63 
kubenswrapper[2578]: I0423 17:53:56.893859 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-bound-sa-token\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.894141 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.894046 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17f79bf8-0319-418c-a852-2ce9a897e648-config-volume\") pod \"dns-default-7t847\" (UID: \"17f79bf8-0319-418c-a852-2ce9a897e648\") " pod="openshift-dns/dns-default-7t847" Apr 23 17:53:56.894141 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.894081 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert\") pod \"ingress-canary-j7wvx\" (UID: \"f33cc58f-96ef-4c06-8b13-ae89d3b2c805\") " pod="openshift-ingress-canary/ingress-canary-j7wvx" Apr 23 17:53:56.894141 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.894129 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.894289 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.894161 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-image-registry-private-configuration\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " 
pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.894289 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.894212 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls\") pod \"dns-default-7t847\" (UID: \"17f79bf8-0319-418c-a852-2ce9a897e648\") " pod="openshift-dns/dns-default-7t847" Apr 23 17:53:56.894289 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.894244 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nlct\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-kube-api-access-8nlct\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.894289 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:56.894269 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:53:56.894474 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:56.894313 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:53:56.894474 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:56.894333 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-ddf697474-pclcg: secret "image-registry-tls" not found Apr 23 17:53:56.894474 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:56.894344 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert podName:f33cc58f-96ef-4c06-8b13-ae89d3b2c805 nodeName:}" failed. No retries permitted until 2026-04-23 17:53:57.39432307 +0000 UTC m=+34.216790787 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert") pod "ingress-canary-j7wvx" (UID: "f33cc58f-96ef-4c06-8b13-ae89d3b2c805") : secret "canary-serving-cert" not found Apr 23 17:53:56.894474 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:56.894398 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls podName:fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4 nodeName:}" failed. No retries permitted until 2026-04-23 17:53:57.394378342 +0000 UTC m=+34.216846069 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls") pod "image-registry-ddf697474-pclcg" (UID: "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4") : secret "image-registry-tls" not found Apr 23 17:53:56.894474 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.894467 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/17f79bf8-0319-418c-a852-2ce9a897e648-tmp-dir\") pod \"dns-default-7t847\" (UID: \"17f79bf8-0319-418c-a852-2ce9a897e648\") " pod="openshift-dns/dns-default-7t847" Apr 23 17:53:56.894474 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:56.894475 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:53:56.894787 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.894499 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-certificates\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.894787 ip-10-0-142-63 kubenswrapper[2578]: E0423 
17:53:56.894511 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls podName:17f79bf8-0319-418c-a852-2ce9a897e648 nodeName:}" failed. No retries permitted until 2026-04-23 17:53:57.394498724 +0000 UTC m=+34.216966450 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls") pod "dns-default-7t847" (UID: "17f79bf8-0319-418c-a852-2ce9a897e648") : secret "dns-default-metrics-tls" not found Apr 23 17:53:56.894787 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.894561 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flkww\" (UniqueName: \"kubernetes.io/projected/17f79bf8-0319-418c-a852-2ce9a897e648-kube-api-access-flkww\") pod \"dns-default-7t847\" (UID: \"17f79bf8-0319-418c-a852-2ce9a897e648\") " pod="openshift-dns/dns-default-7t847" Apr 23 17:53:56.894787 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.894591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-installation-pull-secrets\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.894787 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.894616 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/17f79bf8-0319-418c-a852-2ce9a897e648-config-volume\") pod \"dns-default-7t847\" (UID: \"17f79bf8-0319-418c-a852-2ce9a897e648\") " pod="openshift-dns/dns-default-7t847" Apr 23 17:53:56.894787 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.894635 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" 
(UniqueName: \"kubernetes.io/empty-dir/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-ca-trust-extracted\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.894787 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.894664 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-trusted-ca\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.894787 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.894741 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/17f79bf8-0319-418c-a852-2ce9a897e648-tmp-dir\") pod \"dns-default-7t847\" (UID: \"17f79bf8-0319-418c-a852-2ce9a897e648\") " pod="openshift-dns/dns-default-7t847" Apr 23 17:53:56.895222 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.895010 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-certificates\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.895222 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.895050 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-ca-trust-extracted\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.895599 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.895578 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-trusted-ca\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.898747 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.898721 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-image-registry-private-configuration\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.898849 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.898774 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-installation-pull-secrets\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.906464 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.905786 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmdpx\" (UniqueName: \"kubernetes.io/projected/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-kube-api-access-tmdpx\") pod \"ingress-canary-j7wvx\" (UID: \"f33cc58f-96ef-4c06-8b13-ae89d3b2c805\") " pod="openshift-ingress-canary/ingress-canary-j7wvx" Apr 23 17:53:56.907179 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.907134 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flkww\" (UniqueName: \"kubernetes.io/projected/17f79bf8-0319-418c-a852-2ce9a897e648-kube-api-access-flkww\") pod \"dns-default-7t847\" (UID: \"17f79bf8-0319-418c-a852-2ce9a897e648\") " 
pod="openshift-dns/dns-default-7t847" Apr 23 17:53:56.908019 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.907995 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-bound-sa-token\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:56.908188 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:56.908163 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nlct\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-kube-api-access-8nlct\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:57.398614 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:57.398580 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs\") pod \"network-metrics-daemon-mqfsb\" (UID: \"e70550da-839d-4462-b368-c0139f793c15\") " pod="openshift-multus/network-metrics-daemon-mqfsb" Apr 23 17:53:57.398812 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:57.398669 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert\") pod \"ingress-canary-j7wvx\" (UID: \"f33cc58f-96ef-4c06-8b13-ae89d3b2c805\") " pod="openshift-ingress-canary/ingress-canary-j7wvx" Apr 23 17:53:57.398812 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:57.398698 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls\") pod \"image-registry-ddf697474-pclcg\" (UID: 
\"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:57.398812 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:57.398733 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls\") pod \"dns-default-7t847\" (UID: \"17f79bf8-0319-418c-a852-2ce9a897e648\") " pod="openshift-dns/dns-default-7t847" Apr 23 17:53:57.398969 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:57.398859 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:53:57.398969 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:57.398925 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls podName:17f79bf8-0319-418c-a852-2ce9a897e648 nodeName:}" failed. No retries permitted until 2026-04-23 17:53:58.398904962 +0000 UTC m=+35.221372689 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls") pod "dns-default-7t847" (UID: "17f79bf8-0319-418c-a852-2ce9a897e648") : secret "dns-default-metrics-tls" not found Apr 23 17:53:57.399215 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:57.399189 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 17:53:57.399345 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:57.399226 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:53:57.399345 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:57.399252 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs podName:e70550da-839d-4462-b368-c0139f793c15 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:29.399235882 +0000 UTC m=+66.221703615 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs") pod "network-metrics-daemon-mqfsb" (UID: "e70550da-839d-4462-b368-c0139f793c15") : secret "metrics-daemon-secret" not found Apr 23 17:53:57.399345 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:57.399265 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:53:57.399345 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:57.399287 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-ddf697474-pclcg: secret "image-registry-tls" not found Apr 23 17:53:57.399345 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:57.399296 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert podName:f33cc58f-96ef-4c06-8b13-ae89d3b2c805 nodeName:}" failed. No retries permitted until 2026-04-23 17:53:58.399278639 +0000 UTC m=+35.221746369 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert") pod "ingress-canary-j7wvx" (UID: "f33cc58f-96ef-4c06-8b13-ae89d3b2c805") : secret "canary-serving-cert" not found Apr 23 17:53:57.399345 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:57.399350 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls podName:fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4 nodeName:}" failed. No retries permitted until 2026-04-23 17:53:58.399334422 +0000 UTC m=+35.221802139 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls") pod "image-registry-ddf697474-pclcg" (UID: "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4") : secret "image-registry-tls" not found Apr 23 17:53:57.499839 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:57.499808 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49d2h\" (UniqueName: \"kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h\") pod \"network-check-target-bztd4\" (UID: \"0c49641e-88eb-49d0-b1e0-5408152b701d\") " pod="openshift-network-diagnostics/network-check-target-bztd4" Apr 23 17:53:57.502318 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:57.502301 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49d2h\" (UniqueName: \"kubernetes.io/projected/0c49641e-88eb-49d0-b1e0-5408152b701d-kube-api-access-49d2h\") pod \"network-check-target-bztd4\" (UID: \"0c49641e-88eb-49d0-b1e0-5408152b701d\") " pod="openshift-network-diagnostics/network-check-target-bztd4" Apr 23 17:53:57.659594 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:57.659517 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bztd4" Apr 23 17:53:57.819734 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:57.819565 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bztd4"] Apr 23 17:53:57.823232 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:53:57.823204 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c49641e_88eb_49d0_b1e0_5408152b701d.slice/crio-55f9f6222c975368a95f33cefecc945697a6b722af8d7e3c5f8af4cfde1c6ab2 WatchSource:0}: Error finding container 55f9f6222c975368a95f33cefecc945697a6b722af8d7e3c5f8af4cfde1c6ab2: Status 404 returned error can't find the container with id 55f9f6222c975368a95f33cefecc945697a6b722af8d7e3c5f8af4cfde1c6ab2 Apr 23 17:53:57.919370 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:57.919275 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bztd4" event={"ID":"0c49641e-88eb-49d0-b1e0-5408152b701d","Type":"ContainerStarted","Data":"55f9f6222c975368a95f33cefecc945697a6b722af8d7e3c5f8af4cfde1c6ab2"} Apr 23 17:53:58.407095 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:58.407059 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert\") pod \"ingress-canary-j7wvx\" (UID: \"f33cc58f-96ef-4c06-8b13-ae89d3b2c805\") " pod="openshift-ingress-canary/ingress-canary-j7wvx" Apr 23 17:53:58.407296 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:58.407125 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " 
pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:53:58.407296 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:58.407172 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls\") pod \"dns-default-7t847\" (UID: \"17f79bf8-0319-418c-a852-2ce9a897e648\") " pod="openshift-dns/dns-default-7t847" Apr 23 17:53:58.407296 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:58.407225 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:53:58.407296 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:58.407277 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:53:58.407296 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:58.407293 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert podName:f33cc58f-96ef-4c06-8b13-ae89d3b2c805 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:00.40727347 +0000 UTC m=+37.229741186 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert") pod "ingress-canary-j7wvx" (UID: "f33cc58f-96ef-4c06-8b13-ae89d3b2c805") : secret "canary-serving-cert" not found Apr 23 17:53:58.407540 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:58.407317 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls podName:17f79bf8-0319-418c-a852-2ce9a897e648 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:00.407305637 +0000 UTC m=+37.229773350 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls") pod "dns-default-7t847" (UID: "17f79bf8-0319-418c-a852-2ce9a897e648") : secret "dns-default-metrics-tls" not found Apr 23 17:53:58.407540 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:58.407383 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:53:58.407540 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:58.407402 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-ddf697474-pclcg: secret "image-registry-tls" not found Apr 23 17:53:58.407540 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:53:58.407453 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls podName:fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:00.407438911 +0000 UTC m=+37.229906643 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls") pod "image-registry-ddf697474-pclcg" (UID: "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4") : secret "image-registry-tls" not found Apr 23 17:53:58.924648 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:58.924604 2578 generic.go:358] "Generic (PLEG): container finished" podID="5a131586-128b-4207-ac02-4240d9075bc2" containerID="2d3652a9b19d47ade19064dce4ef255c64c958fb07f3398964f419d3fdcf43ff" exitCode=0 Apr 23 17:53:58.925230 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:58.924661 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2krs5" event={"ID":"5a131586-128b-4207-ac02-4240d9075bc2","Type":"ContainerDied","Data":"2d3652a9b19d47ade19064dce4ef255c64c958fb07f3398964f419d3fdcf43ff"} Apr 23 17:53:59.929910 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:59.929872 2578 generic.go:358] "Generic (PLEG): container finished" podID="5a131586-128b-4207-ac02-4240d9075bc2" containerID="922cc75b70e3f3c219eed1e69c688e74a704761c6617f3b117ab997c2ae10d7d" exitCode=0 Apr 23 17:53:59.930412 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:53:59.929942 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2krs5" event={"ID":"5a131586-128b-4207-ac02-4240d9075bc2","Type":"ContainerDied","Data":"922cc75b70e3f3c219eed1e69c688e74a704761c6617f3b117ab997c2ae10d7d"} Apr 23 17:54:00.424617 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:00.424573 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls\") pod \"dns-default-7t847\" (UID: \"17f79bf8-0319-418c-a852-2ce9a897e648\") " pod="openshift-dns/dns-default-7t847" Apr 23 17:54:00.424778 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:00.424681 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert\") pod \"ingress-canary-j7wvx\" (UID: \"f33cc58f-96ef-4c06-8b13-ae89d3b2c805\") " pod="openshift-ingress-canary/ingress-canary-j7wvx" Apr 23 17:54:00.424778 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:00.424711 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:54:00.424778 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:00.424738 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:54:00.424958 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:00.424785 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:54:00.424958 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:00.424816 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:54:00.424958 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:00.424832 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-ddf697474-pclcg: secret "image-registry-tls" not found Apr 23 17:54:00.425267 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:00.424814 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls podName:17f79bf8-0319-418c-a852-2ce9a897e648 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:04.424792968 +0000 UTC m=+41.247260701 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls") pod "dns-default-7t847" (UID: "17f79bf8-0319-418c-a852-2ce9a897e648") : secret "dns-default-metrics-tls" not found Apr 23 17:54:00.425355 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:00.425284 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert podName:f33cc58f-96ef-4c06-8b13-ae89d3b2c805 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:04.425271082 +0000 UTC m=+41.247738796 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert") pod "ingress-canary-j7wvx" (UID: "f33cc58f-96ef-4c06-8b13-ae89d3b2c805") : secret "canary-serving-cert" not found Apr 23 17:54:00.425355 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:00.425301 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls podName:fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:04.425294102 +0000 UTC m=+41.247761815 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls") pod "image-registry-ddf697474-pclcg" (UID: "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4") : secret "image-registry-tls" not found Apr 23 17:54:00.934524 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:00.934496 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2krs5" event={"ID":"5a131586-128b-4207-ac02-4240d9075bc2","Type":"ContainerStarted","Data":"bf6ee3ff7e9deb30e1186c177d41d002a609522acf119bf69a78cded86172276"} Apr 23 17:54:00.937967 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:00.937939 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bztd4" event={"ID":"0c49641e-88eb-49d0-b1e0-5408152b701d","Type":"ContainerStarted","Data":"f0371f3cf3640816baf35e5a9d273b3a81f015f1be78b2b4d9fe3369857b0b71"} Apr 23 17:54:00.963371 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:00.963326 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2krs5" podStartSLOduration=6.368197966 podStartE2EDuration="37.963313043s" podCreationTimestamp="2026-04-23 17:53:23 +0000 UTC" firstStartedPulling="2026-04-23 17:53:26.254701099 +0000 UTC m=+3.077168819" lastFinishedPulling="2026-04-23 17:53:57.849816183 +0000 UTC m=+34.672283896" observedRunningTime="2026-04-23 17:54:00.961859911 +0000 UTC m=+37.784327647" watchObservedRunningTime="2026-04-23 17:54:00.963313043 +0000 UTC m=+37.785780778" Apr 23 17:54:01.940530 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:01.940497 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bztd4" Apr 23 17:54:01.958812 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:01.958772 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-bztd4" podStartSLOduration=35.960194877 podStartE2EDuration="38.958757392s" podCreationTimestamp="2026-04-23 17:53:23 +0000 UTC" firstStartedPulling="2026-04-23 17:53:57.827563793 +0000 UTC m=+34.650031505" lastFinishedPulling="2026-04-23 17:54:00.826126294 +0000 UTC m=+37.648594020" observedRunningTime="2026-04-23 17:54:01.957801411 +0000 UTC m=+38.780269146" watchObservedRunningTime="2026-04-23 17:54:01.958757392 +0000 UTC m=+38.781225127" Apr 23 17:54:04.454144 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:04.454103 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert\") pod \"ingress-canary-j7wvx\" (UID: \"f33cc58f-96ef-4c06-8b13-ae89d3b2c805\") " pod="openshift-ingress-canary/ingress-canary-j7wvx" Apr 23 17:54:04.454624 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:04.454152 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:54:04.454624 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:04.454191 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls\") pod \"dns-default-7t847\" (UID: \"17f79bf8-0319-418c-a852-2ce9a897e648\") " pod="openshift-dns/dns-default-7t847" Apr 23 17:54:04.454624 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:04.454273 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 17:54:04.454624 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:04.454328 2578 projected.go:264] 
Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 17:54:04.454624 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:04.454347 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-ddf697474-pclcg: secret "image-registry-tls" not found Apr 23 17:54:04.454624 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:04.454273 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 17:54:04.454624 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:04.454334 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls podName:17f79bf8-0319-418c-a852-2ce9a897e648 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:12.454317217 +0000 UTC m=+49.276784931 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls") pod "dns-default-7t847" (UID: "17f79bf8-0319-418c-a852-2ce9a897e648") : secret "dns-default-metrics-tls" not found Apr 23 17:54:04.454624 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:04.454399 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls podName:fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:12.454385251 +0000 UTC m=+49.276852963 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls") pod "image-registry-ddf697474-pclcg" (UID: "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4") : secret "image-registry-tls" not found Apr 23 17:54:04.454624 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:04.454433 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert podName:f33cc58f-96ef-4c06-8b13-ae89d3b2c805 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:12.454412958 +0000 UTC m=+49.276880685 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert") pod "ingress-canary-j7wvx" (UID: "f33cc58f-96ef-4c06-8b13-ae89d3b2c805") : secret "canary-serving-cert" not found Apr 23 17:54:05.058823 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:05.058768 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret\") pod \"global-pull-secret-syncer-dr7p4\" (UID: \"16856cbe-119d-49c7-aa8f-a7d0f4002555\") " pod="kube-system/global-pull-secret-syncer-dr7p4" Apr 23 17:54:05.062865 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:05.062837 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/16856cbe-119d-49c7-aa8f-a7d0f4002555-original-pull-secret\") pod \"global-pull-secret-syncer-dr7p4\" (UID: \"16856cbe-119d-49c7-aa8f-a7d0f4002555\") " pod="kube-system/global-pull-secret-syncer-dr7p4" Apr 23 17:54:05.167234 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:05.167199 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-dr7p4" Apr 23 17:54:05.304668 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:05.304636 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-dr7p4"] Apr 23 17:54:05.308167 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:54:05.308140 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16856cbe_119d_49c7_aa8f_a7d0f4002555.slice/crio-362d5691a03515e620bd7a944aa8db398eeb598db22c99e6795786c3aa6f14d1 WatchSource:0}: Error finding container 362d5691a03515e620bd7a944aa8db398eeb598db22c99e6795786c3aa6f14d1: Status 404 returned error can't find the container with id 362d5691a03515e620bd7a944aa8db398eeb598db22c99e6795786c3aa6f14d1 Apr 23 17:54:05.949493 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:05.949448 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dr7p4" event={"ID":"16856cbe-119d-49c7-aa8f-a7d0f4002555","Type":"ContainerStarted","Data":"362d5691a03515e620bd7a944aa8db398eeb598db22c99e6795786c3aa6f14d1"} Apr 23 17:54:08.956737 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:08.956653 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-dr7p4" event={"ID":"16856cbe-119d-49c7-aa8f-a7d0f4002555","Type":"ContainerStarted","Data":"6dc1eebfc048a32f2f808156a610b4fbb20ad05b3cb6ed646e978b7ee3cc0e29"} Apr 23 17:54:08.974122 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:08.974063 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-dr7p4" podStartSLOduration=32.586277077 podStartE2EDuration="35.974047318s" podCreationTimestamp="2026-04-23 17:53:33 +0000 UTC" firstStartedPulling="2026-04-23 17:54:05.309842486 +0000 UTC m=+42.132310199" lastFinishedPulling="2026-04-23 17:54:08.697612728 +0000 UTC m=+45.520080440" 
observedRunningTime="2026-04-23 17:54:08.973937817 +0000 UTC m=+45.796405553" watchObservedRunningTime="2026-04-23 17:54:08.974047318 +0000 UTC m=+45.796515056"
Apr 23 17:54:12.509302 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:12.509264 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls\") pod \"dns-default-7t847\" (UID: \"17f79bf8-0319-418c-a852-2ce9a897e648\") " pod="openshift-dns/dns-default-7t847"
Apr 23 17:54:12.509713 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:12.509326 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert\") pod \"ingress-canary-j7wvx\" (UID: \"f33cc58f-96ef-4c06-8b13-ae89d3b2c805\") " pod="openshift-ingress-canary/ingress-canary-j7wvx"
Apr 23 17:54:12.509713 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:12.509345 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg"
Apr 23 17:54:12.509713 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:12.509421 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 17:54:12.509713 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:12.509431 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-ddf697474-pclcg: secret "image-registry-tls" not found
Apr 23 17:54:12.509713 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:12.509428 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:54:12.509713 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:12.509462 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:54:12.509713 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:12.509478 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls podName:fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:28.509464852 +0000 UTC m=+65.331932564 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls") pod "image-registry-ddf697474-pclcg" (UID: "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4") : secret "image-registry-tls" not found
Apr 23 17:54:12.509713 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:12.509490 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls podName:17f79bf8-0319-418c-a852-2ce9a897e648 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:28.509484492 +0000 UTC m=+65.331952205 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls") pod "dns-default-7t847" (UID: "17f79bf8-0319-418c-a852-2ce9a897e648") : secret "dns-default-metrics-tls" not found
Apr 23 17:54:12.509713 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:12.509507 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert podName:f33cc58f-96ef-4c06-8b13-ae89d3b2c805 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:28.509495049 +0000 UTC m=+65.331962761 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert") pod "ingress-canary-j7wvx" (UID: "f33cc58f-96ef-4c06-8b13-ae89d3b2c805") : secret "canary-serving-cert" not found
Apr 23 17:54:22.918050 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:22.918020 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x2gvq"
Apr 23 17:54:26.866483 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:26.866442 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"]
Apr 23 17:54:26.874564 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:26.874541 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:26.877544 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:26.877521 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 23 17:54:26.877904 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:26.877889 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 23 17:54:26.877949 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:26.877914 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 23 17:54:26.878974 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:26.878952 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 23 17:54:26.879097 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:26.879053 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 23 17:54:26.879097 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:26.879073 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 23 17:54:26.879221 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:26.879104 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 23 17:54:26.881718 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:26.881693 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"]
Apr 23 17:54:26.909743 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:26.909712 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/bf5d8d3c-de11-4bf0-872e-708dfdd6f61b-ca\") pod \"cluster-proxy-proxy-agent-857469989d-k22fp\" (UID: \"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:26.909853 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:26.909777 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/bf5d8d3c-de11-4bf0-872e-708dfdd6f61b-hub\") pod \"cluster-proxy-proxy-agent-857469989d-k22fp\" (UID: \"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:26.909853 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:26.909820 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bf5d8d3c-de11-4bf0-872e-708dfdd6f61b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-857469989d-k22fp\" (UID: \"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:26.909853 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:26.909835 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/bf5d8d3c-de11-4bf0-872e-708dfdd6f61b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-857469989d-k22fp\" (UID: \"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:26.909971 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:26.909854 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/bf5d8d3c-de11-4bf0-872e-708dfdd6f61b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-857469989d-k22fp\" (UID: \"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:26.909971 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:26.909879 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xwdw\" (UniqueName: \"kubernetes.io/projected/bf5d8d3c-de11-4bf0-872e-708dfdd6f61b-kube-api-access-6xwdw\") pod \"cluster-proxy-proxy-agent-857469989d-k22fp\" (UID: \"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:27.010679 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:27.010659 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/bf5d8d3c-de11-4bf0-872e-708dfdd6f61b-hub\") pod \"cluster-proxy-proxy-agent-857469989d-k22fp\" (UID: \"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:27.010779 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:27.010699 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bf5d8d3c-de11-4bf0-872e-708dfdd6f61b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-857469989d-k22fp\" (UID: \"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:27.010779 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:27.010723 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/bf5d8d3c-de11-4bf0-872e-708dfdd6f61b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-857469989d-k22fp\" (UID: \"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:27.010779 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:27.010742 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/bf5d8d3c-de11-4bf0-872e-708dfdd6f61b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-857469989d-k22fp\" (UID: \"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:27.010779 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:27.010776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xwdw\" (UniqueName: \"kubernetes.io/projected/bf5d8d3c-de11-4bf0-872e-708dfdd6f61b-kube-api-access-6xwdw\") pod \"cluster-proxy-proxy-agent-857469989d-k22fp\" (UID: \"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:27.010976 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:27.010805 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/bf5d8d3c-de11-4bf0-872e-708dfdd6f61b-ca\") pod \"cluster-proxy-proxy-agent-857469989d-k22fp\" (UID: \"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:27.011431 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:27.011407 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/bf5d8d3c-de11-4bf0-872e-708dfdd6f61b-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-857469989d-k22fp\" (UID: \"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:27.013336 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:27.013314 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/bf5d8d3c-de11-4bf0-872e-708dfdd6f61b-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-857469989d-k22fp\" (UID: \"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:27.013336 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:27.013330 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/bf5d8d3c-de11-4bf0-872e-708dfdd6f61b-hub\") pod \"cluster-proxy-proxy-agent-857469989d-k22fp\" (UID: \"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:27.013504 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:27.013488 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/bf5d8d3c-de11-4bf0-872e-708dfdd6f61b-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-857469989d-k22fp\" (UID: \"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:27.013586 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:27.013565 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/bf5d8d3c-de11-4bf0-872e-708dfdd6f61b-ca\") pod \"cluster-proxy-proxy-agent-857469989d-k22fp\" (UID: \"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:27.019486 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:27.019461 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xwdw\" (UniqueName: \"kubernetes.io/projected/bf5d8d3c-de11-4bf0-872e-708dfdd6f61b-kube-api-access-6xwdw\") pod \"cluster-proxy-proxy-agent-857469989d-k22fp\" (UID: \"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:27.191196 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:27.191080 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:54:27.302275 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:27.302244 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"]
Apr 23 17:54:27.305592 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:54:27.305566 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf5d8d3c_de11_4bf0_872e_708dfdd6f61b.slice/crio-df278034bd59ab786856178c8f8fba8c4caf2f03f0781e90aa4cb9e9ce5b0686 WatchSource:0}: Error finding container df278034bd59ab786856178c8f8fba8c4caf2f03f0781e90aa4cb9e9ce5b0686: Status 404 returned error can't find the container with id df278034bd59ab786856178c8f8fba8c4caf2f03f0781e90aa4cb9e9ce5b0686
Apr 23 17:54:27.995660 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:27.995614 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp" event={"ID":"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b","Type":"ContainerStarted","Data":"df278034bd59ab786856178c8f8fba8c4caf2f03f0781e90aa4cb9e9ce5b0686"}
Apr 23 17:54:28.523481 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:28.523440 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls\") pod \"dns-default-7t847\" (UID: \"17f79bf8-0319-418c-a852-2ce9a897e648\") " pod="openshift-dns/dns-default-7t847"
Apr 23 17:54:28.523691 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:28.523567 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert\") pod \"ingress-canary-j7wvx\" (UID: \"f33cc58f-96ef-4c06-8b13-ae89d3b2c805\") " pod="openshift-ingress-canary/ingress-canary-j7wvx"
Apr 23 17:54:28.523691 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:28.523594 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg"
Apr 23 17:54:28.523810 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:28.523716 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 17:54:28.523810 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:28.523732 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-ddf697474-pclcg: secret "image-registry-tls" not found
Apr 23 17:54:28.523810 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:28.523789 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls podName:fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:00.523774564 +0000 UTC m=+97.346242283 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls") pod "image-registry-ddf697474-pclcg" (UID: "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4") : secret "image-registry-tls" not found
Apr 23 17:54:28.524146 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:28.524118 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 17:54:28.524146 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:28.524138 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 17:54:28.524398 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:28.524191 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert podName:f33cc58f-96ef-4c06-8b13-ae89d3b2c805 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:00.524169422 +0000 UTC m=+97.346637148 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert") pod "ingress-canary-j7wvx" (UID: "f33cc58f-96ef-4c06-8b13-ae89d3b2c805") : secret "canary-serving-cert" not found
Apr 23 17:54:28.524398 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:28.524211 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls podName:17f79bf8-0319-418c-a852-2ce9a897e648 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:00.524202114 +0000 UTC m=+97.346669831 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls") pod "dns-default-7t847" (UID: "17f79bf8-0319-418c-a852-2ce9a897e648") : secret "dns-default-metrics-tls" not found
Apr 23 17:54:29.431036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:29.430998 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs\") pod \"network-metrics-daemon-mqfsb\" (UID: \"e70550da-839d-4462-b368-c0139f793c15\") " pod="openshift-multus/network-metrics-daemon-mqfsb"
Apr 23 17:54:29.431490 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:29.431175 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 17:54:29.431490 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:29.431246 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs podName:e70550da-839d-4462-b368-c0139f793c15 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:33.431226685 +0000 UTC m=+130.253694398 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs") pod "network-metrics-daemon-mqfsb" (UID: "e70550da-839d-4462-b368-c0139f793c15") : secret "metrics-daemon-secret" not found
Apr 23 17:54:31.004162 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:31.004114 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp" event={"ID":"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b","Type":"ContainerStarted","Data":"8d11224e47b1f45724a57b515df2f3f626388bf89de4f270af1a225ff492fd90"}
Apr 23 17:54:32.944376 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:32.944295 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bztd4"
Apr 23 17:54:33.009869 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:33.009839 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp" event={"ID":"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b","Type":"ContainerStarted","Data":"b54ee30bc48b198f28413bf7ad0ae34bd914426a282193859fdc514d5385cb74"}
Apr 23 17:54:33.009869 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:33.009873 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp" event={"ID":"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b","Type":"ContainerStarted","Data":"bd19dfc72d1b4a0d141da33f851ffbc633542deaf662e5b2214ea9b2d8004d31"}
Apr 23 17:54:33.029757 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:33.029709 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp" podStartSLOduration=1.910424433 podStartE2EDuration="7.029696215s" podCreationTimestamp="2026-04-23 17:54:26 +0000 UTC" firstStartedPulling="2026-04-23 17:54:27.308784371 +0000 UTC m=+64.131252087" lastFinishedPulling="2026-04-23 17:54:32.42805615 +0000 UTC m=+69.250523869" observedRunningTime="2026-04-23 17:54:33.028675389 +0000 UTC m=+69.851143134" watchObservedRunningTime="2026-04-23 17:54:33.029696215 +0000 UTC m=+69.852163950"
Apr 23 17:54:43.309027 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.308993 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-2t94g"]
Apr 23 17:54:43.313362 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.313347 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2t94g"
Apr 23 17:54:43.315826 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.315807 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-zxhj4\""
Apr 23 17:54:43.319710 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.319688 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-2t94g"]
Apr 23 17:54:43.430369 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.430338 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkt2n\" (UniqueName: \"kubernetes.io/projected/47475636-63bf-4a11-9285-cce1b1df596d-kube-api-access-vkt2n\") pod \"network-check-source-8894fc9bd-2t94g\" (UID: \"47475636-63bf-4a11-9285-cce1b1df596d\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2t94g"
Apr 23 17:54:43.524002 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.523964 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j"]
Apr 23 17:54:43.527138 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.527117 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5f6sp"]
Apr 23 17:54:43.527303 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.527284 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j"
Apr 23 17:54:43.529942 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.529924 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5f6sp"
Apr 23 17:54:43.531012 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.530994 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkt2n\" (UniqueName: \"kubernetes.io/projected/47475636-63bf-4a11-9285-cce1b1df596d-kube-api-access-vkt2n\") pod \"network-check-source-8894fc9bd-2t94g\" (UID: \"47475636-63bf-4a11-9285-cce1b1df596d\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2t94g"
Apr 23 17:54:43.540061 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.540044 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-lzb89\""
Apr 23 17:54:43.544952 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.544933 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 23 17:54:43.544952 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.544945 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 23 17:54:43.545125 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.544945 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 23 17:54:43.545125 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.544985 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 23 17:54:43.545125 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.545006 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 23 17:54:43.545274 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.545170 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-n2b9q\""
Apr 23 17:54:43.556433 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.556412 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5f6sp"]
Apr 23 17:54:43.574911 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.574862 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkt2n\" (UniqueName: \"kubernetes.io/projected/47475636-63bf-4a11-9285-cce1b1df596d-kube-api-access-vkt2n\") pod \"network-check-source-8894fc9bd-2t94g\" (UID: \"47475636-63bf-4a11-9285-cce1b1df596d\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2t94g"
Apr 23 17:54:43.579269 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.579236 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j"]
Apr 23 17:54:43.621937 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.621915 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2t94g"
Apr 23 17:54:43.631914 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.631892 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c7hq\" (UniqueName: \"kubernetes.io/projected/d5396980-27a8-413a-b739-f40a60724e10-kube-api-access-9c7hq\") pod \"cluster-samples-operator-6dc5bdb6b4-fns5j\" (UID: \"d5396980-27a8-413a-b739-f40a60724e10\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j"
Apr 23 17:54:43.632001 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.631921 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcnmq\" (UniqueName: \"kubernetes.io/projected/8d77429e-b156-4f21-8e2d-958b0183cfe9-kube-api-access-vcnmq\") pod \"volume-data-source-validator-7c6cbb6c87-5f6sp\" (UID: \"8d77429e-b156-4f21-8e2d-958b0183cfe9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5f6sp"
Apr 23 17:54:43.632001 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.631956 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5396980-27a8-413a-b739-f40a60724e10-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fns5j\" (UID: \"d5396980-27a8-413a-b739-f40a60724e10\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j"
Apr 23 17:54:43.732865 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.732835 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9c7hq\" (UniqueName: \"kubernetes.io/projected/d5396980-27a8-413a-b739-f40a60724e10-kube-api-access-9c7hq\") pod \"cluster-samples-operator-6dc5bdb6b4-fns5j\" (UID: \"d5396980-27a8-413a-b739-f40a60724e10\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j"
Apr 23 17:54:43.733028 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.732873 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vcnmq\" (UniqueName: \"kubernetes.io/projected/8d77429e-b156-4f21-8e2d-958b0183cfe9-kube-api-access-vcnmq\") pod \"volume-data-source-validator-7c6cbb6c87-5f6sp\" (UID: \"8d77429e-b156-4f21-8e2d-958b0183cfe9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5f6sp"
Apr 23 17:54:43.733028 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.733015 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5396980-27a8-413a-b739-f40a60724e10-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fns5j\" (UID: \"d5396980-27a8-413a-b739-f40a60724e10\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j"
Apr 23 17:54:43.733188 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:43.733161 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 17:54:43.733283 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:43.733272 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5396980-27a8-413a-b739-f40a60724e10-samples-operator-tls podName:d5396980-27a8-413a-b739-f40a60724e10 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:44.233231473 +0000 UTC m=+81.055699186 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d5396980-27a8-413a-b739-f40a60724e10-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fns5j" (UID: "d5396980-27a8-413a-b739-f40a60724e10") : secret "samples-operator-tls" not found
Apr 23 17:54:43.736977 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.736948 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-2t94g"]
Apr 23 17:54:43.739460 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:54:43.739437 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47475636_63bf_4a11_9285_cce1b1df596d.slice/crio-5b3ecb0d021c0b852edc0b2e56e95c6d189ccd73d5ca8375f6f229c197c49432 WatchSource:0}: Error finding container 5b3ecb0d021c0b852edc0b2e56e95c6d189ccd73d5ca8375f6f229c197c49432: Status 404 returned error can't find the container with id 5b3ecb0d021c0b852edc0b2e56e95c6d189ccd73d5ca8375f6f229c197c49432
Apr 23 17:54:43.744213 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.744191 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcnmq\" (UniqueName: \"kubernetes.io/projected/8d77429e-b156-4f21-8e2d-958b0183cfe9-kube-api-access-vcnmq\") pod \"volume-data-source-validator-7c6cbb6c87-5f6sp\" (UID: \"8d77429e-b156-4f21-8e2d-958b0183cfe9\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5f6sp"
Apr 23 17:54:43.744367 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.744349 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c7hq\" (UniqueName: \"kubernetes.io/projected/d5396980-27a8-413a-b739-f40a60724e10-kube-api-access-9c7hq\") pod \"cluster-samples-operator-6dc5bdb6b4-fns5j\" (UID: \"d5396980-27a8-413a-b739-f40a60724e10\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j"
Apr 23 17:54:43.841423 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.841352 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5f6sp"
Apr 23 17:54:43.957782 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:43.957751 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5f6sp"]
Apr 23 17:54:43.960960 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:54:43.960937 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d77429e_b156_4f21_8e2d_958b0183cfe9.slice/crio-1fc4dc54c6c5414760a3a6ebd7647b9dec842dc480614def05b7c215e928ddf4 WatchSource:0}: Error finding container 1fc4dc54c6c5414760a3a6ebd7647b9dec842dc480614def05b7c215e928ddf4: Status 404 returned error can't find the container with id 1fc4dc54c6c5414760a3a6ebd7647b9dec842dc480614def05b7c215e928ddf4
Apr 23 17:54:44.033186 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:44.033152 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5f6sp" event={"ID":"8d77429e-b156-4f21-8e2d-958b0183cfe9","Type":"ContainerStarted","Data":"1fc4dc54c6c5414760a3a6ebd7647b9dec842dc480614def05b7c215e928ddf4"}
Apr 23 17:54:44.034372 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:44.034346 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2t94g" event={"ID":"47475636-63bf-4a11-9285-cce1b1df596d","Type":"ContainerStarted","Data":"90d22ccd4e5a9094d88da78d30c21cdeaf57b41395213506f94c01640e64917c"}
Apr 23 17:54:44.034372 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:44.034379 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2t94g" event={"ID":"47475636-63bf-4a11-9285-cce1b1df596d","Type":"ContainerStarted","Data":"5b3ecb0d021c0b852edc0b2e56e95c6d189ccd73d5ca8375f6f229c197c49432"}
Apr 23 17:54:44.050011 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:44.049956 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-2t94g" podStartSLOduration=1.049944325 podStartE2EDuration="1.049944325s" podCreationTimestamp="2026-04-23 17:54:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:54:44.049698835 +0000 UTC m=+80.872166572" watchObservedRunningTime="2026-04-23 17:54:44.049944325 +0000 UTC m=+80.872412060"
Apr 23 17:54:44.236840 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:44.236751 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5396980-27a8-413a-b739-f40a60724e10-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fns5j\" (UID: \"d5396980-27a8-413a-b739-f40a60724e10\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j"
Apr 23 17:54:44.236982 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:44.236893 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 17:54:44.236982 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:44.236956 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5396980-27a8-413a-b739-f40a60724e10-samples-operator-tls podName:d5396980-27a8-413a-b739-f40a60724e10 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:45.23694023 +0000 UTC m=+82.059407943 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d5396980-27a8-413a-b739-f40a60724e10-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fns5j" (UID: "d5396980-27a8-413a-b739-f40a60724e10") : secret "samples-operator-tls" not found
Apr 23 17:54:45.245169 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:45.245135 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5396980-27a8-413a-b739-f40a60724e10-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fns5j\" (UID: \"d5396980-27a8-413a-b739-f40a60724e10\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j"
Apr 23 17:54:45.245514 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:45.245276 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 17:54:45.245514 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:45.245337 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5396980-27a8-413a-b739-f40a60724e10-samples-operator-tls podName:d5396980-27a8-413a-b739-f40a60724e10 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:47.245322529 +0000 UTC m=+84.067790247 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d5396980-27a8-413a-b739-f40a60724e10-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fns5j" (UID: "d5396980-27a8-413a-b739-f40a60724e10") : secret "samples-operator-tls" not found Apr 23 17:54:46.042178 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:46.042141 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5f6sp" event={"ID":"8d77429e-b156-4f21-8e2d-958b0183cfe9","Type":"ContainerStarted","Data":"fafeec07e9eae8e13b24e0fc7287fcd2846d6ba62cde0d50e170846ea7bb490c"} Apr 23 17:54:46.059918 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:46.059871 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5f6sp" podStartSLOduration=2.019808889 podStartE2EDuration="3.059858323s" podCreationTimestamp="2026-04-23 17:54:43 +0000 UTC" firstStartedPulling="2026-04-23 17:54:43.96294395 +0000 UTC m=+80.785411664" lastFinishedPulling="2026-04-23 17:54:45.002993381 +0000 UTC m=+81.825461098" observedRunningTime="2026-04-23 17:54:46.059015089 +0000 UTC m=+82.881482824" watchObservedRunningTime="2026-04-23 17:54:46.059858323 +0000 UTC m=+82.882326058" Apr 23 17:54:47.259321 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:47.259284 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5396980-27a8-413a-b739-f40a60724e10-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fns5j\" (UID: \"d5396980-27a8-413a-b739-f40a60724e10\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j" Apr 23 17:54:47.259704 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:47.259420 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: 
secret "samples-operator-tls" not found Apr 23 17:54:47.259704 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:47.259481 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5396980-27a8-413a-b739-f40a60724e10-samples-operator-tls podName:d5396980-27a8-413a-b739-f40a60724e10 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:51.259466599 +0000 UTC m=+88.081934312 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d5396980-27a8-413a-b739-f40a60724e10-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fns5j" (UID: "d5396980-27a8-413a-b739-f40a60724e10") : secret "samples-operator-tls" not found Apr 23 17:54:49.753731 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:49.753702 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mn726_890e39b8-16d9-4ffa-9934-ca657c99daf2/dns-node-resolver/0.log" Apr 23 17:54:50.055725 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:50.055695 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-n6mwn"] Apr 23 17:54:50.058913 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:50.058897 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-n6mwn" Apr 23 17:54:50.061526 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:50.061497 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 23 17:54:50.061526 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:50.061516 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-mln2t\"" Apr 23 17:54:50.061710 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:50.061510 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 23 17:54:50.066949 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:50.066927 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-n6mwn"] Apr 23 17:54:50.181378 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:50.181349 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/904ab1d8-f170-427c-b547-546b37cd8388-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-n6mwn\" (UID: \"904ab1d8-f170-427c-b547-546b37cd8388\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-n6mwn" Apr 23 17:54:50.181521 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:50.181384 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/904ab1d8-f170-427c-b547-546b37cd8388-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-n6mwn\" (UID: \"904ab1d8-f170-427c-b547-546b37cd8388\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-n6mwn" Apr 23 17:54:50.282211 ip-10-0-142-63 kubenswrapper[2578]: I0423 
17:54:50.282174 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/904ab1d8-f170-427c-b547-546b37cd8388-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-n6mwn\" (UID: \"904ab1d8-f170-427c-b547-546b37cd8388\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-n6mwn" Apr 23 17:54:50.282211 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:50.282217 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/904ab1d8-f170-427c-b547-546b37cd8388-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-n6mwn\" (UID: \"904ab1d8-f170-427c-b547-546b37cd8388\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-n6mwn" Apr 23 17:54:50.282439 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:50.282331 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 17:54:50.282439 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:50.282380 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/904ab1d8-f170-427c-b547-546b37cd8388-networking-console-plugin-cert podName:904ab1d8-f170-427c-b547-546b37cd8388 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:50.782364896 +0000 UTC m=+87.604832609 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/904ab1d8-f170-427c-b547-546b37cd8388-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-n6mwn" (UID: "904ab1d8-f170-427c-b547-546b37cd8388") : secret "networking-console-plugin-cert" not found Apr 23 17:54:50.282769 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:50.282749 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/904ab1d8-f170-427c-b547-546b37cd8388-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-n6mwn\" (UID: \"904ab1d8-f170-427c-b547-546b37cd8388\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-n6mwn" Apr 23 17:54:50.543911 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:50.543885 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qjrbv_308874bf-36fb-4296-aa6f-8568677e83c4/node-ca/0.log" Apr 23 17:54:50.786300 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:50.786260 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/904ab1d8-f170-427c-b547-546b37cd8388-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-n6mwn\" (UID: \"904ab1d8-f170-427c-b547-546b37cd8388\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-n6mwn" Apr 23 17:54:50.786679 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:50.786415 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 17:54:50.786679 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:50.786508 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/904ab1d8-f170-427c-b547-546b37cd8388-networking-console-plugin-cert podName:904ab1d8-f170-427c-b547-546b37cd8388 nodeName:}" 
failed. No retries permitted until 2026-04-23 17:54:51.786491283 +0000 UTC m=+88.608958996 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/904ab1d8-f170-427c-b547-546b37cd8388-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-n6mwn" (UID: "904ab1d8-f170-427c-b547-546b37cd8388") : secret "networking-console-plugin-cert" not found Apr 23 17:54:51.290432 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:51.290381 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5396980-27a8-413a-b739-f40a60724e10-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fns5j\" (UID: \"d5396980-27a8-413a-b739-f40a60724e10\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j" Apr 23 17:54:51.290613 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:51.290531 2578 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 17:54:51.290613 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:51.290603 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d5396980-27a8-413a-b739-f40a60724e10-samples-operator-tls podName:d5396980-27a8-413a-b739-f40a60724e10 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:59.290584655 +0000 UTC m=+96.113052368 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/d5396980-27a8-413a-b739-f40a60724e10-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-fns5j" (UID: "d5396980-27a8-413a-b739-f40a60724e10") : secret "samples-operator-tls" not found Apr 23 17:54:51.795212 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:51.795176 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/904ab1d8-f170-427c-b547-546b37cd8388-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-n6mwn\" (UID: \"904ab1d8-f170-427c-b547-546b37cd8388\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-n6mwn" Apr 23 17:54:51.795567 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:51.795315 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 17:54:51.795567 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:51.795381 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/904ab1d8-f170-427c-b547-546b37cd8388-networking-console-plugin-cert podName:904ab1d8-f170-427c-b547-546b37cd8388 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:53.795362915 +0000 UTC m=+90.617830637 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/904ab1d8-f170-427c-b547-546b37cd8388-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-n6mwn" (UID: "904ab1d8-f170-427c-b547-546b37cd8388") : secret "networking-console-plugin-cert" not found Apr 23 17:54:51.874396 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:51.874363 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xwfkk"] Apr 23 17:54:51.877538 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:51.877522 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-xwfkk" Apr 23 17:54:51.880196 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:51.880171 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 23 17:54:51.880329 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:51.880281 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 23 17:54:51.880437 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:51.880412 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 23 17:54:51.882035 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:51.881787 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 23 17:54:51.882035 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:51.881912 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-dsvqx\"" Apr 23 17:54:51.887549 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:51.887529 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xwfkk"] Apr 23 17:54:51.997130 
ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:51.997080 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ef4c6714-360a-4a5c-ae8c-de076749025d-signing-key\") pod \"service-ca-865cb79987-xwfkk\" (UID: \"ef4c6714-360a-4a5c-ae8c-de076749025d\") " pod="openshift-service-ca/service-ca-865cb79987-xwfkk" Apr 23 17:54:51.997258 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:51.997135 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ef4c6714-360a-4a5c-ae8c-de076749025d-signing-cabundle\") pod \"service-ca-865cb79987-xwfkk\" (UID: \"ef4c6714-360a-4a5c-ae8c-de076749025d\") " pod="openshift-service-ca/service-ca-865cb79987-xwfkk" Apr 23 17:54:51.997258 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:51.997228 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx2jg\" (UniqueName: \"kubernetes.io/projected/ef4c6714-360a-4a5c-ae8c-de076749025d-kube-api-access-dx2jg\") pod \"service-ca-865cb79987-xwfkk\" (UID: \"ef4c6714-360a-4a5c-ae8c-de076749025d\") " pod="openshift-service-ca/service-ca-865cb79987-xwfkk" Apr 23 17:54:52.098622 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:52.098533 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dx2jg\" (UniqueName: \"kubernetes.io/projected/ef4c6714-360a-4a5c-ae8c-de076749025d-kube-api-access-dx2jg\") pod \"service-ca-865cb79987-xwfkk\" (UID: \"ef4c6714-360a-4a5c-ae8c-de076749025d\") " pod="openshift-service-ca/service-ca-865cb79987-xwfkk" Apr 23 17:54:52.098756 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:52.098630 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ef4c6714-360a-4a5c-ae8c-de076749025d-signing-key\") pod 
\"service-ca-865cb79987-xwfkk\" (UID: \"ef4c6714-360a-4a5c-ae8c-de076749025d\") " pod="openshift-service-ca/service-ca-865cb79987-xwfkk" Apr 23 17:54:52.098756 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:52.098649 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ef4c6714-360a-4a5c-ae8c-de076749025d-signing-cabundle\") pod \"service-ca-865cb79987-xwfkk\" (UID: \"ef4c6714-360a-4a5c-ae8c-de076749025d\") " pod="openshift-service-ca/service-ca-865cb79987-xwfkk" Apr 23 17:54:52.099284 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:52.099265 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ef4c6714-360a-4a5c-ae8c-de076749025d-signing-cabundle\") pod \"service-ca-865cb79987-xwfkk\" (UID: \"ef4c6714-360a-4a5c-ae8c-de076749025d\") " pod="openshift-service-ca/service-ca-865cb79987-xwfkk" Apr 23 17:54:52.101127 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:52.101105 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ef4c6714-360a-4a5c-ae8c-de076749025d-signing-key\") pod \"service-ca-865cb79987-xwfkk\" (UID: \"ef4c6714-360a-4a5c-ae8c-de076749025d\") " pod="openshift-service-ca/service-ca-865cb79987-xwfkk" Apr 23 17:54:52.107978 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:52.107947 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx2jg\" (UniqueName: \"kubernetes.io/projected/ef4c6714-360a-4a5c-ae8c-de076749025d-kube-api-access-dx2jg\") pod \"service-ca-865cb79987-xwfkk\" (UID: \"ef4c6714-360a-4a5c-ae8c-de076749025d\") " pod="openshift-service-ca/service-ca-865cb79987-xwfkk" Apr 23 17:54:52.188348 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:52.188302 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-xwfkk" Apr 23 17:54:52.305253 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:52.305221 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-xwfkk"] Apr 23 17:54:52.307877 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:54:52.307843 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef4c6714_360a_4a5c_ae8c_de076749025d.slice/crio-8014935c2f98033de22653e4b00b3a8cc09b066803350a095979577e653bc2d3 WatchSource:0}: Error finding container 8014935c2f98033de22653e4b00b3a8cc09b066803350a095979577e653bc2d3: Status 404 returned error can't find the container with id 8014935c2f98033de22653e4b00b3a8cc09b066803350a095979577e653bc2d3 Apr 23 17:54:53.063102 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:53.063050 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-xwfkk" event={"ID":"ef4c6714-360a-4a5c-ae8c-de076749025d","Type":"ContainerStarted","Data":"8014935c2f98033de22653e4b00b3a8cc09b066803350a095979577e653bc2d3"} Apr 23 17:54:53.811267 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:53.811235 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/904ab1d8-f170-427c-b547-546b37cd8388-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-n6mwn\" (UID: \"904ab1d8-f170-427c-b547-546b37cd8388\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-n6mwn" Apr 23 17:54:53.811454 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:53.811367 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 17:54:53.811454 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:53.811444 2578 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/secret/904ab1d8-f170-427c-b547-546b37cd8388-networking-console-plugin-cert podName:904ab1d8-f170-427c-b547-546b37cd8388 nodeName:}" failed. No retries permitted until 2026-04-23 17:54:57.811424738 +0000 UTC m=+94.633892465 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/904ab1d8-f170-427c-b547-546b37cd8388-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-n6mwn" (UID: "904ab1d8-f170-427c-b547-546b37cd8388") : secret "networking-console-plugin-cert" not found Apr 23 17:54:55.070279 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:55.070241 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-xwfkk" event={"ID":"ef4c6714-360a-4a5c-ae8c-de076749025d","Type":"ContainerStarted","Data":"d36e053084f00179a8d94469a3de2bd76882c470ac9c9c4dca56bced9f78c739"} Apr 23 17:54:55.087360 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:55.087315 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-xwfkk" podStartSLOduration=2.380398594 podStartE2EDuration="4.087301349s" podCreationTimestamp="2026-04-23 17:54:51 +0000 UTC" firstStartedPulling="2026-04-23 17:54:52.30975963 +0000 UTC m=+89.132227342" lastFinishedPulling="2026-04-23 17:54:54.016662384 +0000 UTC m=+90.839130097" observedRunningTime="2026-04-23 17:54:55.087073116 +0000 UTC m=+91.909540854" watchObservedRunningTime="2026-04-23 17:54:55.087301349 +0000 UTC m=+91.909769084" Apr 23 17:54:57.846162 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:57.846121 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/904ab1d8-f170-427c-b547-546b37cd8388-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-n6mwn\" (UID: 
\"904ab1d8-f170-427c-b547-546b37cd8388\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-n6mwn" Apr 23 17:54:57.846545 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:57.846260 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 17:54:57.846545 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:54:57.846326 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/904ab1d8-f170-427c-b547-546b37cd8388-networking-console-plugin-cert podName:904ab1d8-f170-427c-b547-546b37cd8388 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:05.846308517 +0000 UTC m=+102.668776241 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/904ab1d8-f170-427c-b547-546b37cd8388-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-n6mwn" (UID: "904ab1d8-f170-427c-b547-546b37cd8388") : secret "networking-console-plugin-cert" not found Apr 23 17:54:59.358357 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:59.358316 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5396980-27a8-413a-b739-f40a60724e10-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fns5j\" (UID: \"d5396980-27a8-413a-b739-f40a60724e10\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j" Apr 23 17:54:59.361251 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:59.361226 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/d5396980-27a8-413a-b739-f40a60724e10-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-fns5j\" (UID: \"d5396980-27a8-413a-b739-f40a60724e10\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j" Apr 23 17:54:59.437505 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:59.437477 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j" Apr 23 17:54:59.554789 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:54:59.554651 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j"] Apr 23 17:55:00.082816 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:00.082782 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j" event={"ID":"d5396980-27a8-413a-b739-f40a60724e10","Type":"ContainerStarted","Data":"cd1c80b08bb0338e82b367854975edfa4eb67979cbed7bc9283216022f845259"} Apr 23 17:55:00.568242 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:00.568207 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert\") pod \"ingress-canary-j7wvx\" (UID: \"f33cc58f-96ef-4c06-8b13-ae89d3b2c805\") " pod="openshift-ingress-canary/ingress-canary-j7wvx" Apr 23 17:55:00.568685 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:00.568253 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:55:00.568685 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:00.568294 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls\") pod 
\"dns-default-7t847\" (UID: \"17f79bf8-0319-418c-a852-2ce9a897e648\") " pod="openshift-dns/dns-default-7t847" Apr 23 17:55:00.571246 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:00.571221 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls\") pod \"image-registry-ddf697474-pclcg\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:55:00.571355 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:00.571271 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/17f79bf8-0319-418c-a852-2ce9a897e648-metrics-tls\") pod \"dns-default-7t847\" (UID: \"17f79bf8-0319-418c-a852-2ce9a897e648\") " pod="openshift-dns/dns-default-7t847" Apr 23 17:55:00.571461 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:00.571438 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f33cc58f-96ef-4c06-8b13-ae89d3b2c805-cert\") pod \"ingress-canary-j7wvx\" (UID: \"f33cc58f-96ef-4c06-8b13-ae89d3b2c805\") " pod="openshift-ingress-canary/ingress-canary-j7wvx" Apr 23 17:55:00.842234 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:00.842202 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-ccrfl\"" Apr 23 17:55:00.845181 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:00.845155 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:55:00.866030 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:00.865981 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nr8tq\"" Apr 23 17:55:00.872380 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:00.872343 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7t847" Apr 23 17:55:00.872793 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:00.872773 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jg9fn\"" Apr 23 17:55:00.880514 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:00.880492 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-j7wvx" Apr 23 17:55:01.120389 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:01.120362 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7t847"] Apr 23 17:55:01.129239 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:55:01.129211 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17f79bf8_0319_418c_a852_2ce9a897e648.slice/crio-541e494171e3d288d9c00f24b13c16393d0568309c0faf78cac8ec57165a9599 WatchSource:0}: Error finding container 541e494171e3d288d9c00f24b13c16393d0568309c0faf78cac8ec57165a9599: Status 404 returned error can't find the container with id 541e494171e3d288d9c00f24b13c16393d0568309c0faf78cac8ec57165a9599 Apr 23 17:55:01.338796 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:01.338146 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-j7wvx"] Apr 23 17:55:01.340146 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:01.340121 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-ddf697474-pclcg"] Apr 23 17:55:01.341325 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:55:01.341298 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf33cc58f_96ef_4c06_8b13_ae89d3b2c805.slice/crio-71ddfd120f18a6b3525f77a9569875b96eb3249e2c1fd68565cd765b32e65793 WatchSource:0}: Error finding container 71ddfd120f18a6b3525f77a9569875b96eb3249e2c1fd68565cd765b32e65793: Status 404 returned error can't find the container with id 71ddfd120f18a6b3525f77a9569875b96eb3249e2c1fd68565cd765b32e65793 Apr 23 17:55:01.343434 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:55:01.343401 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa5eda1b_c6c9_44cc_9e1a_b67e0654acc4.slice/crio-88f57c32eac799b71429f0c30ba12781e4640b0de690555f74bbe2394d16258f WatchSource:0}: Error finding container 88f57c32eac799b71429f0c30ba12781e4640b0de690555f74bbe2394d16258f: Status 404 returned error can't find the container with id 88f57c32eac799b71429f0c30ba12781e4640b0de690555f74bbe2394d16258f Apr 23 17:55:02.089634 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:02.089593 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-ddf697474-pclcg" event={"ID":"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4","Type":"ContainerStarted","Data":"951f88fb1ba2498d466281e7289b43faa08f790168f9ed6747510c48b135248f"} Apr 23 17:55:02.090108 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:02.089641 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-ddf697474-pclcg" event={"ID":"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4","Type":"ContainerStarted","Data":"88f57c32eac799b71429f0c30ba12781e4640b0de690555f74bbe2394d16258f"} Apr 23 17:55:02.090108 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:02.089738 2578 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:55:02.090936 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:02.090905 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-j7wvx" event={"ID":"f33cc58f-96ef-4c06-8b13-ae89d3b2c805","Type":"ContainerStarted","Data":"71ddfd120f18a6b3525f77a9569875b96eb3249e2c1fd68565cd765b32e65793"} Apr 23 17:55:02.092006 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:02.091975 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7t847" event={"ID":"17f79bf8-0319-418c-a852-2ce9a897e648","Type":"ContainerStarted","Data":"541e494171e3d288d9c00f24b13c16393d0568309c0faf78cac8ec57165a9599"} Apr 23 17:55:02.093606 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:02.093579 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j" event={"ID":"d5396980-27a8-413a-b739-f40a60724e10","Type":"ContainerStarted","Data":"471b2f2a2e40a025ed7449f4a5757d24efddb2b2b3dc87601d2e01aab6cdb621"} Apr 23 17:55:02.093606 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:02.093606 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j" event={"ID":"d5396980-27a8-413a-b739-f40a60724e10","Type":"ContainerStarted","Data":"4da00af3d7e0f3a5e566926f75a16169b553575f2bfe2460d457f36e01fb81c6"} Apr 23 17:55:02.122633 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:02.122580 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-ddf697474-pclcg" podStartSLOduration=98.122562809 podStartE2EDuration="1m38.122562809s" podCreationTimestamp="2026-04-23 17:53:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 
17:55:02.122134276 +0000 UTC m=+98.944602010" watchObservedRunningTime="2026-04-23 17:55:02.122562809 +0000 UTC m=+98.945030546" Apr 23 17:55:03.098046 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:03.098008 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7t847" event={"ID":"17f79bf8-0319-418c-a852-2ce9a897e648","Type":"ContainerStarted","Data":"1d76077d12188ee5d8d269ca6b4c376f776a96d943243930d8a37a1d829e2017"} Apr 23 17:55:03.098046 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:03.098051 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7t847" event={"ID":"17f79bf8-0319-418c-a852-2ce9a897e648","Type":"ContainerStarted","Data":"e1507e33e0b226f37f73134d351a71744d9fe8e4e72293ff33893ad6c3be7ad2"} Apr 23 17:55:03.120469 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:03.120422 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7t847" podStartSLOduration=65.832062375 podStartE2EDuration="1m7.120405616s" podCreationTimestamp="2026-04-23 17:53:56 +0000 UTC" firstStartedPulling="2026-04-23 17:55:01.13139623 +0000 UTC m=+97.953863957" lastFinishedPulling="2026-04-23 17:55:02.419739471 +0000 UTC m=+99.242207198" observedRunningTime="2026-04-23 17:55:03.119213346 +0000 UTC m=+99.941681080" watchObservedRunningTime="2026-04-23 17:55:03.120405616 +0000 UTC m=+99.942873351" Apr 23 17:55:03.120771 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:03.120751 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-fns5j" podStartSLOduration=18.713641374 podStartE2EDuration="20.120745594s" podCreationTimestamp="2026-04-23 17:54:43 +0000 UTC" firstStartedPulling="2026-04-23 17:54:59.612352316 +0000 UTC m=+96.434820029" lastFinishedPulling="2026-04-23 17:55:01.019456523 +0000 UTC m=+97.841924249" observedRunningTime="2026-04-23 17:55:02.150305067 +0000 UTC 
m=+98.972772802" watchObservedRunningTime="2026-04-23 17:55:03.120745594 +0000 UTC m=+99.943213307" Apr 23 17:55:04.101737 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:04.101696 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-j7wvx" event={"ID":"f33cc58f-96ef-4c06-8b13-ae89d3b2c805","Type":"ContainerStarted","Data":"4b19a0d12af5c6afe1cb4d40a3acb27787a9aaf766b1536ede144e960bf3f1a8"} Apr 23 17:55:04.102205 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:04.101876 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-7t847" Apr 23 17:55:05.909800 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:05.909757 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/904ab1d8-f170-427c-b547-546b37cd8388-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-n6mwn\" (UID: \"904ab1d8-f170-427c-b547-546b37cd8388\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-n6mwn" Apr 23 17:55:05.912282 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:05.912237 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/904ab1d8-f170-427c-b547-546b37cd8388-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-n6mwn\" (UID: \"904ab1d8-f170-427c-b547-546b37cd8388\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-n6mwn" Apr 23 17:55:05.968272 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:05.968242 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-n6mwn" Apr 23 17:55:06.117028 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:06.116825 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-j7wvx" podStartSLOduration=67.840672533 podStartE2EDuration="1m10.116804828s" podCreationTimestamp="2026-04-23 17:53:56 +0000 UTC" firstStartedPulling="2026-04-23 17:55:01.343358291 +0000 UTC m=+98.165826008" lastFinishedPulling="2026-04-23 17:55:03.61949059 +0000 UTC m=+100.441958303" observedRunningTime="2026-04-23 17:55:04.123344407 +0000 UTC m=+100.945812134" watchObservedRunningTime="2026-04-23 17:55:06.116804828 +0000 UTC m=+102.939272563" Apr 23 17:55:06.117112 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:06.117058 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-n6mwn"] Apr 23 17:55:06.119391 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:55:06.119364 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod904ab1d8_f170_427c_b547_546b37cd8388.slice/crio-35a6114ac86b9bf6a53f773b69ab82743a0dfe6577038452e8f2452f9b819fbb WatchSource:0}: Error finding container 35a6114ac86b9bf6a53f773b69ab82743a0dfe6577038452e8f2452f9b819fbb: Status 404 returned error can't find the container with id 35a6114ac86b9bf6a53f773b69ab82743a0dfe6577038452e8f2452f9b819fbb Apr 23 17:55:07.110203 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:07.110162 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-n6mwn" event={"ID":"904ab1d8-f170-427c-b547-546b37cd8388","Type":"ContainerStarted","Data":"0a4f6d30929ff270d336ccc1ca855f713af4038a88ad385943b59b79cf26e597"} Apr 23 17:55:07.110515 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:07.110205 2578 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-n6mwn" event={"ID":"904ab1d8-f170-427c-b547-546b37cd8388","Type":"ContainerStarted","Data":"35a6114ac86b9bf6a53f773b69ab82743a0dfe6577038452e8f2452f9b819fbb"} Apr 23 17:55:07.128666 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:07.128620 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-n6mwn" podStartSLOduration=16.266280658 podStartE2EDuration="17.12860794s" podCreationTimestamp="2026-04-23 17:54:50 +0000 UTC" firstStartedPulling="2026-04-23 17:55:06.121226838 +0000 UTC m=+102.943694554" lastFinishedPulling="2026-04-23 17:55:06.983554113 +0000 UTC m=+103.806021836" observedRunningTime="2026-04-23 17:55:07.126834691 +0000 UTC m=+103.949302426" watchObservedRunningTime="2026-04-23 17:55:07.12860794 +0000 UTC m=+103.951075675" Apr 23 17:55:14.106472 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:14.106437 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7t847" Apr 23 17:55:15.812437 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:15.812405 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-kcw7j"] Apr 23 17:55:15.817683 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:15.817657 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kcw7j" Apr 23 17:55:15.824196 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:15.824161 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 17:55:15.825163 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:15.825145 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-2kkk6\"" Apr 23 17:55:15.830196 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:15.830178 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 17:55:15.831179 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:15.831161 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 17:55:15.831745 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:15.831732 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 17:55:15.837321 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:15.837296 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kcw7j"] Apr 23 17:55:15.980369 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:15.980341 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e1cea0c8-f980-48ab-9c67-01d902f521d7-crio-socket\") pod \"insights-runtime-extractor-kcw7j\" (UID: \"e1cea0c8-f980-48ab-9c67-01d902f521d7\") " pod="openshift-insights/insights-runtime-extractor-kcw7j" Apr 23 17:55:15.980552 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:15.980374 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-nqnv7\" (UniqueName: \"kubernetes.io/projected/e1cea0c8-f980-48ab-9c67-01d902f521d7-kube-api-access-nqnv7\") pod \"insights-runtime-extractor-kcw7j\" (UID: \"e1cea0c8-f980-48ab-9c67-01d902f521d7\") " pod="openshift-insights/insights-runtime-extractor-kcw7j" Apr 23 17:55:15.980552 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:15.980399 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e1cea0c8-f980-48ab-9c67-01d902f521d7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kcw7j\" (UID: \"e1cea0c8-f980-48ab-9c67-01d902f521d7\") " pod="openshift-insights/insights-runtime-extractor-kcw7j" Apr 23 17:55:15.980552 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:15.980464 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e1cea0c8-f980-48ab-9c67-01d902f521d7-data-volume\") pod \"insights-runtime-extractor-kcw7j\" (UID: \"e1cea0c8-f980-48ab-9c67-01d902f521d7\") " pod="openshift-insights/insights-runtime-extractor-kcw7j" Apr 23 17:55:15.980552 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:15.980516 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e1cea0c8-f980-48ab-9c67-01d902f521d7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kcw7j\" (UID: \"e1cea0c8-f980-48ab-9c67-01d902f521d7\") " pod="openshift-insights/insights-runtime-extractor-kcw7j" Apr 23 17:55:16.081594 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:16.081512 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e1cea0c8-f980-48ab-9c67-01d902f521d7-crio-socket\") pod \"insights-runtime-extractor-kcw7j\" (UID: \"e1cea0c8-f980-48ab-9c67-01d902f521d7\") " 
pod="openshift-insights/insights-runtime-extractor-kcw7j" Apr 23 17:55:16.081594 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:16.081548 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqnv7\" (UniqueName: \"kubernetes.io/projected/e1cea0c8-f980-48ab-9c67-01d902f521d7-kube-api-access-nqnv7\") pod \"insights-runtime-extractor-kcw7j\" (UID: \"e1cea0c8-f980-48ab-9c67-01d902f521d7\") " pod="openshift-insights/insights-runtime-extractor-kcw7j" Apr 23 17:55:16.081594 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:16.081567 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e1cea0c8-f980-48ab-9c67-01d902f521d7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kcw7j\" (UID: \"e1cea0c8-f980-48ab-9c67-01d902f521d7\") " pod="openshift-insights/insights-runtime-extractor-kcw7j" Apr 23 17:55:16.081836 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:16.081599 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e1cea0c8-f980-48ab-9c67-01d902f521d7-data-volume\") pod \"insights-runtime-extractor-kcw7j\" (UID: \"e1cea0c8-f980-48ab-9c67-01d902f521d7\") " pod="openshift-insights/insights-runtime-extractor-kcw7j" Apr 23 17:55:16.081836 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:16.081638 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e1cea0c8-f980-48ab-9c67-01d902f521d7-crio-socket\") pod \"insights-runtime-extractor-kcw7j\" (UID: \"e1cea0c8-f980-48ab-9c67-01d902f521d7\") " pod="openshift-insights/insights-runtime-extractor-kcw7j" Apr 23 17:55:16.081836 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:16.081649 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/e1cea0c8-f980-48ab-9c67-01d902f521d7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kcw7j\" (UID: \"e1cea0c8-f980-48ab-9c67-01d902f521d7\") " pod="openshift-insights/insights-runtime-extractor-kcw7j" Apr 23 17:55:16.081963 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:16.081942 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e1cea0c8-f980-48ab-9c67-01d902f521d7-data-volume\") pod \"insights-runtime-extractor-kcw7j\" (UID: \"e1cea0c8-f980-48ab-9c67-01d902f521d7\") " pod="openshift-insights/insights-runtime-extractor-kcw7j" Apr 23 17:55:16.082227 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:16.082207 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e1cea0c8-f980-48ab-9c67-01d902f521d7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-kcw7j\" (UID: \"e1cea0c8-f980-48ab-9c67-01d902f521d7\") " pod="openshift-insights/insights-runtime-extractor-kcw7j" Apr 23 17:55:16.083842 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:16.083809 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e1cea0c8-f980-48ab-9c67-01d902f521d7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-kcw7j\" (UID: \"e1cea0c8-f980-48ab-9c67-01d902f521d7\") " pod="openshift-insights/insights-runtime-extractor-kcw7j" Apr 23 17:55:16.102715 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:16.102689 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqnv7\" (UniqueName: \"kubernetes.io/projected/e1cea0c8-f980-48ab-9c67-01d902f521d7-kube-api-access-nqnv7\") pod \"insights-runtime-extractor-kcw7j\" (UID: \"e1cea0c8-f980-48ab-9c67-01d902f521d7\") " pod="openshift-insights/insights-runtime-extractor-kcw7j" Apr 23 17:55:16.128165 ip-10-0-142-63 
kubenswrapper[2578]: I0423 17:55:16.128145 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-kcw7j" Apr 23 17:55:16.246003 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:16.245980 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-kcw7j"] Apr 23 17:55:16.248337 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:55:16.248304 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1cea0c8_f980_48ab_9c67_01d902f521d7.slice/crio-f517b03e98810ce84210b5e2e21f0a339f97cc53fccc67c0d159aa9b9bf3432c WatchSource:0}: Error finding container f517b03e98810ce84210b5e2e21f0a339f97cc53fccc67c0d159aa9b9bf3432c: Status 404 returned error can't find the container with id f517b03e98810ce84210b5e2e21f0a339f97cc53fccc67c0d159aa9b9bf3432c Apr 23 17:55:17.137589 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:17.137504 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kcw7j" event={"ID":"e1cea0c8-f980-48ab-9c67-01d902f521d7","Type":"ContainerStarted","Data":"b94c76915a07db4e27cecf7738c2beef8350ee0556a86b7e9f6edea10022258d"} Apr 23 17:55:17.137589 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:17.137540 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kcw7j" event={"ID":"e1cea0c8-f980-48ab-9c67-01d902f521d7","Type":"ContainerStarted","Data":"155e0c6f464d3e97812b0bd92620bb715414a5ba636972f926d2bf30488755a9"} Apr 23 17:55:17.137589 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:17.137549 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kcw7j" event={"ID":"e1cea0c8-f980-48ab-9c67-01d902f521d7","Type":"ContainerStarted","Data":"f517b03e98810ce84210b5e2e21f0a339f97cc53fccc67c0d159aa9b9bf3432c"} Apr 23 17:55:18.081195 
ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:18.081160 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxgc8"] Apr 23 17:55:18.085477 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:18.085457 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxgc8" Apr 23 17:55:18.088423 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:18.088397 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-54dbs\"" Apr 23 17:55:18.088786 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:18.088768 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 23 17:55:18.105738 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:18.105709 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxgc8"] Apr 23 17:55:18.199001 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:18.198971 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/25e3c9d3-09db-4fd7-8fb2-232077749fa6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hxgc8\" (UID: \"25e3c9d3-09db-4fd7-8fb2-232077749fa6\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxgc8" Apr 23 17:55:18.299651 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:18.299631 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/25e3c9d3-09db-4fd7-8fb2-232077749fa6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hxgc8\" (UID: 
\"25e3c9d3-09db-4fd7-8fb2-232077749fa6\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxgc8" Apr 23 17:55:18.302101 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:18.302062 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/25e3c9d3-09db-4fd7-8fb2-232077749fa6-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-hxgc8\" (UID: \"25e3c9d3-09db-4fd7-8fb2-232077749fa6\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxgc8" Apr 23 17:55:18.398186 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:18.398161 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxgc8" Apr 23 17:55:18.518045 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:18.518019 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxgc8"] Apr 23 17:55:18.520540 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:55:18.520517 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25e3c9d3_09db_4fd7_8fb2_232077749fa6.slice/crio-72661ce40899553e42cdf7250275160c9305ea4c3e41b663ecaa60e0568aa2cc WatchSource:0}: Error finding container 72661ce40899553e42cdf7250275160c9305ea4c3e41b663ecaa60e0568aa2cc: Status 404 returned error can't find the container with id 72661ce40899553e42cdf7250275160c9305ea4c3e41b663ecaa60e0568aa2cc Apr 23 17:55:19.145869 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:19.145828 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-kcw7j" event={"ID":"e1cea0c8-f980-48ab-9c67-01d902f521d7","Type":"ContainerStarted","Data":"8de3c69ffd575525ae0e6d23d0e1bb0373cefb4a777b725a8860e2f5a79b5366"} Apr 23 17:55:19.147037 ip-10-0-142-63 
kubenswrapper[2578]: I0423 17:55:19.147007 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxgc8" event={"ID":"25e3c9d3-09db-4fd7-8fb2-232077749fa6","Type":"ContainerStarted","Data":"72661ce40899553e42cdf7250275160c9305ea4c3e41b663ecaa60e0568aa2cc"} Apr 23 17:55:19.175777 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:19.175730 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-kcw7j" podStartSLOduration=2.185847233 podStartE2EDuration="4.175715378s" podCreationTimestamp="2026-04-23 17:55:15 +0000 UTC" firstStartedPulling="2026-04-23 17:55:16.306537579 +0000 UTC m=+113.129005295" lastFinishedPulling="2026-04-23 17:55:18.296405724 +0000 UTC m=+115.118873440" observedRunningTime="2026-04-23 17:55:19.175233389 +0000 UTC m=+115.997701138" watchObservedRunningTime="2026-04-23 17:55:19.175715378 +0000 UTC m=+115.998183114" Apr 23 17:55:20.151408 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:20.151365 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxgc8" event={"ID":"25e3c9d3-09db-4fd7-8fb2-232077749fa6","Type":"ContainerStarted","Data":"6cb37426eed2219f63b2f8bfab85a1f084b9509c708bd15036cce13374c65f6f"} Apr 23 17:55:20.174892 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:20.174847 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxgc8" podStartSLOduration=1.250363394 podStartE2EDuration="2.17483456s" podCreationTimestamp="2026-04-23 17:55:18 +0000 UTC" firstStartedPulling="2026-04-23 17:55:18.522853975 +0000 UTC m=+115.345321688" lastFinishedPulling="2026-04-23 17:55:19.447325127 +0000 UTC m=+116.269792854" observedRunningTime="2026-04-23 17:55:20.174173202 +0000 UTC m=+116.996640959" watchObservedRunningTime="2026-04-23 
17:55:20.17483456 +0000 UTC m=+116.997302295" Apr 23 17:55:20.849673 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:20.849630 2578 patch_prober.go:28] interesting pod/image-registry-ddf697474-pclcg container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 17:55:20.849844 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:20.849685 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-ddf697474-pclcg" podUID="fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 17:55:21.154514 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:21.154434 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxgc8" Apr 23 17:55:21.159500 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:21.159476 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-hxgc8" Apr 23 17:55:22.203398 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.203367 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9fdwj"] Apr 23 17:55:22.206710 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.206694 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-9fdwj"
Apr 23 17:55:22.209504 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.209478 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 17:55:22.210679 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.210659 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 23 17:55:22.210803 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.210784 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 23 17:55:22.210870 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.210851 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 23 17:55:22.210929 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.210871 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-5x7l7\""
Apr 23 17:55:22.214922 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.214906 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 17:55:22.221330 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.221311 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9fdwj"]
Apr 23 17:55:22.334578 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.334543 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad280719-5945-4ba9-b574-bb0345e669c9-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9fdwj\" (UID: \"ad280719-5945-4ba9-b574-bb0345e669c9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9fdwj"
Apr 23 17:55:22.334732 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.334583 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9mtn\" (UniqueName: \"kubernetes.io/projected/ad280719-5945-4ba9-b574-bb0345e669c9-kube-api-access-x9mtn\") pod \"prometheus-operator-5676c8c784-9fdwj\" (UID: \"ad280719-5945-4ba9-b574-bb0345e669c9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9fdwj"
Apr 23 17:55:22.334732 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.334611 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ad280719-5945-4ba9-b574-bb0345e669c9-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9fdwj\" (UID: \"ad280719-5945-4ba9-b574-bb0345e669c9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9fdwj"
Apr 23 17:55:22.334732 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.334676 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad280719-5945-4ba9-b574-bb0345e669c9-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9fdwj\" (UID: \"ad280719-5945-4ba9-b574-bb0345e669c9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9fdwj"
Apr 23 17:55:22.435819 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.435780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x9mtn\" (UniqueName: \"kubernetes.io/projected/ad280719-5945-4ba9-b574-bb0345e669c9-kube-api-access-x9mtn\") pod \"prometheus-operator-5676c8c784-9fdwj\" (UID: \"ad280719-5945-4ba9-b574-bb0345e669c9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9fdwj"
Apr 23 17:55:22.435819 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.435821 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ad280719-5945-4ba9-b574-bb0345e669c9-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9fdwj\" (UID: \"ad280719-5945-4ba9-b574-bb0345e669c9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9fdwj"
Apr 23 17:55:22.436054 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.435989 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad280719-5945-4ba9-b574-bb0345e669c9-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9fdwj\" (UID: \"ad280719-5945-4ba9-b574-bb0345e669c9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9fdwj"
Apr 23 17:55:22.436157 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.436125 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad280719-5945-4ba9-b574-bb0345e669c9-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9fdwj\" (UID: \"ad280719-5945-4ba9-b574-bb0345e669c9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9fdwj"
Apr 23 17:55:22.436755 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.436732 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ad280719-5945-4ba9-b574-bb0345e669c9-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9fdwj\" (UID: \"ad280719-5945-4ba9-b574-bb0345e669c9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9fdwj"
Apr 23 17:55:22.438200 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.438178 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ad280719-5945-4ba9-b574-bb0345e669c9-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9fdwj\" (UID: \"ad280719-5945-4ba9-b574-bb0345e669c9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9fdwj"
Apr 23 17:55:22.438293 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.438231 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad280719-5945-4ba9-b574-bb0345e669c9-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9fdwj\" (UID: \"ad280719-5945-4ba9-b574-bb0345e669c9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9fdwj"
Apr 23 17:55:22.447109 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.447078 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9mtn\" (UniqueName: \"kubernetes.io/projected/ad280719-5945-4ba9-b574-bb0345e669c9-kube-api-access-x9mtn\") pod \"prometheus-operator-5676c8c784-9fdwj\" (UID: \"ad280719-5945-4ba9-b574-bb0345e669c9\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9fdwj"
Apr 23 17:55:22.516021 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.515938 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-9fdwj"
Apr 23 17:55:22.642374 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:22.642200 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9fdwj"]
Apr 23 17:55:22.644800 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:55:22.644771 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad280719_5945_4ba9_b574_bb0345e669c9.slice/crio-ca755af7a88319839efc3d8150202644c4f449eec4c0fe61c38a8fcdbc6016f9 WatchSource:0}: Error finding container ca755af7a88319839efc3d8150202644c4f449eec4c0fe61c38a8fcdbc6016f9: Status 404 returned error can't find the container with id ca755af7a88319839efc3d8150202644c4f449eec4c0fe61c38a8fcdbc6016f9
Apr 23 17:55:23.103206 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:23.103174 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-ddf697474-pclcg"
Apr 23 17:55:23.160983 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:23.160943 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9fdwj" event={"ID":"ad280719-5945-4ba9-b574-bb0345e669c9","Type":"ContainerStarted","Data":"ca755af7a88319839efc3d8150202644c4f449eec4c0fe61c38a8fcdbc6016f9"}
Apr 23 17:55:24.165046 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:24.165012 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9fdwj" event={"ID":"ad280719-5945-4ba9-b574-bb0345e669c9","Type":"ContainerStarted","Data":"2d4ee8afad1fb3d12796409327b7d93ef7129787440afe63946532b7bf0d7c20"}
Apr 23 17:55:24.165046 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:24.165052 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9fdwj" event={"ID":"ad280719-5945-4ba9-b574-bb0345e669c9","Type":"ContainerStarted","Data":"6da1ccf5525e4c50e8784e8ca5ecff82f8a8534407df5e9b4f5496033c8d0a3d"}
Apr 23 17:55:26.609173 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.609105 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-9fdwj" podStartSLOduration=3.544126002 podStartE2EDuration="4.609070569s" podCreationTimestamp="2026-04-23 17:55:22 +0000 UTC" firstStartedPulling="2026-04-23 17:55:22.646724814 +0000 UTC m=+119.469192526" lastFinishedPulling="2026-04-23 17:55:23.71166938 +0000 UTC m=+120.534137093" observedRunningTime="2026-04-23 17:55:24.196950589 +0000 UTC m=+121.019418337" watchObservedRunningTime="2026-04-23 17:55:26.609070569 +0000 UTC m=+123.431538306"
Apr 23 17:55:26.609837 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.609811 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-8c8wq"]
Apr 23 17:55:26.614709 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.614688 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh"]
Apr 23 17:55:26.614878 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.614859 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.617595 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.617556 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 17:55:26.617717 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.617597 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-l9gtc\""
Apr 23 17:55:26.617717 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.617599 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 17:55:26.617953 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.617930 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 17:55:26.617953 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.617944 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-99rsf"]
Apr 23 17:55:26.618246 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.618225 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh"
Apr 23 17:55:26.620912 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.620893 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 23 17:55:26.621169 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.621148 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 23 17:55:26.621480 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.621465 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-bmdwk\""
Apr 23 17:55:26.621797 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.621778 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf"
Apr 23 17:55:26.626640 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.626621 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 23 17:55:26.626993 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.626976 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 23 17:55:26.627337 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.627321 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 23 17:55:26.627398 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.627355 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-5lwth\""
Apr 23 17:55:26.632408 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.632391 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-99rsf"]
Apr 23 17:55:26.632977 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.632952 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh"]
Apr 23 17:55:26.670968 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.670942 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/20babaa9-49a8-431c-a51c-fc72be72a2cb-node-exporter-textfile\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.671184 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.670988 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/aefeedb2-a459-4b8f-9510-da8a136c2add-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-99rsf\" (UID: \"aefeedb2-a459-4b8f-9510-da8a136c2add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf"
Apr 23 17:55:26.671184 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.671022 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/20babaa9-49a8-431c-a51c-fc72be72a2cb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.671184 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.671052 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/20babaa9-49a8-431c-a51c-fc72be72a2cb-root\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.671184 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.671096 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/20babaa9-49a8-431c-a51c-fc72be72a2cb-node-exporter-wtmp\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.671184 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.671127 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2z6n\" (UniqueName: \"kubernetes.io/projected/aefeedb2-a459-4b8f-9510-da8a136c2add-kube-api-access-c2z6n\") pod \"kube-state-metrics-69db897b98-99rsf\" (UID: \"aefeedb2-a459-4b8f-9510-da8a136c2add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf"
Apr 23 17:55:26.671184 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.671163 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/aefeedb2-a459-4b8f-9510-da8a136c2add-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-99rsf\" (UID: \"aefeedb2-a459-4b8f-9510-da8a136c2add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf"
Apr 23 17:55:26.671586 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.671205 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3c070709-8b02-40df-a7c8-e5b5d0ad22a6-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-brvjh\" (UID: \"3c070709-8b02-40df-a7c8-e5b5d0ad22a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh"
Apr 23 17:55:26.671586 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.671236 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nwvw\" (UniqueName: \"kubernetes.io/projected/3c070709-8b02-40df-a7c8-e5b5d0ad22a6-kube-api-access-8nwvw\") pod \"openshift-state-metrics-9d44df66c-brvjh\" (UID: \"3c070709-8b02-40df-a7c8-e5b5d0ad22a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh"
Apr 23 17:55:26.671586 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.671267 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aefeedb2-a459-4b8f-9510-da8a136c2add-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-99rsf\" (UID: \"aefeedb2-a459-4b8f-9510-da8a136c2add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf"
Apr 23 17:55:26.671586 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.671313 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20babaa9-49a8-431c-a51c-fc72be72a2cb-metrics-client-ca\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.671586 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.671340 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20babaa9-49a8-431c-a51c-fc72be72a2cb-sys\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.671586 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.671370 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/20babaa9-49a8-431c-a51c-fc72be72a2cb-node-exporter-tls\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.671586 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.671400 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3c070709-8b02-40df-a7c8-e5b5d0ad22a6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-brvjh\" (UID: \"3c070709-8b02-40df-a7c8-e5b5d0ad22a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh"
Apr 23 17:55:26.671586 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.671426 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3c070709-8b02-40df-a7c8-e5b5d0ad22a6-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-brvjh\" (UID: \"3c070709-8b02-40df-a7c8-e5b5d0ad22a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh"
Apr 23 17:55:26.671586 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.671456 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/aefeedb2-a459-4b8f-9510-da8a136c2add-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-99rsf\" (UID: \"aefeedb2-a459-4b8f-9510-da8a136c2add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf"
Apr 23 17:55:26.671586 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.671483 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/20babaa9-49a8-431c-a51c-fc72be72a2cb-node-exporter-accelerators-collector-config\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.671586 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.671577 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aefeedb2-a459-4b8f-9510-da8a136c2add-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-99rsf\" (UID: \"aefeedb2-a459-4b8f-9510-da8a136c2add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf"
Apr 23 17:55:26.672213 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.671641 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v7tw\" (UniqueName: \"kubernetes.io/projected/20babaa9-49a8-431c-a51c-fc72be72a2cb-kube-api-access-4v7tw\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.772646 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.772620 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/20babaa9-49a8-431c-a51c-fc72be72a2cb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.772826 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.772652 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/20babaa9-49a8-431c-a51c-fc72be72a2cb-root\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.772826 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.772672 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/20babaa9-49a8-431c-a51c-fc72be72a2cb-node-exporter-wtmp\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.772826 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.772689 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2z6n\" (UniqueName: \"kubernetes.io/projected/aefeedb2-a459-4b8f-9510-da8a136c2add-kube-api-access-c2z6n\") pod \"kube-state-metrics-69db897b98-99rsf\" (UID: \"aefeedb2-a459-4b8f-9510-da8a136c2add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf"
Apr 23 17:55:26.772826 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.772714 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/aefeedb2-a459-4b8f-9510-da8a136c2add-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-99rsf\" (UID: \"aefeedb2-a459-4b8f-9510-da8a136c2add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf"
Apr 23 17:55:26.772826 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.772761 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/20babaa9-49a8-431c-a51c-fc72be72a2cb-root\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.772826 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.772763 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3c070709-8b02-40df-a7c8-e5b5d0ad22a6-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-brvjh\" (UID: \"3c070709-8b02-40df-a7c8-e5b5d0ad22a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh"
Apr 23 17:55:26.772826 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.772826 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nwvw\" (UniqueName: \"kubernetes.io/projected/3c070709-8b02-40df-a7c8-e5b5d0ad22a6-kube-api-access-8nwvw\") pod \"openshift-state-metrics-9d44df66c-brvjh\" (UID: \"3c070709-8b02-40df-a7c8-e5b5d0ad22a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh"
Apr 23 17:55:26.773304 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.772857 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aefeedb2-a459-4b8f-9510-da8a136c2add-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-99rsf\" (UID: \"aefeedb2-a459-4b8f-9510-da8a136c2add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf"
Apr 23 17:55:26.773304 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.772859 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/20babaa9-49a8-431c-a51c-fc72be72a2cb-node-exporter-wtmp\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.773304 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.772919 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20babaa9-49a8-431c-a51c-fc72be72a2cb-metrics-client-ca\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.773304 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.772947 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20babaa9-49a8-431c-a51c-fc72be72a2cb-sys\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.773304 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.772975 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/20babaa9-49a8-431c-a51c-fc72be72a2cb-node-exporter-tls\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.773304 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.773007 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3c070709-8b02-40df-a7c8-e5b5d0ad22a6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-brvjh\" (UID: \"3c070709-8b02-40df-a7c8-e5b5d0ad22a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh"
Apr 23 17:55:26.773304 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.773035 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3c070709-8b02-40df-a7c8-e5b5d0ad22a6-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-brvjh\" (UID: \"3c070709-8b02-40df-a7c8-e5b5d0ad22a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh"
Apr 23 17:55:26.773304 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.773064 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/aefeedb2-a459-4b8f-9510-da8a136c2add-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-99rsf\" (UID: \"aefeedb2-a459-4b8f-9510-da8a136c2add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf"
Apr 23 17:55:26.773768 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.773393 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20babaa9-49a8-431c-a51c-fc72be72a2cb-sys\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.773768 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.773455 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/aefeedb2-a459-4b8f-9510-da8a136c2add-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-99rsf\" (UID: \"aefeedb2-a459-4b8f-9510-da8a136c2add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf"
Apr 23 17:55:26.773768 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.773518 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3c070709-8b02-40df-a7c8-e5b5d0ad22a6-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-brvjh\" (UID: \"3c070709-8b02-40df-a7c8-e5b5d0ad22a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh"
Apr 23 17:55:26.773768 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:55:26.773561 2578 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 23 17:55:26.773768 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:55:26.773616 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20babaa9-49a8-431c-a51c-fc72be72a2cb-node-exporter-tls podName:20babaa9-49a8-431c-a51c-fc72be72a2cb nodeName:}" failed. No retries permitted until 2026-04-23 17:55:27.273598574 +0000 UTC m=+124.096066288 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/20babaa9-49a8-431c-a51c-fc72be72a2cb-node-exporter-tls") pod "node-exporter-8c8wq" (UID: "20babaa9-49a8-431c-a51c-fc72be72a2cb") : secret "node-exporter-tls" not found
Apr 23 17:55:26.774104 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:55:26.773784 2578 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 23 17:55:26.774104 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:55:26.773829 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c070709-8b02-40df-a7c8-e5b5d0ad22a6-openshift-state-metrics-tls podName:3c070709-8b02-40df-a7c8-e5b5d0ad22a6 nodeName:}" failed. No retries permitted until 2026-04-23 17:55:27.273815405 +0000 UTC m=+124.096283120 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/3c070709-8b02-40df-a7c8-e5b5d0ad22a6-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-brvjh" (UID: "3c070709-8b02-40df-a7c8-e5b5d0ad22a6") : secret "openshift-state-metrics-tls" not found
Apr 23 17:55:26.774104 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.773871 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/aefeedb2-a459-4b8f-9510-da8a136c2add-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-99rsf\" (UID: \"aefeedb2-a459-4b8f-9510-da8a136c2add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf"
Apr 23 17:55:26.774104 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.773937 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/20babaa9-49a8-431c-a51c-fc72be72a2cb-node-exporter-accelerators-collector-config\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.774104 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.773960 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/aefeedb2-a459-4b8f-9510-da8a136c2add-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-99rsf\" (UID: \"aefeedb2-a459-4b8f-9510-da8a136c2add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf"
Apr 23 17:55:26.774104 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.773974 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aefeedb2-a459-4b8f-9510-da8a136c2add-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-99rsf\" (UID: \"aefeedb2-a459-4b8f-9510-da8a136c2add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf"
Apr 23 17:55:26.774104 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.774014 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4v7tw\" (UniqueName: \"kubernetes.io/projected/20babaa9-49a8-431c-a51c-fc72be72a2cb-kube-api-access-4v7tw\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.774104 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.774063 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/20babaa9-49a8-431c-a51c-fc72be72a2cb-node-exporter-textfile\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.774549 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.774134 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/aefeedb2-a459-4b8f-9510-da8a136c2add-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-99rsf\" (UID: \"aefeedb2-a459-4b8f-9510-da8a136c2add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf"
Apr 23 17:55:26.774549 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.774502 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/20babaa9-49a8-431c-a51c-fc72be72a2cb-node-exporter-accelerators-collector-config\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.774765 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.774741 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20babaa9-49a8-431c-a51c-fc72be72a2cb-metrics-client-ca\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.774855 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.774835 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/20babaa9-49a8-431c-a51c-fc72be72a2cb-node-exporter-textfile\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.775963 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.775501 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/20babaa9-49a8-431c-a51c-fc72be72a2cb-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.776648 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.776618 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3c070709-8b02-40df-a7c8-e5b5d0ad22a6-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-brvjh\" (UID: \"3c070709-8b02-40df-a7c8-e5b5d0ad22a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh"
Apr 23 17:55:26.776937 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.776915 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/aefeedb2-a459-4b8f-9510-da8a136c2add-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-99rsf\" (UID: \"aefeedb2-a459-4b8f-9510-da8a136c2add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf"
Apr 23 17:55:26.777423 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.777403 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/aefeedb2-a459-4b8f-9510-da8a136c2add-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-99rsf\" (UID: \"aefeedb2-a459-4b8f-9510-da8a136c2add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf"
Apr 23 17:55:26.784064 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.784024 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2z6n\" (UniqueName: \"kubernetes.io/projected/aefeedb2-a459-4b8f-9510-da8a136c2add-kube-api-access-c2z6n\") pod \"kube-state-metrics-69db897b98-99rsf\" (UID: \"aefeedb2-a459-4b8f-9510-da8a136c2add\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf"
Apr 23 17:55:26.784332 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.784312 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v7tw\" (UniqueName: \"kubernetes.io/projected/20babaa9-49a8-431c-a51c-fc72be72a2cb-kube-api-access-4v7tw\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq"
Apr 23 17:55:26.784820 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.784797 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nwvw\" (UniqueName: \"kubernetes.io/projected/3c070709-8b02-40df-a7c8-e5b5d0ad22a6-kube-api-access-8nwvw\") pod \"openshift-state-metrics-9d44df66c-brvjh\" (UID: \"3c070709-8b02-40df-a7c8-e5b5d0ad22a6\") "
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh" Apr 23 17:55:26.942542 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:26.942466 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf" Apr 23 17:55:27.064114 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.064070 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-99rsf"] Apr 23 17:55:27.066351 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:55:27.066321 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaefeedb2_a459_4b8f_9510_da8a136c2add.slice/crio-d2328557ab8d395fa2c37ee334d154da43807dacf19911e4dc2a8207ab531389 WatchSource:0}: Error finding container d2328557ab8d395fa2c37ee334d154da43807dacf19911e4dc2a8207ab531389: Status 404 returned error can't find the container with id d2328557ab8d395fa2c37ee334d154da43807dacf19911e4dc2a8207ab531389 Apr 23 17:55:27.176971 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.176940 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf" event={"ID":"aefeedb2-a459-4b8f-9510-da8a136c2add","Type":"ContainerStarted","Data":"d2328557ab8d395fa2c37ee334d154da43807dacf19911e4dc2a8207ab531389"} Apr 23 17:55:27.279682 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.279604 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3c070709-8b02-40df-a7c8-e5b5d0ad22a6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-brvjh\" (UID: \"3c070709-8b02-40df-a7c8-e5b5d0ad22a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh" Apr 23 17:55:27.279827 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.279722 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/20babaa9-49a8-431c-a51c-fc72be72a2cb-node-exporter-tls\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq" Apr 23 17:55:27.282001 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.281976 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/20babaa9-49a8-431c-a51c-fc72be72a2cb-node-exporter-tls\") pod \"node-exporter-8c8wq\" (UID: \"20babaa9-49a8-431c-a51c-fc72be72a2cb\") " pod="openshift-monitoring/node-exporter-8c8wq" Apr 23 17:55:27.282194 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.282174 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/3c070709-8b02-40df-a7c8-e5b5d0ad22a6-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-brvjh\" (UID: \"3c070709-8b02-40df-a7c8-e5b5d0ad22a6\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh" Apr 23 17:55:27.527596 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.527504 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-8c8wq" Apr 23 17:55:27.535672 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.535650 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh" Apr 23 17:55:27.538119 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:55:27.538069 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20babaa9_49a8_431c_a51c_fc72be72a2cb.slice/crio-492d46d271cd9a1da17c42193df3ce922c55d6fca637e93ddec52897b1efb060 WatchSource:0}: Error finding container 492d46d271cd9a1da17c42193df3ce922c55d6fca637e93ddec52897b1efb060: Status 404 returned error can't find the container with id 492d46d271cd9a1da17c42193df3ce922c55d6fca637e93ddec52897b1efb060 Apr 23 17:55:27.678033 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.678001 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh"] Apr 23 17:55:27.681461 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:55:27.681431 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c070709_8b02_40df_a7c8_e5b5d0ad22a6.slice/crio-d55545a4fb96f3eec9fcb069745d331f583dca1905cd4e5bc220fe1cdafd497a WatchSource:0}: Error finding container d55545a4fb96f3eec9fcb069745d331f583dca1905cd4e5bc220fe1cdafd497a: Status 404 returned error can't find the container with id d55545a4fb96f3eec9fcb069745d331f583dca1905cd4e5bc220fe1cdafd497a Apr 23 17:55:27.688109 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.687560 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 17:55:27.693504 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.693486 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.696886 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.696409 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 23 17:55:27.696886 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.696476 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-s8vnv\"" Apr 23 17:55:27.696886 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.696529 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 17:55:27.696886 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.696604 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 17:55:27.696886 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.696658 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 17:55:27.696886 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.696676 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 17:55:27.696886 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.696726 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 17:55:27.696886 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.696782 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 17:55:27.696886 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.696872 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 17:55:27.697371 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.696947 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 23 17:55:27.706844 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.706805 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 17:55:27.784654 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.784600 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a4e260-a755-4505-99e6-e0afee647d86-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.784807 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.784660 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-config-volume\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.784807 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.784685 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.784807 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.784707 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.784807 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.784740 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f1a4e260-a755-4505-99e6-e0afee647d86-config-out\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.784807 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.784762 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f1a4e260-a755-4505-99e6-e0afee647d86-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.784807 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.784793 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.785105 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.784837 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f1a4e260-a755-4505-99e6-e0afee647d86-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.785105 ip-10-0-142-63 kubenswrapper[2578]: I0423 
17:55:27.784892 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.785105 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.784928 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1a4e260-a755-4505-99e6-e0afee647d86-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.785105 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.784959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-web-config\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.785105 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.785003 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxxzd\" (UniqueName: \"kubernetes.io/projected/f1a4e260-a755-4505-99e6-e0afee647d86-kube-api-access-wxxzd\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.785105 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.785033 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.886119 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.886075 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-web-config\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.886325 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.886228 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxxzd\" (UniqueName: \"kubernetes.io/projected/f1a4e260-a755-4505-99e6-e0afee647d86-kube-api-access-wxxzd\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.886325 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.886265 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.886325 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.886296 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a4e260-a755-4505-99e6-e0afee647d86-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.886491 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.886341 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-config-volume\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.886491 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.886365 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.886491 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.886391 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.886491 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.886432 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f1a4e260-a755-4505-99e6-e0afee647d86-config-out\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.886491 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.886455 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f1a4e260-a755-4505-99e6-e0afee647d86-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.886726 ip-10-0-142-63 kubenswrapper[2578]: I0423 
17:55:27.886490 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.886726 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.886536 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f1a4e260-a755-4505-99e6-e0afee647d86-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.886726 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.886597 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.886726 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.886636 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1a4e260-a755-4505-99e6-e0afee647d86-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.887581 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.887553 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1a4e260-a755-4505-99e6-e0afee647d86-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.888437 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.887730 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a4e260-a755-4505-99e6-e0afee647d86-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.888437 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.888040 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f1a4e260-a755-4505-99e6-e0afee647d86-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.891069 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.891023 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-config-volume\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.894484 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.894436 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f1a4e260-a755-4505-99e6-e0afee647d86-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.895022 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.894979 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: 
\"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.895899 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.895507 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.895899 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.895864 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.896547 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.896507 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.896844 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.896818 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.896981 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.896841 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" 
(UniqueName: \"kubernetes.io/empty-dir/f1a4e260-a755-4505-99e6-e0afee647d86-config-out\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.897355 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.897277 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-web-config\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:27.916521 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:27.916498 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxxzd\" (UniqueName: \"kubernetes.io/projected/f1a4e260-a755-4505-99e6-e0afee647d86-kube-api-access-wxxzd\") pod \"alertmanager-main-0\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:28.011845 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:28.011736 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:55:28.182308 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:28.182263 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh" event={"ID":"3c070709-8b02-40df-a7c8-e5b5d0ad22a6","Type":"ContainerStarted","Data":"4202ecef1a47dfc6d77b90ddd6b6cef9287c05fb0cb8e170b922699a77d88866"} Apr 23 17:55:28.182308 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:28.182311 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh" event={"ID":"3c070709-8b02-40df-a7c8-e5b5d0ad22a6","Type":"ContainerStarted","Data":"bc26d329edbbf5341f8181ef1b41ffdc342546d92b0ce88c042dc9926ef2c7a5"} Apr 23 17:55:28.182545 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:28.182326 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh" event={"ID":"3c070709-8b02-40df-a7c8-e5b5d0ad22a6","Type":"ContainerStarted","Data":"d55545a4fb96f3eec9fcb069745d331f583dca1905cd4e5bc220fe1cdafd497a"} Apr 23 17:55:28.183600 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:28.183568 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8c8wq" event={"ID":"20babaa9-49a8-431c-a51c-fc72be72a2cb","Type":"ContainerStarted","Data":"492d46d271cd9a1da17c42193df3ce922c55d6fca637e93ddec52897b1efb060"} Apr 23 17:55:28.634697 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:28.634647 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 17:55:28.640793 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:55:28.640738 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1a4e260_a755_4505_99e6_e0afee647d86.slice/crio-c1ba13bffe5913947541bb9f360a46698de6c5faa8eb22da4e8a28c964a5f494 WatchSource:0}: 
Error finding container c1ba13bffe5913947541bb9f360a46698de6c5faa8eb22da4e8a28c964a5f494: Status 404 returned error can't find the container with id c1ba13bffe5913947541bb9f360a46698de6c5faa8eb22da4e8a28c964a5f494
Apr 23 17:55:29.187380 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:29.187348 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1a4e260-a755-4505-99e6-e0afee647d86","Type":"ContainerStarted","Data":"c1ba13bffe5913947541bb9f360a46698de6c5faa8eb22da4e8a28c964a5f494"}
Apr 23 17:55:29.189272 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:29.189248 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf" event={"ID":"aefeedb2-a459-4b8f-9510-da8a136c2add","Type":"ContainerStarted","Data":"77a4546ffb5c9e51b5bfc15ced99f934be1c93fe2ec36b1811e95033ec450e07"}
Apr 23 17:55:29.189272 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:29.189281 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf" event={"ID":"aefeedb2-a459-4b8f-9510-da8a136c2add","Type":"ContainerStarted","Data":"aa599d61e50d55c145474b3ad6fafc90dadb35eee73ebdca3b5c9aa883039ba4"}
Apr 23 17:55:29.189435 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:29.189296 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf" event={"ID":"aefeedb2-a459-4b8f-9510-da8a136c2add","Type":"ContainerStarted","Data":"01f4aab1ec9d29e707947a39b76cde1f91371191c8438b61eef7b6b2af7964d8"}
Apr 23 17:55:29.190833 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:29.190806 2578 generic.go:358] "Generic (PLEG): container finished" podID="20babaa9-49a8-431c-a51c-fc72be72a2cb" containerID="8bd78b59845bb397c7ab7bed04b532e66c1e29778ac815930586e09b83eb7467" exitCode=0
Apr 23 17:55:29.190939 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:29.190884 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8c8wq" event={"ID":"20babaa9-49a8-431c-a51c-fc72be72a2cb","Type":"ContainerDied","Data":"8bd78b59845bb397c7ab7bed04b532e66c1e29778ac815930586e09b83eb7467"}
Apr 23 17:55:29.212439 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:29.212390 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-99rsf" podStartSLOduration=1.810755215 podStartE2EDuration="3.212376535s" podCreationTimestamp="2026-04-23 17:55:26 +0000 UTC" firstStartedPulling="2026-04-23 17:55:27.068262365 +0000 UTC m=+123.890730081" lastFinishedPulling="2026-04-23 17:55:28.469883687 +0000 UTC m=+125.292351401" observedRunningTime="2026-04-23 17:55:29.210833901 +0000 UTC m=+126.033301636" watchObservedRunningTime="2026-04-23 17:55:29.212376535 +0000 UTC m=+126.034844269"
Apr 23 17:55:30.195162 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:30.195062 2578 generic.go:358] "Generic (PLEG): container finished" podID="f1a4e260-a755-4505-99e6-e0afee647d86" containerID="455d76fb0c6442a1d5dab02692ecc6193470c7d03687916377db439133da5a51" exitCode=0
Apr 23 17:55:30.195162 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:30.195127 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1a4e260-a755-4505-99e6-e0afee647d86","Type":"ContainerDied","Data":"455d76fb0c6442a1d5dab02692ecc6193470c7d03687916377db439133da5a51"}
Apr 23 17:55:30.197394 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:30.197368 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8c8wq" event={"ID":"20babaa9-49a8-431c-a51c-fc72be72a2cb","Type":"ContainerStarted","Data":"309ffd26dc8955894adee3b8fcd649f5f8c1e3741ff63e4f7d12080c1ebc5a6b"}
Apr 23 17:55:30.197545 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:30.197407 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8c8wq" event={"ID":"20babaa9-49a8-431c-a51c-fc72be72a2cb","Type":"ContainerStarted","Data":"905c6a03690ef4c891b132a4b8645137ecf19fe508fcd13243305c1b4855e3de"}
Apr 23 17:55:30.199182 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:30.199157 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh" event={"ID":"3c070709-8b02-40df-a7c8-e5b5d0ad22a6","Type":"ContainerStarted","Data":"c372863d5b8393380f296f3c72ef0f5bcf1509eb627b4789537f0c5aa44e184a"}
Apr 23 17:55:30.265921 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:30.265874 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-8c8wq" podStartSLOduration=3.335729477 podStartE2EDuration="4.26585892s" podCreationTimestamp="2026-04-23 17:55:26 +0000 UTC" firstStartedPulling="2026-04-23 17:55:27.541032629 +0000 UTC m=+124.363500349" lastFinishedPulling="2026-04-23 17:55:28.471162078 +0000 UTC m=+125.293629792" observedRunningTime="2026-04-23 17:55:30.261610037 +0000 UTC m=+127.084077772" watchObservedRunningTime="2026-04-23 17:55:30.26585892 +0000 UTC m=+127.088326653"
Apr 23 17:55:30.295106 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:30.295046 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-brvjh" podStartSLOduration=3.007174418 podStartE2EDuration="4.295034183s" podCreationTimestamp="2026-04-23 17:55:26 +0000 UTC" firstStartedPulling="2026-04-23 17:55:27.861667536 +0000 UTC m=+124.684135249" lastFinishedPulling="2026-04-23 17:55:29.149527273 +0000 UTC m=+125.971995014" observedRunningTime="2026-04-23 17:55:30.294644101 +0000 UTC m=+127.117111835" watchObservedRunningTime="2026-04-23 17:55:30.295034183 +0000 UTC m=+127.117501917"
Apr 23 17:55:31.395235 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:31.395200 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-z5dwz"]
Apr 23 17:55:31.400119 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:31.400060 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-z5dwz"
Apr 23 17:55:31.402611 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:31.402585 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 23 17:55:31.402919 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:31.402903 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-d6pq2\""
Apr 23 17:55:31.407828 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:31.407783 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-z5dwz"]
Apr 23 17:55:31.523360 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:31.523329 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb4573a0-b31c-44f4-aab7-34751555bf31-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-z5dwz\" (UID: \"eb4573a0-b31c-44f4-aab7-34751555bf31\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-z5dwz"
Apr 23 17:55:31.624526 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:31.624496 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb4573a0-b31c-44f4-aab7-34751555bf31-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-z5dwz\" (UID: \"eb4573a0-b31c-44f4-aab7-34751555bf31\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-z5dwz"
Apr 23 17:55:31.634216 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:31.628619 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eb4573a0-b31c-44f4-aab7-34751555bf31-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-z5dwz\" (UID: \"eb4573a0-b31c-44f4-aab7-34751555bf31\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-z5dwz"
Apr 23 17:55:31.712886 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:31.712858 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-z5dwz"
Apr 23 17:55:31.862007 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:31.861974 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-z5dwz"]
Apr 23 17:55:31.865981 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:55:31.865945 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb4573a0_b31c_44f4_aab7_34751555bf31.slice/crio-82024f861fa5a081200be64e3fa6e0c4e4e47f1184becb22afd6bbe496b8886f WatchSource:0}: Error finding container 82024f861fa5a081200be64e3fa6e0c4e4e47f1184becb22afd6bbe496b8886f: Status 404 returned error can't find the container with id 82024f861fa5a081200be64e3fa6e0c4e4e47f1184becb22afd6bbe496b8886f
Apr 23 17:55:32.207106 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.207011 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-z5dwz" event={"ID":"eb4573a0-b31c-44f4-aab7-34751555bf31","Type":"ContainerStarted","Data":"82024f861fa5a081200be64e3fa6e0c4e4e47f1184becb22afd6bbe496b8886f"}
Apr 23 17:55:32.209806 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.209777 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1a4e260-a755-4505-99e6-e0afee647d86","Type":"ContainerStarted","Data":"a379dfdba37ba5bd6d4ec6866ef630cc39b019895f27be1eeaec5836b4438b2f"}
Apr 23 17:55:32.209921 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.209812 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1a4e260-a755-4505-99e6-e0afee647d86","Type":"ContainerStarted","Data":"737d727c4c56f78d8ab50733d458f59b1620aa11f821bf519c62c21cb76b1393"}
Apr 23 17:55:32.209921 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.209822 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1a4e260-a755-4505-99e6-e0afee647d86","Type":"ContainerStarted","Data":"b217f9eab9526d918465cb2d0bf95a0be007ebda18998877937d5dca25b994fc"}
Apr 23 17:55:32.209921 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.209831 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1a4e260-a755-4505-99e6-e0afee647d86","Type":"ContainerStarted","Data":"c444eafcf81213f682216e68c3ab3eb18441d5d947101eb771f3e99f4a81a086"}
Apr 23 17:55:32.209921 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.209839 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1a4e260-a755-4505-99e6-e0afee647d86","Type":"ContainerStarted","Data":"9d44b333a70d39aa80b1c2d1345b51c32e555ec748f1cb3b749d5157080c40a9"}
Apr 23 17:55:32.942100 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.940994 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 17:55:32.946161 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.946129 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:32.949658 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.949357 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 23 17:55:32.949658 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.949515 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 23 17:55:32.949658 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.949519 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 23 17:55:32.949658 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.949566 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-d6lu1r4c085fu\""
Apr 23 17:55:32.950710 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.950684 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 23 17:55:32.950710 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.950700 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 23 17:55:32.950912 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.950895 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 23 17:55:32.951005 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.950989 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 23 17:55:32.951286 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.951268 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 23 17:55:32.951286 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.951279 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 23 17:55:32.951488 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.951471 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 23 17:55:32.951552 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.951498 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 23 17:55:32.952108 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.951906 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-mh29w\""
Apr 23 17:55:32.952311 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.952287 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 23 17:55:32.953173 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.952926 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 23 17:55:32.962185 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:32.962164 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 17:55:33.038566 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.038527 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.038733 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.038580 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.038733 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.038669 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-config-out\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.038830 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.038762 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.038830 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.038797 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.038916 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.038829 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.038916 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.038892 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.039107 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.039061 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.039220 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.039130 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.039220 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.039160 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhk6l\" (UniqueName: \"kubernetes.io/projected/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-kube-api-access-qhk6l\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.039220 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.039210 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.039388 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.039338 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.039430 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.039388 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.039430 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.039421 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.039513 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.039467 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.039513 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.039494 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.039583 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.039518 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-web-config\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.039583 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.039544 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-config\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.140376 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.140284 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.140553 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.140419 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.140553 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.140486 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.140553 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.140528 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.140716 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.140571 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.140716 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.140604 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.140716 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.140630 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhk6l\" (UniqueName: \"kubernetes.io/projected/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-kube-api-access-qhk6l\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.140716 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.140691 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.140895 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.140755 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.140895 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.140805 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.140895 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.140858 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.141053 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.140930 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.141053 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.140959 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.141053 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.141002 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-web-config\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.141053 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.141031 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-config\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.141816 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.141153 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.141816 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.141204 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.141957 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.141931 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-config-out\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.143124 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.142805 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.144469 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.144139 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.147133 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.147001 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-web-config\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.147838 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.147506 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.147838 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.147800 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.148690 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.148328 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.148690 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.148447 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-config-out\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.148690 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.148612 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.148889 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.148771 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.149655 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.149343 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.150403 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.150345 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.151048 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.151009 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.151819 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.151793 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.154124 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.154068 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.154763 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.154723 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhk6l\" (UniqueName: \"kubernetes.io/projected/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-kube-api-access-qhk6l\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.155207 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.155049 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.157344 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.155808 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-config\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.157600 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.157550 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.218216 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.218127 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1a4e260-a755-4505-99e6-e0afee647d86","Type":"ContainerStarted","Data":"d5a73a3fcf82cf783db5ff115ee514b7c6cd7dd6989199b9542ea7eb0cf50894"}
Apr 23 17:55:33.255112 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.255029 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.214020194 podStartE2EDuration="6.25500825s" podCreationTimestamp="2026-04-23 17:55:27 +0000 UTC" firstStartedPulling="2026-04-23 17:55:28.643813888 +0000 UTC m=+125.466281617" lastFinishedPulling="2026-04-23 17:55:32.684801942 +0000 UTC m=+129.507269673" observedRunningTime="2026-04-23 17:55:33.25244236 +0000 UTC m=+130.074910097" watchObservedRunningTime="2026-04-23 17:55:33.25500825 +0000 UTC m=+130.077475986"
Apr 23 17:55:33.260854 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.260820 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:55:33.418302 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.418267 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 17:55:33.445836 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.445811 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs\") pod \"network-metrics-daemon-mqfsb\" (UID: \"e70550da-839d-4462-b368-c0139f793c15\") " pod="openshift-multus/network-metrics-daemon-mqfsb"
Apr 23 17:55:33.448717 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.448539 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e70550da-839d-4462-b368-c0139f793c15-metrics-certs\") pod \"network-metrics-daemon-mqfsb\" (UID: \"e70550da-839d-4462-b368-c0139f793c15\") " pod="openshift-multus/network-metrics-daemon-mqfsb"
Apr 23 17:55:33.679107 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.675617 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x582n\""
Apr 23 17:55:33.682805 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.682778 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mqfsb" Apr 23 17:55:33.801744 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:33.801717 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mqfsb"] Apr 23 17:55:33.804411 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:55:33.804381 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode70550da_839d_4462_b368_c0139f793c15.slice/crio-061a8ed10da64fd196f8db34b15d8cef398542157858ad4e44c5177926a57e60 WatchSource:0}: Error finding container 061a8ed10da64fd196f8db34b15d8cef398542157858ad4e44c5177926a57e60: Status 404 returned error can't find the container with id 061a8ed10da64fd196f8db34b15d8cef398542157858ad4e44c5177926a57e60 Apr 23 17:55:34.223365 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:34.223323 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-z5dwz" event={"ID":"eb4573a0-b31c-44f4-aab7-34751555bf31","Type":"ContainerStarted","Data":"6685a4f7326dcf1a4cfa06ef615c755205a357e2e91beae94ac00582ceb2fa19"} Apr 23 17:55:34.223822 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:34.223559 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-z5dwz" Apr 23 17:55:34.224639 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:34.224615 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mqfsb" event={"ID":"e70550da-839d-4462-b368-c0139f793c15","Type":"ContainerStarted","Data":"061a8ed10da64fd196f8db34b15d8cef398542157858ad4e44c5177926a57e60"} Apr 23 17:55:34.226334 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:34.226308 2578 generic.go:358] "Generic (PLEG): container finished" podID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerID="49b556de5e0c6e1ecee13409efacec517e493030a838ecbec4325dbb06a8ef68" 
exitCode=0 Apr 23 17:55:34.226439 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:34.226347 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e","Type":"ContainerDied","Data":"49b556de5e0c6e1ecee13409efacec517e493030a838ecbec4325dbb06a8ef68"} Apr 23 17:55:34.226439 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:34.226385 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e","Type":"ContainerStarted","Data":"1be0116d4366cc105a9e41e0f701adf689d0cb30cd99fa3cba683876af77e67f"} Apr 23 17:55:34.229878 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:34.229832 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-z5dwz" Apr 23 17:55:34.252279 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:34.252228 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-z5dwz" podStartSLOduration=1.810719878 podStartE2EDuration="3.252209748s" podCreationTimestamp="2026-04-23 17:55:31 +0000 UTC" firstStartedPulling="2026-04-23 17:55:31.868051063 +0000 UTC m=+128.690518777" lastFinishedPulling="2026-04-23 17:55:33.309540932 +0000 UTC m=+130.132008647" observedRunningTime="2026-04-23 17:55:34.250390905 +0000 UTC m=+131.072858642" watchObservedRunningTime="2026-04-23 17:55:34.252209748 +0000 UTC m=+131.074677482" Apr 23 17:55:35.231398 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:35.231330 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mqfsb" event={"ID":"e70550da-839d-4462-b368-c0139f793c15","Type":"ContainerStarted","Data":"bba905b1af5c4db65eae917cb3630e88cc506ebb234bfdabc818bd2d9eb0765d"} Apr 23 17:55:35.231398 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:35.231396 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mqfsb" event={"ID":"e70550da-839d-4462-b368-c0139f793c15","Type":"ContainerStarted","Data":"3e69b66ac29a15107b65e1db408f14bdee381c1df131f44d0757b9533ac0b8e0"} Apr 23 17:55:35.254977 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:35.254930 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mqfsb" podStartSLOduration=130.386821553 podStartE2EDuration="2m11.254914137s" podCreationTimestamp="2026-04-23 17:53:24 +0000 UTC" firstStartedPulling="2026-04-23 17:55:33.806356265 +0000 UTC m=+130.628823978" lastFinishedPulling="2026-04-23 17:55:34.674448848 +0000 UTC m=+131.496916562" observedRunningTime="2026-04-23 17:55:35.253380461 +0000 UTC m=+132.075848198" watchObservedRunningTime="2026-04-23 17:55:35.254914137 +0000 UTC m=+132.077381873" Apr 23 17:55:37.242285 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:37.242206 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e","Type":"ContainerStarted","Data":"83ab36b82cc2e9353b1b128ee5f9a22ccf59b4bfc01c80689efc42bce596f8e5"} Apr 23 17:55:37.242285 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:37.242240 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e","Type":"ContainerStarted","Data":"70a19b34eeaff693fadbe98dbda66d4a9eb5b84e020d678622df7b86e95db571"} Apr 23 17:55:38.488840 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:38.488803 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-ddf697474-pclcg"] Apr 23 17:55:39.252583 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:39.252543 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e","Type":"ContainerStarted","Data":"da101d894a4d89d8391d820855a1bd992f89cc547827ed5d126bf968bf30400d"} Apr 23 17:55:39.252583 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:39.252582 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e","Type":"ContainerStarted","Data":"b6bed4fbe32b42e9aacef143eb26ceccb54a8667d8f7c3217a6c68e69345c7df"} Apr 23 17:55:39.252811 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:39.252595 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e","Type":"ContainerStarted","Data":"922cd5d3fa2d41a4736710b3c9608d891903a23205b23c3b93fdc45fcad71419"} Apr 23 17:55:39.252811 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:39.252607 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e","Type":"ContainerStarted","Data":"8ac51988602267ef533ade514adbea0242c618c84ad2902c9553f4dd003ff49e"} Apr 23 17:55:39.285363 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:39.285313 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.774064883 podStartE2EDuration="7.285299541s" podCreationTimestamp="2026-04-23 17:55:32 +0000 UTC" firstStartedPulling="2026-04-23 17:55:34.227891394 +0000 UTC m=+131.050359107" lastFinishedPulling="2026-04-23 17:55:38.739126049 +0000 UTC m=+135.561593765" observedRunningTime="2026-04-23 17:55:39.282960512 +0000 UTC m=+136.105428248" watchObservedRunningTime="2026-04-23 17:55:39.285299541 +0000 UTC m=+136.107767276" Apr 23 17:55:43.261360 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:43.261328 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 
17:55:57.192635 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:55:57.192596 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp" podUID="bf5d8d3c-de11-4bf0-872e-708dfdd6f61b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 17:56:03.511995 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.511888 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-ddf697474-pclcg" podUID="fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4" containerName="registry" containerID="cri-o://951f88fb1ba2498d466281e7289b43faa08f790168f9ed6747510c48b135248f" gracePeriod=30 Apr 23 17:56:03.763839 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.763786 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:56:03.818792 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.818761 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-installation-pull-secrets\") pod \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " Apr 23 17:56:03.818932 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.818815 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-image-registry-private-configuration\") pod \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " Apr 23 17:56:03.818932 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.818836 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-certificates\") pod \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " Apr 23 17:56:03.818932 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.818873 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls\") pod \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " Apr 23 17:56:03.818932 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.818894 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-ca-trust-extracted\") pod \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " Apr 23 17:56:03.818932 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.818922 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nlct\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-kube-api-access-8nlct\") pod \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " Apr 23 17:56:03.819180 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.818954 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-trusted-ca\") pod \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " Apr 23 17:56:03.819180 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.818979 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-bound-sa-token\") pod 
\"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\" (UID: \"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4\") " Apr 23 17:56:03.819529 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.819481 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4" (UID: "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:56:03.820114 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.820029 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4" (UID: "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:56:03.821618 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.821585 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-kube-api-access-8nlct" (OuterVolumeSpecName: "kube-api-access-8nlct") pod "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4" (UID: "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4"). InnerVolumeSpecName "kube-api-access-8nlct". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:56:03.821719 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.821667 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4" (UID: "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:56:03.821802 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.821779 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4" (UID: "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:56:03.821900 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.821881 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4" (UID: "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:56:03.821941 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.821900 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4" (UID: "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:56:03.827353 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.827310 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4" (UID: "fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:56:03.920455 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.920417 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-image-registry-private-configuration\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:03.920455 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.920451 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-certificates\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:03.920455 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.920464 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-registry-tls\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:03.920694 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.920473 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-ca-trust-extracted\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:03.920694 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.920482 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8nlct\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-kube-api-access-8nlct\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:03.920694 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.920490 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-trusted-ca\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 
17:56:03.920694 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.920500 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-bound-sa-token\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:03.920694 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:03.920509 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4-installation-pull-secrets\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:04.338629 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:04.338588 2578 generic.go:358] "Generic (PLEG): container finished" podID="fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4" containerID="951f88fb1ba2498d466281e7289b43faa08f790168f9ed6747510c48b135248f" exitCode=0 Apr 23 17:56:04.338804 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:04.338652 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-ddf697474-pclcg" Apr 23 17:56:04.338804 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:04.338675 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-ddf697474-pclcg" event={"ID":"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4","Type":"ContainerDied","Data":"951f88fb1ba2498d466281e7289b43faa08f790168f9ed6747510c48b135248f"} Apr 23 17:56:04.338804 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:04.338714 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-ddf697474-pclcg" event={"ID":"fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4","Type":"ContainerDied","Data":"88f57c32eac799b71429f0c30ba12781e4640b0de690555f74bbe2394d16258f"} Apr 23 17:56:04.338804 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:04.338730 2578 scope.go:117] "RemoveContainer" containerID="951f88fb1ba2498d466281e7289b43faa08f790168f9ed6747510c48b135248f" Apr 23 17:56:04.351464 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:04.351441 2578 scope.go:117] "RemoveContainer" containerID="951f88fb1ba2498d466281e7289b43faa08f790168f9ed6747510c48b135248f" Apr 23 17:56:04.351746 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:56:04.351725 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"951f88fb1ba2498d466281e7289b43faa08f790168f9ed6747510c48b135248f\": container with ID starting with 951f88fb1ba2498d466281e7289b43faa08f790168f9ed6747510c48b135248f not found: ID does not exist" containerID="951f88fb1ba2498d466281e7289b43faa08f790168f9ed6747510c48b135248f" Apr 23 17:56:04.351807 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:04.351754 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"951f88fb1ba2498d466281e7289b43faa08f790168f9ed6747510c48b135248f"} err="failed to get container status 
\"951f88fb1ba2498d466281e7289b43faa08f790168f9ed6747510c48b135248f\": rpc error: code = NotFound desc = could not find container \"951f88fb1ba2498d466281e7289b43faa08f790168f9ed6747510c48b135248f\": container with ID starting with 951f88fb1ba2498d466281e7289b43faa08f790168f9ed6747510c48b135248f not found: ID does not exist" Apr 23 17:56:04.367809 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:04.367779 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-ddf697474-pclcg"] Apr 23 17:56:04.371747 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:04.371721 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-ddf697474-pclcg"] Apr 23 17:56:04.643493 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:04.643363 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f1a4e260-a755-4505-99e6-e0afee647d86/init-config-reloader/0.log" Apr 23 17:56:04.649863 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:04.649846 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f1a4e260-a755-4505-99e6-e0afee647d86/alertmanager/0.log" Apr 23 17:56:04.798160 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:04.798130 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f1a4e260-a755-4505-99e6-e0afee647d86/config-reloader/0.log" Apr 23 17:56:05.000929 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:05.000852 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f1a4e260-a755-4505-99e6-e0afee647d86/kube-rbac-proxy-web/0.log" Apr 23 17:56:05.197932 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:05.197907 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f1a4e260-a755-4505-99e6-e0afee647d86/kube-rbac-proxy/0.log" Apr 23 17:56:05.397898 ip-10-0-142-63 
kubenswrapper[2578]: I0423 17:56:05.397872 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f1a4e260-a755-4505-99e6-e0afee647d86/kube-rbac-proxy-metric/0.log" Apr 23 17:56:05.597559 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:05.597527 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f1a4e260-a755-4505-99e6-e0afee647d86/prom-label-proxy/0.log" Apr 23 17:56:05.753492 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:05.753413 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4" path="/var/lib/kubelet/pods/fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4/volumes" Apr 23 17:56:05.999171 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:05.999123 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-99rsf_aefeedb2-a459-4b8f-9510-da8a136c2add/kube-state-metrics/0.log" Apr 23 17:56:06.197285 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:06.197247 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-99rsf_aefeedb2-a459-4b8f-9510-da8a136c2add/kube-rbac-proxy-main/0.log" Apr 23 17:56:06.397721 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:06.397691 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-99rsf_aefeedb2-a459-4b8f-9510-da8a136c2add/kube-rbac-proxy-self/0.log" Apr 23 17:56:06.798159 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:06.798128 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-z5dwz_eb4573a0-b31c-44f4-aab7-34751555bf31/monitoring-plugin/0.log" Apr 23 17:56:06.998525 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:06.998498 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-8c8wq_20babaa9-49a8-431c-a51c-fc72be72a2cb/init-textfile/0.log" Apr 23 17:56:07.192380 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:07.192290 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp" podUID="bf5d8d3c-de11-4bf0-872e-708dfdd6f61b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 23 17:56:07.197935 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:07.197912 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8c8wq_20babaa9-49a8-431c-a51c-fc72be72a2cb/node-exporter/0.log" Apr 23 17:56:07.398249 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:07.398226 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8c8wq_20babaa9-49a8-431c-a51c-fc72be72a2cb/kube-rbac-proxy/0.log" Apr 23 17:56:08.200904 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:08.200874 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-brvjh_3c070709-8b02-40df-a7c8-e5b5d0ad22a6/kube-rbac-proxy-main/0.log" Apr 23 17:56:08.397299 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:08.397269 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-brvjh_3c070709-8b02-40df-a7c8-e5b5d0ad22a6/kube-rbac-proxy-self/0.log" Apr 23 17:56:08.598412 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:08.598371 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-brvjh_3c070709-8b02-40df-a7c8-e5b5d0ad22a6/openshift-state-metrics/0.log" Apr 23 17:56:08.797470 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:08.797429 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e/init-config-reloader/0.log"
Apr 23 17:56:08.999309 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:08.999218 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e/prometheus/0.log"
Apr 23 17:56:09.197928 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:09.197892 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e/config-reloader/0.log"
Apr 23 17:56:09.398767 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:09.398735 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e/thanos-sidecar/0.log"
Apr 23 17:56:09.597731 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:09.597700 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e/kube-rbac-proxy-web/0.log"
Apr 23 17:56:09.797069 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:09.797044 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e/kube-rbac-proxy/0.log"
Apr 23 17:56:09.997766 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:09.997740 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e/kube-rbac-proxy-thanos/0.log"
Apr 23 17:56:10.199556 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:10.199482 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9fdwj_ad280719-5945-4ba9-b574-bb0345e669c9/prometheus-operator/0.log"
Apr 23 17:56:10.399672 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:10.399637 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9fdwj_ad280719-5945-4ba9-b574-bb0345e669c9/kube-rbac-proxy/0.log"
Apr 23 17:56:10.597237 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:10.597210 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-hxgc8_25e3c9d3-09db-4fd7-8fb2-232077749fa6/prometheus-operator-admission-webhook/0.log"
Apr 23 17:56:12.598775 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:12.598728 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-n6mwn_904ab1d8-f170-427c-b547-546b37cd8388/networking-console-plugin/0.log"
Apr 23 17:56:17.192736 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:17.192692 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp" podUID="bf5d8d3c-de11-4bf0-872e-708dfdd6f61b" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 23 17:56:17.193212 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:17.192784 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp"
Apr 23 17:56:17.193446 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:17.193422 2578 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"b54ee30bc48b198f28413bf7ad0ae34bd914426a282193859fdc514d5385cb74"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 23 17:56:17.193521 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:17.193472 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp" podUID="bf5d8d3c-de11-4bf0-872e-708dfdd6f61b" containerName="service-proxy" containerID="cri-o://b54ee30bc48b198f28413bf7ad0ae34bd914426a282193859fdc514d5385cb74" gracePeriod=30
Apr 23 17:56:17.384676 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:17.384577 2578 generic.go:358] "Generic (PLEG): container finished" podID="bf5d8d3c-de11-4bf0-872e-708dfdd6f61b" containerID="b54ee30bc48b198f28413bf7ad0ae34bd914426a282193859fdc514d5385cb74" exitCode=2
Apr 23 17:56:17.384676 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:17.384631 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp" event={"ID":"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b","Type":"ContainerDied","Data":"b54ee30bc48b198f28413bf7ad0ae34bd914426a282193859fdc514d5385cb74"}
Apr 23 17:56:18.392549 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:18.392502 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-857469989d-k22fp" event={"ID":"bf5d8d3c-de11-4bf0-872e-708dfdd6f61b","Type":"ContainerStarted","Data":"28ceddd6af03c18b88616cd1a289baa3a6b45f1779da8a646d3ec84112ecfab9"}
Apr 23 17:56:33.261025 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:33.260984 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:33.280617 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:33.280592 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:33.452994 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:33.452960 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:56:47.098401 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:47.098364 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 17:56:47.098866 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:47.098774 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="alertmanager" containerID="cri-o://9d44b333a70d39aa80b1c2d1345b51c32e555ec748f1cb3b749d5157080c40a9" gracePeriod=120
Apr 23 17:56:47.098939 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:47.098859 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="kube-rbac-proxy-web" containerID="cri-o://b217f9eab9526d918465cb2d0bf95a0be007ebda18998877937d5dca25b994fc" gracePeriod=120
Apr 23 17:56:47.098939 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:47.098858 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="kube-rbac-proxy-metric" containerID="cri-o://a379dfdba37ba5bd6d4ec6866ef630cc39b019895f27be1eeaec5836b4438b2f" gracePeriod=120
Apr 23 17:56:47.098939 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:47.098883 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="config-reloader" containerID="cri-o://c444eafcf81213f682216e68c3ab3eb18441d5d947101eb771f3e99f4a81a086" gracePeriod=120
Apr 23 17:56:47.099071 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:47.098928 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="prom-label-proxy" containerID="cri-o://d5a73a3fcf82cf783db5ff115ee514b7c6cd7dd6989199b9542ea7eb0cf50894" gracePeriod=120
Apr 23 17:56:47.099071 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:47.098928 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="kube-rbac-proxy" containerID="cri-o://737d727c4c56f78d8ab50733d458f59b1620aa11f821bf519c62c21cb76b1393" gracePeriod=120
Apr 23 17:56:47.476675 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:47.476586 2578 generic.go:358] "Generic (PLEG): container finished" podID="f1a4e260-a755-4505-99e6-e0afee647d86" containerID="d5a73a3fcf82cf783db5ff115ee514b7c6cd7dd6989199b9542ea7eb0cf50894" exitCode=0
Apr 23 17:56:47.476675 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:47.476612 2578 generic.go:358] "Generic (PLEG): container finished" podID="f1a4e260-a755-4505-99e6-e0afee647d86" containerID="a379dfdba37ba5bd6d4ec6866ef630cc39b019895f27be1eeaec5836b4438b2f" exitCode=0
Apr 23 17:56:47.476675 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:47.476619 2578 generic.go:358] "Generic (PLEG): container finished" podID="f1a4e260-a755-4505-99e6-e0afee647d86" containerID="737d727c4c56f78d8ab50733d458f59b1620aa11f821bf519c62c21cb76b1393" exitCode=0
Apr 23 17:56:47.476675 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:47.476625 2578 generic.go:358] "Generic (PLEG): container finished" podID="f1a4e260-a755-4505-99e6-e0afee647d86" containerID="c444eafcf81213f682216e68c3ab3eb18441d5d947101eb771f3e99f4a81a086" exitCode=0
Apr 23 17:56:47.476675 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:47.476629 2578 generic.go:358] "Generic (PLEG): container finished" podID="f1a4e260-a755-4505-99e6-e0afee647d86" containerID="9d44b333a70d39aa80b1c2d1345b51c32e555ec748f1cb3b749d5157080c40a9" exitCode=0
Apr 23 17:56:47.476675 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:47.476661 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1a4e260-a755-4505-99e6-e0afee647d86","Type":"ContainerDied","Data":"d5a73a3fcf82cf783db5ff115ee514b7c6cd7dd6989199b9542ea7eb0cf50894"}
Apr 23 17:56:47.476958 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:47.476694 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1a4e260-a755-4505-99e6-e0afee647d86","Type":"ContainerDied","Data":"a379dfdba37ba5bd6d4ec6866ef630cc39b019895f27be1eeaec5836b4438b2f"}
Apr 23 17:56:47.476958 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:47.476705 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1a4e260-a755-4505-99e6-e0afee647d86","Type":"ContainerDied","Data":"737d727c4c56f78d8ab50733d458f59b1620aa11f821bf519c62c21cb76b1393"}
Apr 23 17:56:47.476958 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:47.476714 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1a4e260-a755-4505-99e6-e0afee647d86","Type":"ContainerDied","Data":"c444eafcf81213f682216e68c3ab3eb18441d5d947101eb771f3e99f4a81a086"}
Apr 23 17:56:47.476958 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:47.476722 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1a4e260-a755-4505-99e6-e0afee647d86","Type":"ContainerDied","Data":"9d44b333a70d39aa80b1c2d1345b51c32e555ec748f1cb3b749d5157080c40a9"}
Apr 23 17:56:48.343842 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.343821 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:48.482014 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.481932 2578 generic.go:358] "Generic (PLEG): container finished" podID="f1a4e260-a755-4505-99e6-e0afee647d86" containerID="b217f9eab9526d918465cb2d0bf95a0be007ebda18998877937d5dca25b994fc" exitCode=0
Apr 23 17:56:48.482014 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.481978 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1a4e260-a755-4505-99e6-e0afee647d86","Type":"ContainerDied","Data":"b217f9eab9526d918465cb2d0bf95a0be007ebda18998877937d5dca25b994fc"}
Apr 23 17:56:48.482014 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.482003 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1a4e260-a755-4505-99e6-e0afee647d86","Type":"ContainerDied","Data":"c1ba13bffe5913947541bb9f360a46698de6c5faa8eb22da4e8a28c964a5f494"}
Apr 23 17:56:48.482251 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.482020 2578 scope.go:117] "RemoveContainer" containerID="d5a73a3fcf82cf783db5ff115ee514b7c6cd7dd6989199b9542ea7eb0cf50894"
Apr 23 17:56:48.482251 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.482038 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 17:56:48.489514 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.489492 2578 scope.go:117] "RemoveContainer" containerID="a379dfdba37ba5bd6d4ec6866ef630cc39b019895f27be1eeaec5836b4438b2f"
Apr 23 17:56:48.495750 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.495732 2578 scope.go:117] "RemoveContainer" containerID="737d727c4c56f78d8ab50733d458f59b1620aa11f821bf519c62c21cb76b1393"
Apr 23 17:56:48.501770 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.501751 2578 scope.go:117] "RemoveContainer" containerID="b217f9eab9526d918465cb2d0bf95a0be007ebda18998877937d5dca25b994fc"
Apr 23 17:56:48.507993 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.507975 2578 scope.go:117] "RemoveContainer" containerID="c444eafcf81213f682216e68c3ab3eb18441d5d947101eb771f3e99f4a81a086"
Apr 23 17:56:48.514153 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514031 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f1a4e260-a755-4505-99e6-e0afee647d86-alertmanager-main-db\") pod \"f1a4e260-a755-4505-99e6-e0afee647d86\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") "
Apr 23 17:56:48.514153 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514067 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-web-config\") pod \"f1a4e260-a755-4505-99e6-e0afee647d86\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") "
Apr 23 17:56:48.514153 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514104 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a4e260-a755-4505-99e6-e0afee647d86-metrics-client-ca\") pod \"f1a4e260-a755-4505-99e6-e0afee647d86\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") "
Apr 23 17:56:48.514153 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514130 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1a4e260-a755-4505-99e6-e0afee647d86-alertmanager-trusted-ca-bundle\") pod \"f1a4e260-a755-4505-99e6-e0afee647d86\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") "
Apr 23 17:56:48.514495 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514156 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-kube-rbac-proxy-metric\") pod \"f1a4e260-a755-4505-99e6-e0afee647d86\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") "
Apr 23 17:56:48.514495 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514188 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f1a4e260-a755-4505-99e6-e0afee647d86-config-out\") pod \"f1a4e260-a755-4505-99e6-e0afee647d86\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") "
Apr 23 17:56:48.514495 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514217 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f1a4e260-a755-4505-99e6-e0afee647d86-tls-assets\") pod \"f1a4e260-a755-4505-99e6-e0afee647d86\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") "
Apr 23 17:56:48.514495 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514250 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-kube-rbac-proxy\") pod \"f1a4e260-a755-4505-99e6-e0afee647d86\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") "
Apr 23 17:56:48.514495 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514379 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a4e260-a755-4505-99e6-e0afee647d86-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "f1a4e260-a755-4505-99e6-e0afee647d86" (UID: "f1a4e260-a755-4505-99e6-e0afee647d86"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:56:48.514495 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514418 2578 scope.go:117] "RemoveContainer" containerID="9d44b333a70d39aa80b1c2d1345b51c32e555ec748f1cb3b749d5157080c40a9"
Apr 23 17:56:48.514495 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514472 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-cluster-tls-config\") pod \"f1a4e260-a755-4505-99e6-e0afee647d86\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") "
Apr 23 17:56:48.514495 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514482 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a4e260-a755-4505-99e6-e0afee647d86-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "f1a4e260-a755-4505-99e6-e0afee647d86" (UID: "f1a4e260-a755-4505-99e6-e0afee647d86"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:56:48.514877 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514507 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-kube-rbac-proxy-web\") pod \"f1a4e260-a755-4505-99e6-e0afee647d86\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") "
Apr 23 17:56:48.514877 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514546 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxxzd\" (UniqueName: \"kubernetes.io/projected/f1a4e260-a755-4505-99e6-e0afee647d86-kube-api-access-wxxzd\") pod \"f1a4e260-a755-4505-99e6-e0afee647d86\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") "
Apr 23 17:56:48.514877 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514572 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-main-tls\") pod \"f1a4e260-a755-4505-99e6-e0afee647d86\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") "
Apr 23 17:56:48.514877 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514581 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a4e260-a755-4505-99e6-e0afee647d86-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "f1a4e260-a755-4505-99e6-e0afee647d86" (UID: "f1a4e260-a755-4505-99e6-e0afee647d86"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:56:48.514877 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514600 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-config-volume\") pod \"f1a4e260-a755-4505-99e6-e0afee647d86\" (UID: \"f1a4e260-a755-4505-99e6-e0afee647d86\") "
Apr 23 17:56:48.515155 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514884 2578 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a4e260-a755-4505-99e6-e0afee647d86-metrics-client-ca\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\""
Apr 23 17:56:48.515155 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514908 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1a4e260-a755-4505-99e6-e0afee647d86-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\""
Apr 23 17:56:48.515155 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.514924 2578 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f1a4e260-a755-4505-99e6-e0afee647d86-alertmanager-main-db\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\""
Apr 23 17:56:48.517681 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.517655 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1a4e260-a755-4505-99e6-e0afee647d86-config-out" (OuterVolumeSpecName: "config-out") pod "f1a4e260-a755-4505-99e6-e0afee647d86" (UID: "f1a4e260-a755-4505-99e6-e0afee647d86"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:56:48.517772 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.517722 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-config-volume" (OuterVolumeSpecName: "config-volume") pod "f1a4e260-a755-4505-99e6-e0afee647d86" (UID: "f1a4e260-a755-4505-99e6-e0afee647d86"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:56:48.517918 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.517875 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "f1a4e260-a755-4505-99e6-e0afee647d86" (UID: "f1a4e260-a755-4505-99e6-e0afee647d86"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:56:48.518064 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.518036 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "f1a4e260-a755-4505-99e6-e0afee647d86" (UID: "f1a4e260-a755-4505-99e6-e0afee647d86"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:56:48.518064 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.518054 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a4e260-a755-4505-99e6-e0afee647d86-kube-api-access-wxxzd" (OuterVolumeSpecName: "kube-api-access-wxxzd") pod "f1a4e260-a755-4505-99e6-e0afee647d86" (UID: "f1a4e260-a755-4505-99e6-e0afee647d86"). InnerVolumeSpecName "kube-api-access-wxxzd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 17:56:48.519117 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.519065 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a4e260-a755-4505-99e6-e0afee647d86-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f1a4e260-a755-4505-99e6-e0afee647d86" (UID: "f1a4e260-a755-4505-99e6-e0afee647d86"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 17:56:48.519338 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.519314 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "f1a4e260-a755-4505-99e6-e0afee647d86" (UID: "f1a4e260-a755-4505-99e6-e0afee647d86"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:56:48.519569 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.519550 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "f1a4e260-a755-4505-99e6-e0afee647d86" (UID: "f1a4e260-a755-4505-99e6-e0afee647d86"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:56:48.522059 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.522033 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "f1a4e260-a755-4505-99e6-e0afee647d86" (UID: "f1a4e260-a755-4505-99e6-e0afee647d86"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:56:48.528901 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.528876 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-web-config" (OuterVolumeSpecName: "web-config") pod "f1a4e260-a755-4505-99e6-e0afee647d86" (UID: "f1a4e260-a755-4505-99e6-e0afee647d86"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 17:56:48.537591 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.537570 2578 scope.go:117] "RemoveContainer" containerID="455d76fb0c6442a1d5dab02692ecc6193470c7d03687916377db439133da5a51"
Apr 23 17:56:48.543922 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.543906 2578 scope.go:117] "RemoveContainer" containerID="d5a73a3fcf82cf783db5ff115ee514b7c6cd7dd6989199b9542ea7eb0cf50894"
Apr 23 17:56:48.544183 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:56:48.544165 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5a73a3fcf82cf783db5ff115ee514b7c6cd7dd6989199b9542ea7eb0cf50894\": container with ID starting with d5a73a3fcf82cf783db5ff115ee514b7c6cd7dd6989199b9542ea7eb0cf50894 not found: ID does not exist" containerID="d5a73a3fcf82cf783db5ff115ee514b7c6cd7dd6989199b9542ea7eb0cf50894"
Apr 23 17:56:48.544238 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.544199 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5a73a3fcf82cf783db5ff115ee514b7c6cd7dd6989199b9542ea7eb0cf50894"} err="failed to get container status \"d5a73a3fcf82cf783db5ff115ee514b7c6cd7dd6989199b9542ea7eb0cf50894\": rpc error: code = NotFound desc = could not find container \"d5a73a3fcf82cf783db5ff115ee514b7c6cd7dd6989199b9542ea7eb0cf50894\": container with ID starting with d5a73a3fcf82cf783db5ff115ee514b7c6cd7dd6989199b9542ea7eb0cf50894 not found: ID does not exist"
Apr 23 17:56:48.544238 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.544218 2578 scope.go:117] "RemoveContainer" containerID="a379dfdba37ba5bd6d4ec6866ef630cc39b019895f27be1eeaec5836b4438b2f"
Apr 23 17:56:48.544455 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:56:48.544435 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a379dfdba37ba5bd6d4ec6866ef630cc39b019895f27be1eeaec5836b4438b2f\": container with ID starting with a379dfdba37ba5bd6d4ec6866ef630cc39b019895f27be1eeaec5836b4438b2f not found: ID does not exist" containerID="a379dfdba37ba5bd6d4ec6866ef630cc39b019895f27be1eeaec5836b4438b2f"
Apr 23 17:56:48.544516 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.544465 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a379dfdba37ba5bd6d4ec6866ef630cc39b019895f27be1eeaec5836b4438b2f"} err="failed to get container status \"a379dfdba37ba5bd6d4ec6866ef630cc39b019895f27be1eeaec5836b4438b2f\": rpc error: code = NotFound desc = could not find container \"a379dfdba37ba5bd6d4ec6866ef630cc39b019895f27be1eeaec5836b4438b2f\": container with ID starting with a379dfdba37ba5bd6d4ec6866ef630cc39b019895f27be1eeaec5836b4438b2f not found: ID does not exist"
Apr 23 17:56:48.544516 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.544489 2578 scope.go:117] "RemoveContainer" containerID="737d727c4c56f78d8ab50733d458f59b1620aa11f821bf519c62c21cb76b1393"
Apr 23 17:56:48.544714 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:56:48.544695 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"737d727c4c56f78d8ab50733d458f59b1620aa11f821bf519c62c21cb76b1393\": container with ID starting with 737d727c4c56f78d8ab50733d458f59b1620aa11f821bf519c62c21cb76b1393 not found: ID does not exist" containerID="737d727c4c56f78d8ab50733d458f59b1620aa11f821bf519c62c21cb76b1393"
Apr 23 17:56:48.544751 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.544718 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737d727c4c56f78d8ab50733d458f59b1620aa11f821bf519c62c21cb76b1393"} err="failed to get container status \"737d727c4c56f78d8ab50733d458f59b1620aa11f821bf519c62c21cb76b1393\": rpc error: code = NotFound desc = could not find container \"737d727c4c56f78d8ab50733d458f59b1620aa11f821bf519c62c21cb76b1393\": container with ID starting with 737d727c4c56f78d8ab50733d458f59b1620aa11f821bf519c62c21cb76b1393 not found: ID does not exist"
Apr 23 17:56:48.544751 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.544739 2578 scope.go:117] "RemoveContainer" containerID="b217f9eab9526d918465cb2d0bf95a0be007ebda18998877937d5dca25b994fc"
Apr 23 17:56:48.544961 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:56:48.544941 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b217f9eab9526d918465cb2d0bf95a0be007ebda18998877937d5dca25b994fc\": container with ID starting with b217f9eab9526d918465cb2d0bf95a0be007ebda18998877937d5dca25b994fc not found: ID does not exist" containerID="b217f9eab9526d918465cb2d0bf95a0be007ebda18998877937d5dca25b994fc"
Apr 23 17:56:48.545036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.544961 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b217f9eab9526d918465cb2d0bf95a0be007ebda18998877937d5dca25b994fc"} err="failed to get container status \"b217f9eab9526d918465cb2d0bf95a0be007ebda18998877937d5dca25b994fc\": rpc error: code = NotFound desc = could not find container \"b217f9eab9526d918465cb2d0bf95a0be007ebda18998877937d5dca25b994fc\": container with ID starting with b217f9eab9526d918465cb2d0bf95a0be007ebda18998877937d5dca25b994fc not found: ID does not exist"
Apr 23 17:56:48.545036 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.544974 2578 scope.go:117] "RemoveContainer" containerID="c444eafcf81213f682216e68c3ab3eb18441d5d947101eb771f3e99f4a81a086"
Apr 23 17:56:48.545205 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:56:48.545186 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c444eafcf81213f682216e68c3ab3eb18441d5d947101eb771f3e99f4a81a086\": container with ID starting with c444eafcf81213f682216e68c3ab3eb18441d5d947101eb771f3e99f4a81a086 not found: ID does not exist" containerID="c444eafcf81213f682216e68c3ab3eb18441d5d947101eb771f3e99f4a81a086"
Apr 23 17:56:48.545253 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.545210 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c444eafcf81213f682216e68c3ab3eb18441d5d947101eb771f3e99f4a81a086"} err="failed to get container status \"c444eafcf81213f682216e68c3ab3eb18441d5d947101eb771f3e99f4a81a086\": rpc error: code = NotFound desc = could not find container \"c444eafcf81213f682216e68c3ab3eb18441d5d947101eb771f3e99f4a81a086\": container with ID starting with c444eafcf81213f682216e68c3ab3eb18441d5d947101eb771f3e99f4a81a086 not found: ID does not exist"
Apr 23 17:56:48.545253 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.545222 2578 scope.go:117] "RemoveContainer" containerID="9d44b333a70d39aa80b1c2d1345b51c32e555ec748f1cb3b749d5157080c40a9"
Apr 23 17:56:48.545459 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:56:48.545446 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d44b333a70d39aa80b1c2d1345b51c32e555ec748f1cb3b749d5157080c40a9\": container with ID starting with 9d44b333a70d39aa80b1c2d1345b51c32e555ec748f1cb3b749d5157080c40a9 not found: ID does not exist" containerID="9d44b333a70d39aa80b1c2d1345b51c32e555ec748f1cb3b749d5157080c40a9"
Apr 23 17:56:48.545500 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.545462 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d44b333a70d39aa80b1c2d1345b51c32e555ec748f1cb3b749d5157080c40a9"} err="failed to get container status \"9d44b333a70d39aa80b1c2d1345b51c32e555ec748f1cb3b749d5157080c40a9\": rpc error: code = NotFound desc = could not find container \"9d44b333a70d39aa80b1c2d1345b51c32e555ec748f1cb3b749d5157080c40a9\": container with ID starting with 9d44b333a70d39aa80b1c2d1345b51c32e555ec748f1cb3b749d5157080c40a9 not found: ID does not exist"
Apr 23 17:56:48.545500 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.545476 2578 scope.go:117] "RemoveContainer" containerID="455d76fb0c6442a1d5dab02692ecc6193470c7d03687916377db439133da5a51"
Apr 23 17:56:48.545724 ip-10-0-142-63 kubenswrapper[2578]: E0423 17:56:48.545708 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"455d76fb0c6442a1d5dab02692ecc6193470c7d03687916377db439133da5a51\": container with ID starting with 455d76fb0c6442a1d5dab02692ecc6193470c7d03687916377db439133da5a51 not found: ID does not exist" containerID="455d76fb0c6442a1d5dab02692ecc6193470c7d03687916377db439133da5a51"
Apr 23 17:56:48.545801 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.545729 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"455d76fb0c6442a1d5dab02692ecc6193470c7d03687916377db439133da5a51"} err="failed to get container status \"455d76fb0c6442a1d5dab02692ecc6193470c7d03687916377db439133da5a51\": rpc error: code = NotFound desc = could not find container \"455d76fb0c6442a1d5dab02692ecc6193470c7d03687916377db439133da5a51\": container with ID starting with 455d76fb0c6442a1d5dab02692ecc6193470c7d03687916377db439133da5a51 not found: ID does not exist"
Apr 23 17:56:48.615617 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.615585 2578 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-cluster-tls-config\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\""
Apr 23 17:56:48.615617 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.615614 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\""
Apr 23 17:56:48.615781 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.615637 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wxxzd\" (UniqueName: \"kubernetes.io/projected/f1a4e260-a755-4505-99e6-e0afee647d86-kube-api-access-wxxzd\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\""
Apr 23 17:56:48.615781 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.615649 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-main-tls\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\""
Apr 23 17:56:48.615781 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.615661 2578 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-config-volume\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\""
Apr 23 17:56:48.615781 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.615669 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-web-config\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\""
Apr 23 17:56:48.615781 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.615677 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\""
Apr 23 17:56:48.615781 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.615687 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f1a4e260-a755-4505-99e6-e0afee647d86-config-out\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\""
Apr 23 17:56:48.615781 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.615697 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f1a4e260-a755-4505-99e6-e0afee647d86-tls-assets\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\""
Apr 23 17:56:48.615781 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.615706 2578 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f1a4e260-a755-4505-99e6-e0afee647d86-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\""
Apr 23 17:56:48.805168 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.805139 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 17:56:48.809017 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.808994 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 17:56:48.836524 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836498 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 17:56:48.836807 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836795 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="alertmanager"
Apr 23 17:56:48.836861 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836808 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="alertmanager"
Apr 23 17:56:48.836861 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836817 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="kube-rbac-proxy-web"
Apr 23 17:56:48.836861 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836823 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="kube-rbac-proxy-web"
Apr 23 17:56:48.836861 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836830 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4" containerName="registry"
Apr 23 17:56:48.836861 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836836 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4" containerName="registry"
Apr 23 17:56:48.836861 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836846 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="init-config-reloader"
Apr 23 17:56:48.836861 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836851 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="init-config-reloader"
Apr 23 17:56:48.836861 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836859 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="config-reloader"
Apr 23 17:56:48.836861 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836864 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="config-reloader"
Apr 23 17:56:48.837195 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836870 2578
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="kube-rbac-proxy" Apr 23 17:56:48.837195 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836875 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="kube-rbac-proxy" Apr 23 17:56:48.837195 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836887 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="kube-rbac-proxy-metric" Apr 23 17:56:48.837195 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836892 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="kube-rbac-proxy-metric" Apr 23 17:56:48.837195 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836900 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="prom-label-proxy" Apr 23 17:56:48.837195 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836905 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="prom-label-proxy" Apr 23 17:56:48.837195 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836968 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="config-reloader" Apr 23 17:56:48.837195 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836979 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="kube-rbac-proxy-metric" Apr 23 17:56:48.837195 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836987 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa5eda1b-c6c9-44cc-9e1a-b67e0654acc4" containerName="registry" Apr 23 17:56:48.837195 ip-10-0-142-63 kubenswrapper[2578]: I0423 
17:56:48.836993 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="alertmanager" Apr 23 17:56:48.837195 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.836999 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="kube-rbac-proxy-web" Apr 23 17:56:48.837195 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.837005 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="prom-label-proxy" Apr 23 17:56:48.837195 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.837011 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1a4e260-a755-4505-99e6-e0afee647d86" containerName="kube-rbac-proxy" Apr 23 17:56:48.840905 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.840888 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:48.843461 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.843438 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 17:56:48.843572 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.843482 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 17:56:48.843572 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.843481 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 17:56:48.843572 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.843546 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 17:56:48.843741 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.843581 2578 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-s8vnv\"" Apr 23 17:56:48.843741 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.843581 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 23 17:56:48.843841 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.843756 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 17:56:48.843841 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.843779 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 23 17:56:48.843841 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.843830 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 17:56:48.849199 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.849179 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 17:56:48.852230 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:48.852206 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 17:56:49.019023 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.018978 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-config-volume\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.019255 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.019037 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/930847a2-bfd1-40c9-ab02-201230e3ece9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.019255 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.019064 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.019255 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.019114 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.019255 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.019147 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.019255 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.019174 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwg44\" (UniqueName: \"kubernetes.io/projected/930847a2-bfd1-40c9-ab02-201230e3ece9-kube-api-access-zwg44\") pod 
\"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.019469 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.019266 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.019469 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.019308 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-web-config\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.019469 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.019334 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/930847a2-bfd1-40c9-ab02-201230e3ece9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.019469 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.019366 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/930847a2-bfd1-40c9-ab02-201230e3ece9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.019469 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.019400 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/930847a2-bfd1-40c9-ab02-201230e3ece9-config-out\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.019469 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.019426 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.019469 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.019447 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/930847a2-bfd1-40c9-ab02-201230e3ece9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.119791 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.119758 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/930847a2-bfd1-40c9-ab02-201230e3ece9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.119791 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.119794 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/930847a2-bfd1-40c9-ab02-201230e3ece9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 
17:56:49.120068 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.119812 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/930847a2-bfd1-40c9-ab02-201230e3ece9-config-out\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.120068 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.119840 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.120068 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.119872 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/930847a2-bfd1-40c9-ab02-201230e3ece9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.120068 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.119898 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-config-volume\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.120068 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.119935 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/930847a2-bfd1-40c9-ab02-201230e3ece9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.120068 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.119964 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.120068 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.119988 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.120068 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.120018 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.120068 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.120047 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwg44\" (UniqueName: \"kubernetes.io/projected/930847a2-bfd1-40c9-ab02-201230e3ece9-kube-api-access-zwg44\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.120497 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.120124 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.120497 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.120167 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-web-config\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.120823 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.120795 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/930847a2-bfd1-40c9-ab02-201230e3ece9-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.120905 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.120861 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/930847a2-bfd1-40c9-ab02-201230e3ece9-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.121154 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.121127 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/930847a2-bfd1-40c9-ab02-201230e3ece9-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.122620 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.122595 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"config-out\" (UniqueName: \"kubernetes.io/empty-dir/930847a2-bfd1-40c9-ab02-201230e3ece9-config-out\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.123234 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.122912 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-web-config\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.123234 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.123015 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/930847a2-bfd1-40c9-ab02-201230e3ece9-tls-assets\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.123392 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.123309 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.123461 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.123407 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-config-volume\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.123549 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.123520 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.123986 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.123966 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.124814 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.124796 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.124930 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.124907 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/930847a2-bfd1-40c9-ab02-201230e3ece9-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.130851 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.130827 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwg44\" (UniqueName: \"kubernetes.io/projected/930847a2-bfd1-40c9-ab02-201230e3ece9-kube-api-access-zwg44\") pod \"alertmanager-main-0\" (UID: \"930847a2-bfd1-40c9-ab02-201230e3ece9\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.151337 ip-10-0-142-63 kubenswrapper[2578]: I0423 
17:56:49.151317 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 17:56:49.280686 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.280584 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 17:56:49.283337 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:56:49.283304 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod930847a2_bfd1_40c9_ab02_201230e3ece9.slice/crio-5eb52096105c272e824162dfb8d0fcf1cca4ccf1150495a0647d3d3178e3b5b3 WatchSource:0}: Error finding container 5eb52096105c272e824162dfb8d0fcf1cca4ccf1150495a0647d3d3178e3b5b3: Status 404 returned error can't find the container with id 5eb52096105c272e824162dfb8d0fcf1cca4ccf1150495a0647d3d3178e3b5b3 Apr 23 17:56:49.487258 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.487223 2578 generic.go:358] "Generic (PLEG): container finished" podID="930847a2-bfd1-40c9-ab02-201230e3ece9" containerID="80b570bfd88c3873693f1a7456cf12a03ccf47dd360265069f603934658753cc" exitCode=0 Apr 23 17:56:49.487603 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.487274 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"930847a2-bfd1-40c9-ab02-201230e3ece9","Type":"ContainerDied","Data":"80b570bfd88c3873693f1a7456cf12a03ccf47dd360265069f603934658753cc"} Apr 23 17:56:49.487603 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.487293 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"930847a2-bfd1-40c9-ab02-201230e3ece9","Type":"ContainerStarted","Data":"5eb52096105c272e824162dfb8d0fcf1cca4ccf1150495a0647d3d3178e3b5b3"} Apr 23 17:56:49.755727 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:49.755671 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f1a4e260-a755-4505-99e6-e0afee647d86" path="/var/lib/kubelet/pods/f1a4e260-a755-4505-99e6-e0afee647d86/volumes" Apr 23 17:56:50.492797 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:50.492761 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"930847a2-bfd1-40c9-ab02-201230e3ece9","Type":"ContainerStarted","Data":"17e95f26291770cb11434f54860e4fbae6da1841f7a54ba341f8ce9172fb1ac2"} Apr 23 17:56:50.492797 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:50.492801 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"930847a2-bfd1-40c9-ab02-201230e3ece9","Type":"ContainerStarted","Data":"7ec6734d7816734ccf1a378648456721fe9de5f70b442b6f96e2333d16ea95d5"} Apr 23 17:56:50.493263 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:50.492816 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"930847a2-bfd1-40c9-ab02-201230e3ece9","Type":"ContainerStarted","Data":"6ab33a3d5fe8a7f7f63631c5073875b1cebf98df29fe9083e4105615b24693ee"} Apr 23 17:56:50.493263 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:50.492828 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"930847a2-bfd1-40c9-ab02-201230e3ece9","Type":"ContainerStarted","Data":"767bc77add98f348efff531f464d5450fdc6a8d7d23ec388f1e638ada4250001"} Apr 23 17:56:50.493263 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:50.492839 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"930847a2-bfd1-40c9-ab02-201230e3ece9","Type":"ContainerStarted","Data":"482777dfc366e28f9b8f12cc7a955d1bd44046ff5547d3fbd5f1e928324c5942"} Apr 23 17:56:50.493263 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:50.492851 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"930847a2-bfd1-40c9-ab02-201230e3ece9","Type":"ContainerStarted","Data":"4539ae3866cace6223a40089bfdf96b568c7eca1d9bc6af0e57e758fc8d0210f"} Apr 23 17:56:50.522746 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:50.522699 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.52268376 podStartE2EDuration="2.52268376s" podCreationTimestamp="2026-04-23 17:56:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:56:50.520126606 +0000 UTC m=+207.342594340" watchObservedRunningTime="2026-04-23 17:56:50.52268376 +0000 UTC m=+207.345151494" Apr 23 17:56:51.271434 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.271402 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:56:51.271875 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.271846 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="prometheus" containerID="cri-o://70a19b34eeaff693fadbe98dbda66d4a9eb5b84e020d678622df7b86e95db571" gracePeriod=600 Apr 23 17:56:51.271963 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.271906 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="thanos-sidecar" containerID="cri-o://8ac51988602267ef533ade514adbea0242c618c84ad2902c9553f4dd003ff49e" gracePeriod=600 Apr 23 17:56:51.271963 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.271888 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="kube-rbac-proxy" 
containerID="cri-o://b6bed4fbe32b42e9aacef143eb26ceccb54a8667d8f7c3217a6c68e69345c7df" gracePeriod=600 Apr 23 17:56:51.271963 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.271948 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="kube-rbac-proxy-web" containerID="cri-o://922cd5d3fa2d41a4736710b3c9608d891903a23205b23c3b93fdc45fcad71419" gracePeriod=600 Apr 23 17:56:51.272146 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.271903 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="kube-rbac-proxy-thanos" containerID="cri-o://da101d894a4d89d8391d820855a1bd992f89cc547827ed5d126bf968bf30400d" gracePeriod=600 Apr 23 17:56:51.272146 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.271931 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="config-reloader" containerID="cri-o://83ab36b82cc2e9353b1b128ee5f9a22ccf59b4bfc01c80689efc42bce596f8e5" gracePeriod=600 Apr 23 17:56:51.501305 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.501272 2578 generic.go:358] "Generic (PLEG): container finished" podID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerID="da101d894a4d89d8391d820855a1bd992f89cc547827ed5d126bf968bf30400d" exitCode=0 Apr 23 17:56:51.501305 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.501302 2578 generic.go:358] "Generic (PLEG): container finished" podID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerID="b6bed4fbe32b42e9aacef143eb26ceccb54a8667d8f7c3217a6c68e69345c7df" exitCode=0 Apr 23 17:56:51.501305 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.501313 2578 generic.go:358] "Generic (PLEG): container finished" podID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" 
containerID="922cd5d3fa2d41a4736710b3c9608d891903a23205b23c3b93fdc45fcad71419" exitCode=0 Apr 23 17:56:51.501305 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.501322 2578 generic.go:358] "Generic (PLEG): container finished" podID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerID="8ac51988602267ef533ade514adbea0242c618c84ad2902c9553f4dd003ff49e" exitCode=0 Apr 23 17:56:51.501895 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.501332 2578 generic.go:358] "Generic (PLEG): container finished" podID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerID="83ab36b82cc2e9353b1b128ee5f9a22ccf59b4bfc01c80689efc42bce596f8e5" exitCode=0 Apr 23 17:56:51.501895 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.501340 2578 generic.go:358] "Generic (PLEG): container finished" podID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerID="70a19b34eeaff693fadbe98dbda66d4a9eb5b84e020d678622df7b86e95db571" exitCode=0 Apr 23 17:56:51.501895 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.501348 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e","Type":"ContainerDied","Data":"da101d894a4d89d8391d820855a1bd992f89cc547827ed5d126bf968bf30400d"} Apr 23 17:56:51.501895 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.501385 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e","Type":"ContainerDied","Data":"b6bed4fbe32b42e9aacef143eb26ceccb54a8667d8f7c3217a6c68e69345c7df"} Apr 23 17:56:51.501895 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.501395 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e","Type":"ContainerDied","Data":"922cd5d3fa2d41a4736710b3c9608d891903a23205b23c3b93fdc45fcad71419"} Apr 23 17:56:51.501895 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.501404 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e","Type":"ContainerDied","Data":"8ac51988602267ef533ade514adbea0242c618c84ad2902c9553f4dd003ff49e"} Apr 23 17:56:51.501895 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.501412 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e","Type":"ContainerDied","Data":"83ab36b82cc2e9353b1b128ee5f9a22ccf59b4bfc01c80689efc42bce596f8e5"} Apr 23 17:56:51.501895 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.501421 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e","Type":"ContainerDied","Data":"70a19b34eeaff693fadbe98dbda66d4a9eb5b84e020d678622df7b86e95db571"} Apr 23 17:56:51.526186 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.526123 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:51.541078 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.541050 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-kube-rbac-proxy\") pod \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " Apr 23 17:56:51.541221 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.541182 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-prometheus-k8s-tls\") pod \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " Apr 23 17:56:51.541295 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.541225 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-grpc-tls\") pod \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " Apr 23 17:56:51.541295 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.541256 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-configmap-serving-certs-ca-bundle\") pod \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " Apr 23 17:56:51.541295 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.541287 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-prometheus-k8s-db\") pod \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\" (UID: 
\"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " Apr 23 17:56:51.541445 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.541314 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-web-config\") pod \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " Apr 23 17:56:51.541445 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.541360 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-prometheus-trusted-ca-bundle\") pod \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " Apr 23 17:56:51.541445 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.541391 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " Apr 23 17:56:51.541445 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.541433 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-metrics-client-certs\") pod \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " Apr 23 17:56:51.541635 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.541457 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-config\") pod \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " Apr 23 17:56:51.541635 
ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.541512 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-configmap-metrics-client-ca\") pod \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " Apr 23 17:56:51.541635 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.541545 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhk6l\" (UniqueName: \"kubernetes.io/projected/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-kube-api-access-qhk6l\") pod \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " Apr 23 17:56:51.541635 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.541576 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-prometheus-k8s-rulefiles-0\") pod \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " Apr 23 17:56:51.541635 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.541601 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-configmap-kubelet-serving-ca-bundle\") pod \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " Apr 23 17:56:51.541858 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.541640 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " Apr 
23 17:56:51.541858 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.541669 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-tls-assets\") pod \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " Apr 23 17:56:51.541858 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.541675 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" (UID: "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:56:51.541858 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.541699 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-thanos-prometheus-http-client-file\") pod \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " Apr 23 17:56:51.542037 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.541948 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" (UID: "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:56:51.542299 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.542185 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:51.542299 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.542209 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-prometheus-trusted-ca-bundle\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:51.542299 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.542282 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" (UID: "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:56:51.543297 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.542962 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" (UID: "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:56:51.543297 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.542970 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" (UID: "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:56:51.545930 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.545900 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" (UID: "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:56:51.547029 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.547002 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" (UID: "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:56:51.547154 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.547134 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" (UID: "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e"). 
InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:56:51.547236 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.547219 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" (UID: "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:56:51.547556 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.547536 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" (UID: "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:56:51.547659 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.547640 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" (UID: "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:56:51.547942 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.547920 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-config" (OuterVolumeSpecName: "config") pod "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" (UID: "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:56:51.548072 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.548048 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" (UID: "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:56:51.549529 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.549498 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" (UID: "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:56:51.553759 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.553178 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-kube-api-access-qhk6l" (OuterVolumeSpecName: "kube-api-access-qhk6l") pod "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" (UID: "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e"). InnerVolumeSpecName "kube-api-access-qhk6l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:56:51.553759 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.553204 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" (UID: "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:56:51.564401 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.564361 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-web-config" (OuterVolumeSpecName: "web-config") pod "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" (UID: "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 17:56:51.642901 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.642867 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-config-out\") pod \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\" (UID: \"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e\") " Apr 23 17:56:51.643044 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.643014 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:51.643044 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.643028 2578 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-metrics-client-certs\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:51.643044 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.643039 2578 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-config\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:51.643184 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.643048 2578 reconciler_common.go:299] "Volume detached for volume 
\"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-configmap-metrics-client-ca\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:51.643184 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.643058 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qhk6l\" (UniqueName: \"kubernetes.io/projected/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-kube-api-access-qhk6l\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:51.643184 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.643066 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:51.643184 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.643075 2578 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:51.643184 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.643103 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:51.643184 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.643112 2578 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-tls-assets\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:51.643184 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.643121 2578 reconciler_common.go:299] "Volume detached 
for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-thanos-prometheus-http-client-file\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:51.643184 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.643130 2578 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-kube-rbac-proxy\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:51.643184 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.643138 2578 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-prometheus-k8s-tls\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:51.643184 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.643147 2578 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-secret-grpc-tls\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:51.643184 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.643155 2578 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-prometheus-k8s-db\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:51.643184 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.643166 2578 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-web-config\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:51.644836 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.644813 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-config-out" (OuterVolumeSpecName: "config-out") pod "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" (UID: "8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:56:51.743508 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:51.743476 2578 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e-config-out\") on node \"ip-10-0-142-63.ec2.internal\" DevicePath \"\"" Apr 23 17:56:52.506746 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.506713 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e","Type":"ContainerDied","Data":"1be0116d4366cc105a9e41e0f701adf689d0cb30cd99fa3cba683876af77e67f"} Apr 23 17:56:52.507142 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.506761 2578 scope.go:117] "RemoveContainer" containerID="da101d894a4d89d8391d820855a1bd992f89cc547827ed5d126bf968bf30400d" Apr 23 17:56:52.507142 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.506764 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.514582 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.514562 2578 scope.go:117] "RemoveContainer" containerID="b6bed4fbe32b42e9aacef143eb26ceccb54a8667d8f7c3217a6c68e69345c7df" Apr 23 17:56:52.522118 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.522078 2578 scope.go:117] "RemoveContainer" containerID="922cd5d3fa2d41a4736710b3c9608d891903a23205b23c3b93fdc45fcad71419" Apr 23 17:56:52.528966 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.528947 2578 scope.go:117] "RemoveContainer" containerID="8ac51988602267ef533ade514adbea0242c618c84ad2902c9553f4dd003ff49e" Apr 23 17:56:52.529389 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.529366 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:56:52.533577 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.533553 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:56:52.538407 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.538340 2578 scope.go:117] "RemoveContainer" containerID="83ab36b82cc2e9353b1b128ee5f9a22ccf59b4bfc01c80689efc42bce596f8e5" Apr 23 17:56:52.544797 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.544777 2578 scope.go:117] "RemoveContainer" containerID="70a19b34eeaff693fadbe98dbda66d4a9eb5b84e020d678622df7b86e95db571" Apr 23 17:56:52.551653 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.551623 2578 scope.go:117] "RemoveContainer" containerID="49b556de5e0c6e1ecee13409efacec517e493030a838ecbec4325dbb06a8ef68" Apr 23 17:56:52.563022 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563002 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:56:52.563390 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563370 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" 
containerName="init-config-reloader" Apr 23 17:56:52.563390 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563392 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="init-config-reloader" Apr 23 17:56:52.563529 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563409 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="thanos-sidecar" Apr 23 17:56:52.563529 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563418 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="thanos-sidecar" Apr 23 17:56:52.563529 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563440 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="kube-rbac-proxy-thanos" Apr 23 17:56:52.563529 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563447 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="kube-rbac-proxy-thanos" Apr 23 17:56:52.563529 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563456 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="prometheus" Apr 23 17:56:52.563529 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563463 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="prometheus" Apr 23 17:56:52.563529 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563472 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="kube-rbac-proxy" Apr 23 17:56:52.563529 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563478 2578 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="kube-rbac-proxy" Apr 23 17:56:52.563529 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563489 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="kube-rbac-proxy-web" Apr 23 17:56:52.563529 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563496 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="kube-rbac-proxy-web" Apr 23 17:56:52.563529 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563508 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="config-reloader" Apr 23 17:56:52.563529 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563516 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="config-reloader" Apr 23 17:56:52.564206 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563612 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="config-reloader" Apr 23 17:56:52.564206 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563623 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="kube-rbac-proxy" Apr 23 17:56:52.564206 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563635 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="thanos-sidecar" Apr 23 17:56:52.564206 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563642 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="kube-rbac-proxy-thanos" Apr 23 17:56:52.564206 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563652 2578 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="prometheus" Apr 23 17:56:52.564206 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.563662 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" containerName="kube-rbac-proxy-web" Apr 23 17:56:52.566818 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.566802 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.569673 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.569646 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 17:56:52.569812 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.569689 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 23 17:56:52.569812 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.569770 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-mh29w\"" Apr 23 17:56:52.569910 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.569843 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 17:56:52.569910 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.569860 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 17:56:52.569910 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.569893 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 17:56:52.570279 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.570261 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 17:56:52.570482 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.570303 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-d6lu1r4c085fu\"" Apr 23 17:56:52.570482 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.570360 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 17:56:52.570482 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.570461 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 23 17:56:52.570646 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.570630 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 23 17:56:52.570840 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.570704 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 17:56:52.570840 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.570805 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23 17:56:52.572995 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.572978 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 17:56:52.575892 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.575872 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 17:56:52.582033 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.582015 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 
23 17:56:52.649614 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.649585 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b86e05a9-f3dc-49ae-b920-dc18edb23069-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.649743 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.649618 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b86e05a9-f3dc-49ae-b920-dc18edb23069-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.649743 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.649637 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.649743 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.649664 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b86e05a9-f3dc-49ae-b920-dc18edb23069-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.649743 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.649709 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-config\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.649743 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.649734 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.649930 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.649756 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwkb7\" (UniqueName: \"kubernetes.io/projected/b86e05a9-f3dc-49ae-b920-dc18edb23069-kube-api-access-wwkb7\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.649930 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.649777 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-web-config\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.649930 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.649793 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.649930 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.649823 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b86e05a9-f3dc-49ae-b920-dc18edb23069-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.649930 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.649842 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.649930 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.649857 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.649930 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.649872 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b86e05a9-f3dc-49ae-b920-dc18edb23069-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.649930 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.649911 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: 
\"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.650243 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.649942 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b86e05a9-f3dc-49ae-b920-dc18edb23069-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.650243 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.649959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b86e05a9-f3dc-49ae-b920-dc18edb23069-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.650243 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.650016 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.650243 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.650050 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b86e05a9-f3dc-49ae-b920-dc18edb23069-config-out\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.750519 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.750480 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.750670 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.750524 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b86e05a9-f3dc-49ae-b920-dc18edb23069-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.750670 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.750544 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.750670 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.750609 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.750670 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.750637 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b86e05a9-f3dc-49ae-b920-dc18edb23069-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.750884 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.750731 
2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.750884 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.750753 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b86e05a9-f3dc-49ae-b920-dc18edb23069-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.750884 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.750780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b86e05a9-f3dc-49ae-b920-dc18edb23069-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.750884 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.750812 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.751207 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.751160 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/b86e05a9-f3dc-49ae-b920-dc18edb23069-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.751606 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.751575 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b86e05a9-f3dc-49ae-b920-dc18edb23069-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.751802 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.751764 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b86e05a9-f3dc-49ae-b920-dc18edb23069-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.751802 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.751765 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b86e05a9-f3dc-49ae-b920-dc18edb23069-config-out\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.751982 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.751874 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b86e05a9-f3dc-49ae-b920-dc18edb23069-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.751982 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.751914 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/b86e05a9-f3dc-49ae-b920-dc18edb23069-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.751982 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.751941 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.751982 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.751972 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b86e05a9-f3dc-49ae-b920-dc18edb23069-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.752210 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.752016 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-config\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.752210 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.752046 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.752210 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.752070 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wwkb7\" 
(UniqueName: \"kubernetes.io/projected/b86e05a9-f3dc-49ae-b920-dc18edb23069-kube-api-access-wwkb7\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.752210 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.752117 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-web-config\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.753715 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.753516 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b86e05a9-f3dc-49ae-b920-dc18edb23069-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.753715 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.753700 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.755344 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.754192 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.755344 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.754291 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.755344 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.754735 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b86e05a9-f3dc-49ae-b920-dc18edb23069-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.755344 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.754929 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b86e05a9-f3dc-49ae-b920-dc18edb23069-config-out\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.755344 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.755159 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b86e05a9-f3dc-49ae-b920-dc18edb23069-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.755344 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.755306 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.755344 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.755316 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.755756 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.755418 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-web-config\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.756717 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.756694 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.756834 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.756778 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b86e05a9-f3dc-49ae-b920-dc18edb23069-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.757286 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.757261 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-config\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.757382 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.757367 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b86e05a9-f3dc-49ae-b920-dc18edb23069-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.762747 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.762727 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwkb7\" (UniqueName: \"kubernetes.io/projected/b86e05a9-f3dc-49ae-b920-dc18edb23069-kube-api-access-wwkb7\") pod \"prometheus-k8s-0\" (UID: \"b86e05a9-f3dc-49ae-b920-dc18edb23069\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:52.877955 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:52.877921 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:56:53.004911 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:53.004852 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 17:56:53.006998 ip-10-0-142-63 kubenswrapper[2578]: W0423 17:56:53.006971 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb86e05a9_f3dc_49ae_b920_dc18edb23069.slice/crio-9a93b4abb6951bd4779393db90d218a8f09470d307c69abad82f1da2c518d910 WatchSource:0}: Error finding container 9a93b4abb6951bd4779393db90d218a8f09470d307c69abad82f1da2c518d910: Status 404 returned error can't find the container with id 9a93b4abb6951bd4779393db90d218a8f09470d307c69abad82f1da2c518d910 Apr 23 17:56:53.510696 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:53.510660 2578 generic.go:358] "Generic (PLEG): container finished" podID="b86e05a9-f3dc-49ae-b920-dc18edb23069" containerID="7f0b9fcd35be70a3746f177475c748f5d45ccde91e89e3468d24ee756d724391" exitCode=0 Apr 23 17:56:53.511180 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:53.510761 2578 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b86e05a9-f3dc-49ae-b920-dc18edb23069","Type":"ContainerDied","Data":"7f0b9fcd35be70a3746f177475c748f5d45ccde91e89e3468d24ee756d724391"} Apr 23 17:56:53.511180 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:53.510805 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b86e05a9-f3dc-49ae-b920-dc18edb23069","Type":"ContainerStarted","Data":"9a93b4abb6951bd4779393db90d218a8f09470d307c69abad82f1da2c518d910"} Apr 23 17:56:53.759564 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:53.756103 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e" path="/var/lib/kubelet/pods/8aa2b3bc-4b20-40f8-9dab-7dbfbba8500e/volumes" Apr 23 17:56:54.517683 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:54.517650 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b86e05a9-f3dc-49ae-b920-dc18edb23069","Type":"ContainerStarted","Data":"aa649d0badbe5569428d63701fadcca79ef417622cc6b3ef59d287b68fea5723"} Apr 23 17:56:54.517683 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:54.517684 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b86e05a9-f3dc-49ae-b920-dc18edb23069","Type":"ContainerStarted","Data":"f4f2262f3e18b604ae5b5d8354082cd7d2bf70d4c4d2efdb986078f4d2d3b9e0"} Apr 23 17:56:54.518114 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:54.517694 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b86e05a9-f3dc-49ae-b920-dc18edb23069","Type":"ContainerStarted","Data":"e074bfb99e616335b64d8cb748bc97391853a6e950335fc0365fb0baf874c715"} Apr 23 17:56:54.518114 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:54.517703 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"b86e05a9-f3dc-49ae-b920-dc18edb23069","Type":"ContainerStarted","Data":"b117e3d6b45976e628e0c0f38c19a4886f42d1822587f2d3f0296cf97204a921"} Apr 23 17:56:54.518114 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:54.517711 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b86e05a9-f3dc-49ae-b920-dc18edb23069","Type":"ContainerStarted","Data":"603e6a9e387dcac64e7ea677ab8483e778ebf02c7d941dcb91973d21599a05ce"} Apr 23 17:56:54.518114 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:54.517719 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"b86e05a9-f3dc-49ae-b920-dc18edb23069","Type":"ContainerStarted","Data":"c375ce6fd3e4fe960929c7c054f0d8096d6edb540ecccae426f1094110aff744"} Apr 23 17:56:54.545010 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:54.544957 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.544938203 podStartE2EDuration="2.544938203s" podCreationTimestamp="2026-04-23 17:56:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:56:54.543334232 +0000 UTC m=+211.365801992" watchObservedRunningTime="2026-04-23 17:56:54.544938203 +0000 UTC m=+211.367405941" Apr 23 17:56:57.878680 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:56:57.878629 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:52.879034 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:57:52.878988 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 17:57:52.894439 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:57:52.894416 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:57:53.706551 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:57:53.706524 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 17:58:23.654849 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:58:23.654823 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 17:58:23.655344 ip-10-0-142-63 kubenswrapper[2578]: I0423 17:58:23.654878 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:03:23.681984 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:03:23.681951 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:03:23.682565 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:03:23.682194 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:08:23.706420 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:08:23.706389 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:08:23.707128 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:08:23.707107 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:13:23.730777 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:13:23.730745 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:13:23.732367 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:13:23.732346 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:18:23.761155 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:18:23.761122 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:18:23.763601 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:18:23.762684 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:23:23.784360 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:23:23.784259 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:23:23.791818 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:23:23.786075 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:28:23.807834 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:28:23.807809 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:28:23.810403 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:28:23.809880 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:33:23.831961 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:33:23.831844 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:33:23.835969 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:33:23.835808 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:38:23.856352 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:38:23.856241 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:38:23.866029 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:38:23.866007 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:43:23.883602 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:43:23.883490 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:43:23.890049 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:43:23.890024 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:48:23.907181 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:48:23.907053 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:48:23.913965 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:48:23.913944 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:51:27.601006 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:27.600966 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-dr7p4_16856cbe-119d-49c7-aa8f-a7d0f4002555/global-pull-secret-syncer/0.log"
Apr 23 18:51:27.722894 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:27.722847 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-442hb_97a978cb-3849-4c89-bce7-b7b3126e771f/konnectivity-agent/0.log"
Apr 23 18:51:27.849999 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:27.849962 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-142-63.ec2.internal_fb1b5e25e4662b1f0eb7136557c5a4df/haproxy/0.log"
Apr 23 18:51:31.118152 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.118118 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_930847a2-bfd1-40c9-ab02-201230e3ece9/alertmanager/0.log"
Apr 23 18:51:31.143137 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.143108 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_930847a2-bfd1-40c9-ab02-201230e3ece9/config-reloader/0.log"
Apr 23 18:51:31.168703 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.168677 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_930847a2-bfd1-40c9-ab02-201230e3ece9/kube-rbac-proxy-web/0.log"
Apr 23 18:51:31.190858 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.190838 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_930847a2-bfd1-40c9-ab02-201230e3ece9/kube-rbac-proxy/0.log"
Apr 23 18:51:31.218587 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.218565 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_930847a2-bfd1-40c9-ab02-201230e3ece9/kube-rbac-proxy-metric/0.log"
Apr 23 18:51:31.241113 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.241070 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_930847a2-bfd1-40c9-ab02-201230e3ece9/prom-label-proxy/0.log"
Apr 23 18:51:31.262639 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.262620 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_930847a2-bfd1-40c9-ab02-201230e3ece9/init-config-reloader/0.log"
Apr 23 18:51:31.344915 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.344889 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-99rsf_aefeedb2-a459-4b8f-9510-da8a136c2add/kube-state-metrics/0.log"
Apr 23 18:51:31.377263 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.377236 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-99rsf_aefeedb2-a459-4b8f-9510-da8a136c2add/kube-rbac-proxy-main/0.log"
Apr 23 18:51:31.402390 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.402368 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-99rsf_aefeedb2-a459-4b8f-9510-da8a136c2add/kube-rbac-proxy-self/0.log"
Apr 23 18:51:31.466012 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.465985 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-z5dwz_eb4573a0-b31c-44f4-aab7-34751555bf31/monitoring-plugin/0.log"
Apr 23 18:51:31.509859 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.509816 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8c8wq_20babaa9-49a8-431c-a51c-fc72be72a2cb/node-exporter/0.log"
Apr 23 18:51:31.533942 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.533922 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8c8wq_20babaa9-49a8-431c-a51c-fc72be72a2cb/kube-rbac-proxy/0.log"
Apr 23 18:51:31.576513 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.576491 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8c8wq_20babaa9-49a8-431c-a51c-fc72be72a2cb/init-textfile/0.log"
Apr 23 18:51:31.778963 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.778883 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-brvjh_3c070709-8b02-40df-a7c8-e5b5d0ad22a6/kube-rbac-proxy-main/0.log"
Apr 23 18:51:31.801989 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.801963 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-brvjh_3c070709-8b02-40df-a7c8-e5b5d0ad22a6/kube-rbac-proxy-self/0.log"
Apr 23 18:51:31.830626 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.830597 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-brvjh_3c070709-8b02-40df-a7c8-e5b5d0ad22a6/openshift-state-metrics/0.log"
Apr 23 18:51:31.867894 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.867863 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b86e05a9-f3dc-49ae-b920-dc18edb23069/prometheus/0.log"
Apr 23 18:51:31.884182 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.884159 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b86e05a9-f3dc-49ae-b920-dc18edb23069/config-reloader/0.log"
Apr 23 18:51:31.909682 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.909655 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b86e05a9-f3dc-49ae-b920-dc18edb23069/thanos-sidecar/0.log"
Apr 23 18:51:31.933536 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.933516 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b86e05a9-f3dc-49ae-b920-dc18edb23069/kube-rbac-proxy-web/0.log"
Apr 23 18:51:31.958682 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.958662 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b86e05a9-f3dc-49ae-b920-dc18edb23069/kube-rbac-proxy/0.log"
Apr 23 18:51:31.981042 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:31.980995 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b86e05a9-f3dc-49ae-b920-dc18edb23069/kube-rbac-proxy-thanos/0.log"
Apr 23 18:51:32.003231 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:32.003175 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_b86e05a9-f3dc-49ae-b920-dc18edb23069/init-config-reloader/0.log"
Apr 23 18:51:32.034891 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:32.034825 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9fdwj_ad280719-5945-4ba9-b574-bb0345e669c9/prometheus-operator/0.log"
Apr 23 18:51:32.056915 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:32.056891 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9fdwj_ad280719-5945-4ba9-b574-bb0345e669c9/kube-rbac-proxy/0.log"
Apr 23 18:51:32.090276 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:32.090233 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-hxgc8_25e3c9d3-09db-4fd7-8fb2-232077749fa6/prometheus-operator-admission-webhook/0.log"
Apr 23 18:51:33.501154 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:33.501120 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-n6mwn_904ab1d8-f170-427c-b547-546b37cd8388/networking-console-plugin/0.log"
Apr 23 18:51:34.480654 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.480623 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"]
Apr 23 18:51:34.484167 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.484143 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"
Apr 23 18:51:34.486366 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.486341 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8jlv4\"/\"openshift-service-ca.crt\""
Apr 23 18:51:34.486611 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.486595 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8jlv4\"/\"kube-root-ca.crt\""
Apr 23 18:51:34.487461 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.487445 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8jlv4\"/\"default-dockercfg-kt7rv\""
Apr 23 18:51:34.491011 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.490960 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"]
Apr 23 18:51:34.563672 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.563636 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d60729d6-1a88-442e-950e-29035b68d734-proc\") pod \"perf-node-gather-daemonset-tbxhw\" (UID: \"d60729d6-1a88-442e-950e-29035b68d734\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"
Apr 23 18:51:34.564066 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.563689 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d60729d6-1a88-442e-950e-29035b68d734-lib-modules\") pod \"perf-node-gather-daemonset-tbxhw\" (UID: \"d60729d6-1a88-442e-950e-29035b68d734\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"
Apr 23 18:51:34.564066 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.563756 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d60729d6-1a88-442e-950e-29035b68d734-podres\") pod \"perf-node-gather-daemonset-tbxhw\" (UID: \"d60729d6-1a88-442e-950e-29035b68d734\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"
Apr 23 18:51:34.564066 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.563808 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87vgc\" (UniqueName: \"kubernetes.io/projected/d60729d6-1a88-442e-950e-29035b68d734-kube-api-access-87vgc\") pod \"perf-node-gather-daemonset-tbxhw\" (UID: \"d60729d6-1a88-442e-950e-29035b68d734\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"
Apr 23 18:51:34.564066 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.563831 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d60729d6-1a88-442e-950e-29035b68d734-sys\") pod \"perf-node-gather-daemonset-tbxhw\" (UID: \"d60729d6-1a88-442e-950e-29035b68d734\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"
Apr 23 18:51:34.665116 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.665068 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-87vgc\" (UniqueName: \"kubernetes.io/projected/d60729d6-1a88-442e-950e-29035b68d734-kube-api-access-87vgc\") pod \"perf-node-gather-daemonset-tbxhw\" (UID: \"d60729d6-1a88-442e-950e-29035b68d734\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"
Apr 23 18:51:34.665116 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.665123 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d60729d6-1a88-442e-950e-29035b68d734-sys\") pod \"perf-node-gather-daemonset-tbxhw\" (UID: \"d60729d6-1a88-442e-950e-29035b68d734\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"
Apr 23 18:51:34.665352 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.665174 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d60729d6-1a88-442e-950e-29035b68d734-proc\") pod \"perf-node-gather-daemonset-tbxhw\" (UID: \"d60729d6-1a88-442e-950e-29035b68d734\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"
Apr 23 18:51:34.665352 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.665210 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d60729d6-1a88-442e-950e-29035b68d734-lib-modules\") pod \"perf-node-gather-daemonset-tbxhw\" (UID: \"d60729d6-1a88-442e-950e-29035b68d734\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"
Apr 23 18:51:34.665352 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.665229 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d60729d6-1a88-442e-950e-29035b68d734-podres\") pod \"perf-node-gather-daemonset-tbxhw\" (UID: \"d60729d6-1a88-442e-950e-29035b68d734\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"
Apr 23 18:51:34.665352 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.665269 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d60729d6-1a88-442e-950e-29035b68d734-sys\") pod \"perf-node-gather-daemonset-tbxhw\" (UID: \"d60729d6-1a88-442e-950e-29035b68d734\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"
Apr 23 18:51:34.665352 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.665288 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d60729d6-1a88-442e-950e-29035b68d734-proc\") pod \"perf-node-gather-daemonset-tbxhw\" (UID: \"d60729d6-1a88-442e-950e-29035b68d734\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"
Apr 23 18:51:34.665518 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.665357 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d60729d6-1a88-442e-950e-29035b68d734-podres\") pod \"perf-node-gather-daemonset-tbxhw\" (UID: \"d60729d6-1a88-442e-950e-29035b68d734\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"
Apr 23 18:51:34.665518 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.665360 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d60729d6-1a88-442e-950e-29035b68d734-lib-modules\") pod \"perf-node-gather-daemonset-tbxhw\" (UID: \"d60729d6-1a88-442e-950e-29035b68d734\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"
Apr 23 18:51:34.673260 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.673240 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-87vgc\" (UniqueName: \"kubernetes.io/projected/d60729d6-1a88-442e-950e-29035b68d734-kube-api-access-87vgc\") pod \"perf-node-gather-daemonset-tbxhw\" (UID: \"d60729d6-1a88-442e-950e-29035b68d734\") " pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"
Apr 23 18:51:34.716209 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.716179 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-5f6sp_8d77429e-b156-4f21-8e2d-958b0183cfe9/volume-data-source-validator/0.log"
Apr 23 18:51:34.795325 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.795302 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"
Apr 23 18:51:34.911741 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.911715 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"]
Apr 23 18:51:34.914039 ip-10-0-142-63 kubenswrapper[2578]: W0423 18:51:34.914010 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd60729d6_1a88_442e_950e_29035b68d734.slice/crio-741018427ebde2eac6c2f1880531a1f72ac3a70c94fd3334cff6f68e73e7572b WatchSource:0}: Error finding container 741018427ebde2eac6c2f1880531a1f72ac3a70c94fd3334cff6f68e73e7572b: Status 404 returned error can't find the container with id 741018427ebde2eac6c2f1880531a1f72ac3a70c94fd3334cff6f68e73e7572b
Apr 23 18:51:34.915832 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:34.915814 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 18:51:35.119113 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:35.119020 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw" event={"ID":"d60729d6-1a88-442e-950e-29035b68d734","Type":"ContainerStarted","Data":"f4be61253e7d83703f324c996f50bfe95816173fa6e34115f1c02e2a1b4ac0dd"}
Apr 23 18:51:35.119113 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:35.119059 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw" event={"ID":"d60729d6-1a88-442e-950e-29035b68d734","Type":"ContainerStarted","Data":"741018427ebde2eac6c2f1880531a1f72ac3a70c94fd3334cff6f68e73e7572b"}
Apr 23 18:51:35.119113 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:35.119081 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"
Apr 23 18:51:35.366529 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:35.366498 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7t847_17f79bf8-0319-418c-a852-2ce9a897e648/dns/0.log"
Apr 23 18:51:35.387283 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:35.387213 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7t847_17f79bf8-0319-418c-a852-2ce9a897e648/kube-rbac-proxy/0.log"
Apr 23 18:51:35.520427 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:35.520398 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mn726_890e39b8-16d9-4ffa-9934-ca657c99daf2/dns-node-resolver/0.log"
Apr 23 18:51:35.997682 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:35.997648 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qjrbv_308874bf-36fb-4296-aa6f-8568677e83c4/node-ca/0.log"
Apr 23 18:51:37.050377 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:37.050346 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-j7wvx_f33cc58f-96ef-4c06-8b13-ae89d3b2c805/serve-healthcheck-canary/0.log"
Apr 23 18:51:37.538772 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:37.538683 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kcw7j_e1cea0c8-f980-48ab-9c67-01d902f521d7/kube-rbac-proxy/0.log"
Apr 23 18:51:37.557333 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:37.557302 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kcw7j_e1cea0c8-f980-48ab-9c67-01d902f521d7/exporter/0.log"
Apr 23 18:51:37.576415 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:37.576389 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-kcw7j_e1cea0c8-f980-48ab-9c67-01d902f521d7/extractor/0.log"
Apr 23 18:51:41.131623 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:41.131591 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw"
Apr 23 18:51:41.157720 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:41.157668 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8jlv4/perf-node-gather-daemonset-tbxhw" podStartSLOduration=7.15765269 podStartE2EDuration="7.15765269s" podCreationTimestamp="2026-04-23 18:51:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 18:51:35.135806781 +0000 UTC m=+3491.958274516" watchObservedRunningTime="2026-04-23 18:51:41.15765269 +0000 UTC m=+3497.980120425"
Apr 23 18:51:45.510157 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:45.510123 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-48vg7_a6b6e3a3-edb0-41a4-877a-1eed7a82403d/kube-multus/0.log"
Apr 23 18:51:45.579696 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:45.579669 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2krs5_5a131586-128b-4207-ac02-4240d9075bc2/kube-multus-additional-cni-plugins/0.log"
Apr 23 18:51:45.604938 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:45.604885 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2krs5_5a131586-128b-4207-ac02-4240d9075bc2/egress-router-binary-copy/0.log"
Apr 23 18:51:45.624942 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:45.624916 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2krs5_5a131586-128b-4207-ac02-4240d9075bc2/cni-plugins/0.log"
Apr 23 18:51:45.646664 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:45.646636 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2krs5_5a131586-128b-4207-ac02-4240d9075bc2/bond-cni-plugin/0.log"
Apr 23 18:51:45.670800 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:45.670721 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2krs5_5a131586-128b-4207-ac02-4240d9075bc2/routeoverride-cni/0.log"
Apr 23 18:51:45.694309 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:45.694288 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2krs5_5a131586-128b-4207-ac02-4240d9075bc2/whereabouts-cni-bincopy/0.log"
Apr 23 18:51:45.721199 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:45.721171 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2krs5_5a131586-128b-4207-ac02-4240d9075bc2/whereabouts-cni/0.log"
Apr 23 18:51:46.146112 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:46.146054 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mqfsb_e70550da-839d-4462-b368-c0139f793c15/network-metrics-daemon/0.log"
Apr 23 18:51:46.164797 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:46.164769 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mqfsb_e70550da-839d-4462-b368-c0139f793c15/kube-rbac-proxy/0.log"
Apr 23 18:51:47.661892 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:47.661859 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-controller/0.log"
Apr 23 18:51:47.681388 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:47.681356 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/0.log"
Apr 23 18:51:47.696559 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:47.696534 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovn-acl-logging/1.log"
Apr 23 18:51:47.714096 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:47.714069 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/kube-rbac-proxy-node/0.log"
Apr 23 18:51:47.737010 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:47.736987 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 18:51:47.754431 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:47.754404 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/northd/0.log"
Apr 23 18:51:47.774251 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:47.774227 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/nbdb/0.log"
Apr 23 18:51:47.799753 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:47.799728 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/sbdb/0.log"
Apr 23 18:51:47.899624 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:47.899587 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-x2gvq_2fb5e8ca-0609-4dd5-ac79-69c12ad152a3/ovnkube-controller/0.log"
Apr 23 18:51:48.858907 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:48.858863 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-2t94g_47475636-63bf-4a11-9285-cce1b1df596d/check-endpoints/0.log"
Apr 23 18:51:48.909749 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:48.909722 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-bztd4_0c49641e-88eb-49d0-b1e0-5408152b701d/network-check-target-container/0.log"
Apr 23 18:51:49.857244 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:49.857204 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-xvbsl_2467946b-effa-4a29-a822-8670defce032/iptables-alerter/0.log"
Apr 23 18:51:50.468991 ip-10-0-142-63 kubenswrapper[2578]: I0423 18:51:50.468959 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-t84lj_7560deb4-54dc-4f99-a04b-c7e973e8b201/tuned/0.log"