Apr 21 07:51:40.035184 ip-10-0-137-194 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 07:51:40.544769 ip-10-0-137-194 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 07:51:40.544769 ip-10-0-137-194 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 07:51:40.544769 ip-10-0-137-194 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 07:51:40.544769 ip-10-0-137-194 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 07:51:40.544769 ip-10-0-137-194 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
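The deprecation warnings above all point at the same remedy: move the flag values into the KubeletConfiguration file named by --config (here /etc/kubernetes/kubelet.conf). A minimal sketch of the equivalent config-file fields, with illustrative values (the resource sizes and eviction thresholds are assumptions, not taken from this node):

```yaml
# Illustrative KubeletConfiguration fragment replacing the deprecated flags.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved (values here are examples only)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: "100Mi"
```

On an OpenShift node this file is rendered by the machine-config operator, so direct edits would be reconciled away; the fragment is only meant to show which config-file keys the warnings refer to.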
Apr 21 07:51:40.545799 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.545711 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 07:51:40.550063 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550038 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:51:40.550063 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550056 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:51:40.550063 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550061 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:51:40.550063 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550065 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:51:40.550063 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550071 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550075 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550080 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550083 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550086 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550090 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550094 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550098 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550101 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550105 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550109 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550113 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550117 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550120 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550124 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550128 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550131 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550135 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550139 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550143 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:51:40.550366 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550147 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550150 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550154 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550158 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550161 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550165 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550169 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550173 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550177 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550181 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550185 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550192 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550198 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550203 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550208 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550213 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550217 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550221 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550225 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550229 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:51:40.551194 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550233 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550237 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550241 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550245 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550249 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550253 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550257 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550261 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550265 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550269 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550274 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550278 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550282 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550286 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550290 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550294 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550298 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550303 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550307 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550314 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:51:40.552073 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550319 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:51:40.552945 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550323 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:51:40.552945 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550327 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:51:40.552945 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550331 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:51:40.552945 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550335 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:51:40.552945 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550339 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:51:40.552945 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550345 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:51:40.552945 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550350 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:51:40.552945 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550354 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:51:40.552945 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550359 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:51:40.552945 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550363 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:51:40.552945 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550369 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:51:40.552945 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550373 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:51:40.552945 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550377 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:51:40.552945 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550391 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:51:40.552945 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550396 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:51:40.552945 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550400 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:51:40.552945 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550404 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:51:40.552945 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550409 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:51:40.552945 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550413 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550417 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.550421 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551101 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551111 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551116 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551120 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551124 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551128 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551133 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551138 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551142 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551146 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551150 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551155 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551159 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551163 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551167 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551172 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551177 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:51:40.553460 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551182 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:51:40.554082 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551187 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:51:40.554082 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551190 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:51:40.554082 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551194 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:51:40.554082 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551198 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:51:40.554082 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551203 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:51:40.554082 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551207 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:51:40.554082 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551219 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:51:40.554082 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551223 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:51:40.554082 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551228 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:51:40.554082 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551232 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:51:40.554082 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551236 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:51:40.554082 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551240 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:51:40.554082 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551245 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:51:40.554082 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551249 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:51:40.554082 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551253 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:51:40.554082 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551257 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:51:40.554082 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551261 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:51:40.554082 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551265 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:51:40.554082 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551269 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:51:40.554774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551276 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:51:40.554774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551282 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:51:40.554774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551290 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:51:40.554774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551297 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:51:40.554774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551302 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:51:40.554774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551306 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:51:40.554774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551311 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:51:40.554774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551316 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:51:40.554774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551322 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:51:40.554774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551326 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:51:40.554774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551330 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:51:40.554774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551335 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:51:40.554774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551339 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:51:40.554774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551344 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:51:40.554774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551348 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:51:40.554774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551352 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:51:40.554774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551356 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:51:40.554774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551360 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:51:40.554774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551365 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:51:40.555287 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551370 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:51:40.555287 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551382 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:51:40.555287 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551386 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:51:40.555287 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551390 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:51:40.555287 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551394 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:51:40.555287 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551398 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:51:40.555287 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551404 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:51:40.555287 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551408 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:51:40.555287 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551412 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:51:40.555287 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551416 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:51:40.555287 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551420 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:51:40.555287 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551424 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:51:40.555287 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551427 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:51:40.555287 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551432 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:51:40.555287 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551436 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:51:40.555287 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551439 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:51:40.555287 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551444 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:51:40.555287 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551448 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:51:40.555287 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551452 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551456 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551460 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551464 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551468 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551472 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551478 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551482 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551487 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551491 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551495 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.551499 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551603 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551613 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551626 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551632 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551647 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551653 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551660 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551667 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551672 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 07:51:40.555774 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551677 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551683 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551688 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551693 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551697 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551702 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551707 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551711 2570 flags.go:64] FLAG: --cloud-config=""
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551716 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551720 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551731 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551736 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551742 2570 flags.go:64] FLAG: --config-dir=""
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551747 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551752 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551758 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551763 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551769 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551774 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551779 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551784 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551789 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551794 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551798 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551805 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 21 07:51:40.556379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551810 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551815 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551819 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551832 2570 flags.go:64] FLAG: --enable-server="true"
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551836 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551852 2570 flags.go:64] FLAG: --event-burst="100"
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551857 2570 flags.go:64] FLAG: --event-qps="50"
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551886 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551891 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551896 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551903 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551907 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551912 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551917 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551922 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551927 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551931 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551936 2570 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551941 2570 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551946 2570 flags.go:64] FLAG: --fail-swap-on="true"
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551951 2570 flags.go:64] FLAG: --feature-gates=""
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551958 2570 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551963 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551968 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551973 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 21 07:51:40.557014
ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551978 2570 flags.go:64] FLAG: --healthz-port="10248" Apr 21 07:51:40.557014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551983 2570 flags.go:64] FLAG: --help="false" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551987 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-137-194.ec2.internal" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551992 2570 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.551997 2570 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552002 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552008 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552013 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552018 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552023 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552028 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552034 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552039 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552045 2570 
flags.go:64] FLAG: --kube-api-qps="50" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552050 2570 flags.go:64] FLAG: --kube-reserved="" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552054 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552059 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552064 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552069 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552073 2570 flags.go:64] FLAG: --lock-file="" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552078 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552083 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552087 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552096 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 07:51:40.557674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552101 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552106 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552111 2570 flags.go:64] FLAG: --logging-format="text" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552116 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 07:51:40.558279 ip-10-0-137-194 
kubenswrapper[2570]: I0421 07:51:40.552122 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552127 2570 flags.go:64] FLAG: --manifest-url="" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552132 2570 flags.go:64] FLAG: --manifest-url-header="" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552138 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552143 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552150 2570 flags.go:64] FLAG: --max-pods="110" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552155 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552160 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552165 2570 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552170 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552175 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552180 2570 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552184 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552201 2570 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552205 2570 
flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552210 2570 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552216 2570 flags.go:64] FLAG: --pod-cidr="" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552221 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552229 2570 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552234 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 07:51:40.558279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552239 2570 flags.go:64] FLAG: --pods-per-core="0" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552244 2570 flags.go:64] FLAG: --port="10250" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552249 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552253 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0048d407b77b3f746" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552258 2570 flags.go:64] FLAG: --qos-reserved="" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552263 2570 flags.go:64] FLAG: --read-only-port="10255" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552268 2570 flags.go:64] FLAG: --register-node="true" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552273 2570 flags.go:64] FLAG: --register-schedulable="true" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552279 2570 flags.go:64] FLAG: 
--register-with-taints="" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552285 2570 flags.go:64] FLAG: --registry-burst="10" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552290 2570 flags.go:64] FLAG: --registry-qps="5" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552295 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552300 2570 flags.go:64] FLAG: --reserved-memory="" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552306 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552311 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552315 2570 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552320 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552325 2570 flags.go:64] FLAG: --runonce="false" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552330 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552335 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552340 2570 flags.go:64] FLAG: --seccomp-default="false" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552344 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552349 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552354 
2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552360 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552365 2570 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 07:51:40.558929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552369 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552374 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552379 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552385 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552390 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552394 2570 flags.go:64] FLAG: --system-cgroups="" Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552399 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552408 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552412 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552417 2570 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552424 2570 flags.go:64] FLAG: --tls-min-version="" Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552429 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 07:51:40.559591 
ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552434 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552438 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552444 2570 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552449 2570 flags.go:64] FLAG: --v="2" Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552455 2570 flags.go:64] FLAG: --version="false" Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552471 2570 flags.go:64] FLAG: --vmodule="" Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552478 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.552484 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552643 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552649 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552654 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552659 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 07:51:40.559591 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552664 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552668 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 21 
07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552672 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 21 07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552677 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 21 07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552681 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552686 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552690 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552697 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552701 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552705 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 21 07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552709 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 21 07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552714 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552719 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 21 07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552724 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 
07:51:40.552728 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552733 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552737 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552741 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552745 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552749 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 07:51:40.560181 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552753 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 07:51:40.560682 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552758 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 07:51:40.560682 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552762 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 07:51:40.560682 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552766 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 07:51:40.560682 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552770 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 07:51:40.560682 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552778 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 07:51:40.560682 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552783 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 21 07:51:40.560682 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552787 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 07:51:40.560682 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552791 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 07:51:40.560682 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552796 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 21 07:51:40.560682 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552800 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 07:51:40.560682 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552804 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 07:51:40.560682 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552809 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 07:51:40.560682 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552815 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 07:51:40.560682 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552821 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 07:51:40.560682 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552826 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 21 07:51:40.560682 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552831 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 21 07:51:40.560682 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552835 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 07:51:40.560682 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552839 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 07:51:40.560682 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552846 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552851 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552855 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 21 07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552875 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 21 07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552880 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552884 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552898 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552903 2570 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552907 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552911 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552915 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552920 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552924 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552928 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 21 07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552933 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552937 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552941 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552945 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552951 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 21 07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552955 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 
07:51:40.561163 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552959 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552963 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552967 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552971 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552975 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552980 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552984 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552988 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552992 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.552996 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.553001 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.553005 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.553011 2570 
feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.553015 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.553019 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.553024 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.553029 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.553034 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.553038 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.553043 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 07:51:40.561659 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.553048 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 21 07:51:40.562160 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.553052 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 07:51:40.562160 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.553056 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 07:51:40.562160 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.553603 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true 
RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 21 07:51:40.562160 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.560064 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 21 07:51:40.562160 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.560172 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 21 07:51:40.562160 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560220 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 07:51:40.562160 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560226 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 21 07:51:40.562160 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560230 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 07:51:40.562160 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560235 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 07:51:40.562160 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560240 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:51:40.562160 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560244 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:51:40.562160 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560247 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:51:40.562160 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560250 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:51:40.562160 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560253 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:51:40.562160 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560257 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560260 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560263 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560265 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560268 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560270 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560273 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560276 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560279 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560281 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560284 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560287 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560290 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560293 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560296 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560298 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560301 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560303 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560306 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560308 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:51:40.562524 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560311 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560314 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560319 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560322 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560325 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560327 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560330 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560333 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560336 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560338 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560341 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560343 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560346 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560349 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560351 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560354 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560357 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560359 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560362 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560365 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:51:40.563060 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560368 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:51:40.563552 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560370 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:51:40.563552 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560373 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:51:40.563552 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560375 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:51:40.563552 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560378 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:51:40.563552 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560380 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:51:40.563552 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560382 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:51:40.563552 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560385 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:51:40.563552 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560388 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:51:40.563552 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560390 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:51:40.563552 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560393 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:51:40.563552 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560396 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:51:40.563552 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560399 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:51:40.563552 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560401 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:51:40.563552 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560404 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:51:40.563552 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560408 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:51:40.563552 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560410 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:51:40.563552 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560413 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:51:40.563552 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560415 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:51:40.563552 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560419 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:51:40.564027 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560423 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:51:40.564027 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560426 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:51:40.564027 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560429 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:51:40.564027 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560431 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:51:40.564027 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560434 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:51:40.564027 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560436 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:51:40.564027 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560439 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:51:40.564027 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560441 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:51:40.564027 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560444 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:51:40.564027 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560447 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:51:40.564027 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560449 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:51:40.564027 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560451 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:51:40.564027 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560454 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:51:40.564027 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560456 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:51:40.564027 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560459 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:51:40.564027 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560461 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:51:40.564027 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560464 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:51:40.564430 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.560469 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 07:51:40.564430 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560574 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 07:51:40.564430 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560580 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 07:51:40.564430 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560583 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 07:51:40.564430 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560586 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 07:51:40.564430 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560589 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 07:51:40.564430 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560591 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 07:51:40.564430 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560594 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 07:51:40.564430 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560596 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 07:51:40.564430 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560599 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 07:51:40.564430 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560602 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 07:51:40.564430 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560605 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 07:51:40.564430 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560608 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 07:51:40.564430 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560610 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 07:51:40.564430 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560613 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 07:51:40.564430 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560615 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560618 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560621 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560623 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560626 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560628 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560631 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560635 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560639 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560642 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560644 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560647 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560650 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560653 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560655 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560657 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560660 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560663 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560665 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560668 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 07:51:40.564829 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560671 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 07:51:40.565323 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560673 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 07:51:40.565323 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560676 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 07:51:40.565323 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560678 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 07:51:40.565323 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560681 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 07:51:40.565323 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560684 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 07:51:40.565323 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560686 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 07:51:40.565323 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560689 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 07:51:40.565323 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560692 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 07:51:40.565323 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560695 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 07:51:40.565323 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560698 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 07:51:40.565323 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560700 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 07:51:40.565323 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560703 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 07:51:40.565323 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560705 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 07:51:40.565323 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560708 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 07:51:40.565323 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560710 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 07:51:40.565323 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560713 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 07:51:40.565323 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560715 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 07:51:40.565323 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560718 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 07:51:40.565323 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560720 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560723 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560725 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560727 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560730 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560733 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560735 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560738 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560740 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560743 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560746 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560748 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560751 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560753 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560756 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560758 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560761 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560763 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560766 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560769 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 07:51:40.565786 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560771 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 07:51:40.566321 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560774 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 07:51:40.566321 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560776 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 07:51:40.566321 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560779 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 07:51:40.566321 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560782 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 07:51:40.566321 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560784 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 07:51:40.566321 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560787 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 07:51:40.566321 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560789 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 07:51:40.566321 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560792 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 07:51:40.566321 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560794 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 07:51:40.566321 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560796 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 07:51:40.566321 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560799 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 07:51:40.566321 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:40.560802 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 07:51:40.566321 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.560807 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 07:51:40.566321 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.561505 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 07:51:40.566321 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.563522 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 07:51:40.566699 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.564720 2570 server.go:1019] "Starting client certificate rotation"
Apr 21 07:51:40.566699 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.564816 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 07:51:40.566699 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.564852 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 07:51:40.593980 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.593964 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 07:51:40.596507 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.596490 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 07:51:40.615500 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.615478 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 21 07:51:40.621418 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.621401 2570 log.go:25] "Validated CRI v1 image API"
Apr 21 07:51:40.622732 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.622716 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 07:51:40.625053 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.625037 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 07:51:40.627744 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.627726 2570 fs.go:135] Filesystem UUIDs: map[78e22e1f-4d39-46f9-93bc-99881f3d5d60:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 9c6f46dd-5afb-4d74-97e7-c1993c4e9df9:/dev/nvme0n1p4]
Apr 21 07:51:40.627797 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.627745 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 07:51:40.633744 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.633632 2570 manager.go:217] Machine: {Timestamp:2026-04-21 07:51:40.631469553 +0000 UTC m=+0.469856893 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099324 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec26dba99646ce47e00e1583d740c837 SystemUUID:ec26dba9-9646-ce47-e00e-1583d740c837 BootID:db9ac269-378b-4d6a-aac8-bd0f3c854c92 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7e:42:3a:93:17 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7e:42:3a:93:17 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b6:be:ef:70:53:93 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 07:51:40.633744 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.633740 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 07:51:40.633898 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.633885 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 07:51:40.635012 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.634983 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 07:51:40.635155 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.635015 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-194.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 07:51:40.635200 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.635165 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 07:51:40.635200 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.635176 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 07:51:40.635200 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.635188
2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 07:51:40.636052 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.636042 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 21 07:51:40.637516 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.637506 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 21 07:51:40.637641 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.637631 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 21 07:51:40.640360 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.640350 2570 kubelet.go:491] "Attempting to sync node with API server" Apr 21 07:51:40.640399 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.640365 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 21 07:51:40.640399 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.640377 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 21 07:51:40.640399 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.640386 2570 kubelet.go:397] "Adding apiserver pod source" Apr 21 07:51:40.640399 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.640398 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 21 07:51:40.641607 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.641583 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 07:51:40.641607 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.641611 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 21 07:51:40.643346 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.643326 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8d7gl" Apr 21 07:51:40.644956 ip-10-0-137-194 
kubenswrapper[2570]: I0421 07:51:40.644940 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 21 07:51:40.646338 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.646324 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 21 07:51:40.648573 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.648553 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 21 07:51:40.648660 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.648583 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 21 07:51:40.648660 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.648596 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 21 07:51:40.648660 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.648608 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 21 07:51:40.648660 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.648620 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 21 07:51:40.648660 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.648631 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 21 07:51:40.648660 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.648644 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 21 07:51:40.648660 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.648658 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 21 07:51:40.648847 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.648671 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 21 07:51:40.648847 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.648682 2570 plugins.go:616] 
"Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 21 07:51:40.648847 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.648698 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 21 07:51:40.648847 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.648715 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 21 07:51:40.649604 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.649592 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 21 07:51:40.649643 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.649606 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 21 07:51:40.650387 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.650361 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8d7gl" Apr 21 07:51:40.652906 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.652877 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-194.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 21 07:51:40.652992 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:40.652938 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 21 07:51:40.652992 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:40.652956 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-194.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 21 07:51:40.653421 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.653410 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 21 07:51:40.653464 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.653447 2570 server.go:1295] "Started kubelet" Apr 21 07:51:40.654362 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.654210 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 21 07:51:40.654586 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.654528 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 21 07:51:40.654666 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.654620 2570 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 21 07:51:40.655181 ip-10-0-137-194 systemd[1]: Started Kubernetes Kubelet. Apr 21 07:51:40.656935 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.656917 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 21 07:51:40.659149 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.659131 2570 server.go:317] "Adding debug handlers to kubelet server" Apr 21 07:51:40.665272 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.665252 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 21 07:51:40.665356 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.665260 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 21 07:51:40.665894 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.665877 2570 factory.go:55] Registering systemd factory Apr 21 07:51:40.665976 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.665898 2570 factory.go:223] Registration of the systemd container factory successfully Apr 21 07:51:40.666065 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.666050 2570 
desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 21 07:51:40.666147 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:40.666052 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-194.ec2.internal\" not found" Apr 21 07:51:40.666147 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.666057 2570 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 21 07:51:40.666277 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.666154 2570 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 21 07:51:40.666277 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.666257 2570 reconstruct.go:97] "Volume reconstruction finished" Apr 21 07:51:40.666277 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.666268 2570 reconciler.go:26] "Reconciler: start to sync state" Apr 21 07:51:40.666804 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.666789 2570 factory.go:153] Registering CRI-O factory Apr 21 07:51:40.666936 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.666919 2570 factory.go:223] Registration of the crio container factory successfully Apr 21 07:51:40.667026 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.666980 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 21 07:51:40.667026 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.667007 2570 factory.go:103] Registering Raw factory Apr 21 07:51:40.667026 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.667024 2570 manager.go:1196] Started watching for new ooms in manager Apr 21 07:51:40.667170 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:40.667047 2570 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 21 07:51:40.667494 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.667477 2570 manager.go:319] Starting recovery of all containers Apr 21 07:51:40.667566 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.667504 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:51:40.670407 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:40.670383 2570 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-194.ec2.internal\" not found" node="ip-10-0-137-194.ec2.internal" Apr 21 07:51:40.677840 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.677825 2570 manager.go:324] Recovery completed Apr 21 07:51:40.682297 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.682284 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:51:40.685596 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.685575 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-194.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:51:40.685657 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.685609 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-194.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:51:40.685657 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.685620 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-194.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:51:40.686120 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.686105 2570 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 21 07:51:40.686120 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.686119 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 21 07:51:40.686191 ip-10-0-137-194 
kubenswrapper[2570]: I0421 07:51:40.686136 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 21 07:51:40.689229 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.689217 2570 policy_none.go:49] "None policy: Start" Apr 21 07:51:40.689290 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.689239 2570 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 21 07:51:40.689290 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.689248 2570 state_mem.go:35] "Initializing new in-memory state store" Apr 21 07:51:40.726464 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.726448 2570 manager.go:341] "Starting Device Plugin manager" Apr 21 07:51:40.754069 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:40.726511 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 21 07:51:40.754069 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.726524 2570 server.go:85] "Starting device plugin registration server" Apr 21 07:51:40.754069 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.726737 2570 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 21 07:51:40.754069 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.726747 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 21 07:51:40.754069 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.726857 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 21 07:51:40.754069 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.726956 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 21 07:51:40.754069 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.726973 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 21 07:51:40.754069 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:40.727679 2570 eviction_manager.go:267] "eviction manager: failed to 
check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 21 07:51:40.754069 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:40.727718 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-194.ec2.internal\" not found" Apr 21 07:51:40.767759 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.767720 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 21 07:51:40.768939 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.768918 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 21 07:51:40.768939 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.768941 2570 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 21 07:51:40.769065 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.768959 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 21 07:51:40.769065 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.768966 2570 kubelet.go:2451] "Starting kubelet main sync loop" Apr 21 07:51:40.769065 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:40.769000 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 21 07:51:40.771749 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.771733 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 07:51:40.827628 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.827551 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:51:40.829200 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.829184 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-194.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:51:40.829272 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.829216 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-194.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:51:40.829272 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.829229 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-194.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:51:40.829272 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.829253 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-194.ec2.internal" Apr 21 07:51:40.837929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.837911 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-194.ec2.internal" Apr 21 07:51:40.837980 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:40.837935 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-194.ec2.internal\": node \"ip-10-0-137-194.ec2.internal\" not found" Apr 21 
07:51:40.855375 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:40.855345 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-194.ec2.internal\" not found" Apr 21 07:51:40.869720 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.869691 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-194.ec2.internal"] Apr 21 07:51:40.869809 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.869764 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:51:40.874345 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.874327 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-194.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:51:40.874423 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.874357 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-194.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:51:40.874423 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.874367 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-194.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:51:40.876506 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.876491 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:51:40.877138 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.877118 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal" Apr 21 07:51:40.877238 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.877157 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:51:40.877319 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.877304 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-194.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:51:40.877366 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.877331 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-194.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:51:40.877366 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.877343 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-194.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:51:40.877809 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.877796 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-194.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:51:40.877881 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.877818 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-194.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:51:40.877881 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.877827 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-194.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:51:40.879420 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.879408 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-194.ec2.internal" Apr 21 07:51:40.879473 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.879432 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 21 07:51:40.880115 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.880101 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-194.ec2.internal" event="NodeHasSufficientMemory" Apr 21 07:51:40.880191 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.880125 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-194.ec2.internal" event="NodeHasNoDiskPressure" Apr 21 07:51:40.880191 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:40.880138 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-194.ec2.internal" event="NodeHasSufficientPID" Apr 21 07:51:40.906633 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:40.906603 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-194.ec2.internal\" not found" node="ip-10-0-137-194.ec2.internal" Apr 21 07:51:40.910954 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:40.910937 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-194.ec2.internal\" not found" node="ip-10-0-137-194.ec2.internal" Apr 21 07:51:40.955930 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:40.955901 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-194.ec2.internal\" not found" Apr 21 07:51:41.056574 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:41.056527 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-194.ec2.internal\" not found" Apr 21 07:51:41.067912 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.067885 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2e9938a109a193e2a24eba106814c8e0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal\" (UID: \"2e9938a109a193e2a24eba106814c8e0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal" Apr 21 07:51:41.067989 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.067918 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e9938a109a193e2a24eba106814c8e0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal\" (UID: \"2e9938a109a193e2a24eba106814c8e0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal" Apr 21 07:51:41.067989 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.067938 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dc75abb31154ae5273a07d0d5f2959e8-config\") pod \"kube-apiserver-proxy-ip-10-0-137-194.ec2.internal\" (UID: \"dc75abb31154ae5273a07d0d5f2959e8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-194.ec2.internal" Apr 21 07:51:41.157326 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:41.157287 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-194.ec2.internal\" not found" Apr 21 07:51:41.168635 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.168609 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2e9938a109a193e2a24eba106814c8e0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal\" (UID: \"2e9938a109a193e2a24eba106814c8e0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal" Apr 21 
07:51:41.168697 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.168642 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e9938a109a193e2a24eba106814c8e0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal\" (UID: \"2e9938a109a193e2a24eba106814c8e0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal" Apr 21 07:51:41.168697 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.168661 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dc75abb31154ae5273a07d0d5f2959e8-config\") pod \"kube-apiserver-proxy-ip-10-0-137-194.ec2.internal\" (UID: \"dc75abb31154ae5273a07d0d5f2959e8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-194.ec2.internal" Apr 21 07:51:41.168766 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.168701 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/dc75abb31154ae5273a07d0d5f2959e8-config\") pod \"kube-apiserver-proxy-ip-10-0-137-194.ec2.internal\" (UID: \"dc75abb31154ae5273a07d0d5f2959e8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-194.ec2.internal" Apr 21 07:51:41.168766 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.168705 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2e9938a109a193e2a24eba106814c8e0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal\" (UID: \"2e9938a109a193e2a24eba106814c8e0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal" Apr 21 07:51:41.168766 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.168711 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/2e9938a109a193e2a24eba106814c8e0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal\" (UID: \"2e9938a109a193e2a24eba106814c8e0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal"
Apr 21 07:51:41.208791 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.208756 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal"
Apr 21 07:51:41.213342 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.213325 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-194.ec2.internal"
Apr 21 07:51:41.258116 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:41.258073 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-194.ec2.internal\" not found"
Apr 21 07:51:41.358577 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:41.358542 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-194.ec2.internal\" not found"
Apr 21 07:51:41.459150 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:41.459080 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-194.ec2.internal\" not found"
Apr 21 07:51:41.559613 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:41.559563 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-194.ec2.internal\" not found"
Apr 21 07:51:41.564744 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.564724 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 07:51:41.564938 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.564919 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 07:51:41.565006 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.564974 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 21 07:51:41.652707 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.652651 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 07:46:40 +0000 UTC" deadline="2028-01-27 21:25:07.569435318 +0000 UTC"
Apr 21 07:51:41.652707 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.652694 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15517h33m25.916743958s"
Apr 21 07:51:41.659693 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:41.659671 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-194.ec2.internal\" not found"
Apr 21 07:51:41.666235 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.666215 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 21 07:51:41.676945 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.676915 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 07:51:41.693619 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.693595 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-z796w"
Apr 21 07:51:41.700567 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.700551 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-z796w"
Apr 21 07:51:41.759917 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:41.759892 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-194.ec2.internal\" not found"
Apr 21 07:51:41.860388 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:41.860357 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-194.ec2.internal\" not found"
Apr 21 07:51:41.879830 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:41.879783 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc75abb31154ae5273a07d0d5f2959e8.slice/crio-2315c748baae5aa3fc4a35c462104f0ccab8b1ff0a052bf2de85ae0735b28ef4 WatchSource:0}: Error finding container 2315c748baae5aa3fc4a35c462104f0ccab8b1ff0a052bf2de85ae0735b28ef4: Status 404 returned error can't find the container with id 2315c748baae5aa3fc4a35c462104f0ccab8b1ff0a052bf2de85ae0735b28ef4
Apr 21 07:51:41.880156 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:41.880131 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e9938a109a193e2a24eba106814c8e0.slice/crio-a7f5e0c04fb97beeec1c12e6700f2ca3ccf4447541f554e921daca202cd74a0d WatchSource:0}: Error finding container a7f5e0c04fb97beeec1c12e6700f2ca3ccf4447541f554e921daca202cd74a0d: Status 404 returned error can't find the container with id a7f5e0c04fb97beeec1c12e6700f2ca3ccf4447541f554e921daca202cd74a0d
Apr 21 07:51:41.885056 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.885039 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 07:51:41.960496 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:41.960454 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-194.ec2.internal\" not found"
Apr 21 07:51:41.975227 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.975208 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 07:51:41.997901 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:41.997879 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 07:51:42.065593 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.065512 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal"
Apr 21 07:51:42.076092 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.076071 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 07:51:42.077019 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.077008 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-194.ec2.internal"
Apr 21 07:51:42.084205 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.084192 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 21 07:51:42.448039 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.447991 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 07:51:42.641776 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.641745 2570 apiserver.go:52] "Watching apiserver"
Apr 21 07:51:42.650991 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.650432 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 21 07:51:42.650991 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.650950 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd","openshift-dns/node-resolver-hc6tl","openshift-multus/multus-additional-cni-plugins-6p77p","openshift-multus/network-metrics-daemon-bbxng","kube-system/konnectivity-agent-gc4bz","openshift-cluster-node-tuning-operator/tuned-ghbj6","openshift-image-registry/node-ca-v596t","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal","openshift-multus/multus-p59jc","openshift-network-diagnostics/network-check-target-htrsr","openshift-network-operator/iptables-alerter-x9g68","openshift-ovn-kubernetes/ovnkube-node-kshzg","kube-system/kube-apiserver-proxy-ip-10-0-137-194.ec2.internal"]
Apr 21 07:51:42.655766 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.655641 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-v596t"
Apr 21 07:51:42.657714 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.657687 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hc6tl"
Apr 21 07:51:42.658482 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.658367 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-jxjsm\""
Apr 21 07:51:42.658482 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.658442 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 21 07:51:42.658707 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.658538 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 21 07:51:42.658707 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.658646 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 21 07:51:42.659594 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.659456 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 21 07:51:42.659852 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.659831 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8q9sp\""
Apr 21 07:51:42.660030 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.660009 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:51:42.660116 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.660080 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 21 07:51:42.660116 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:42.660096 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbxng" podUID="62e778a8-8270-4560-9d0d-41a95a3c9c5f"
Apr 21 07:51:42.662319 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.662298 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6p77p"
Apr 21 07:51:42.664408 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.664388 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-gc4bz"
Apr 21 07:51:42.665060 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.664533 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.665060 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.664728 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-hv82z\""
Apr 21 07:51:42.665060 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.664891 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 21 07:51:42.665060 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.665034 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 21 07:51:42.665348 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.665162 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 21 07:51:42.665348 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.665229 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 21 07:51:42.665348 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.665338 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 21 07:51:42.666847 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.666473 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 21 07:51:42.666847 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.666571 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qmsx6\""
Apr 21 07:51:42.666847 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.666718 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 21 07:51:42.666847 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.666817 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 21 07:51:42.667096 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.667034 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-lpxf4\""
Apr 21 07:51:42.667096 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.667091 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 21 07:51:42.667692 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.667486 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd"
Apr 21 07:51:42.669564 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.669545 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-w28fq\""
Apr 21 07:51:42.669766 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.669749 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 21 07:51:42.669841 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.669810 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.669841 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.669830 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 21 07:51:42.670220 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.670202 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 21 07:51:42.672110 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.672085 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-htrsr"
Apr 21 07:51:42.672199 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.672126 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 21 07:51:42.672199 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:42.672147 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-htrsr" podUID="9cfb4a7f-bf8c-41ae-9276-00c43802c31a"
Apr 21 07:51:42.672199 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.672181 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qfw2d\""
Apr 21 07:51:42.674632 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.674346 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-x9g68"
Apr 21 07:51:42.676730 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.676557 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-run\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.676730 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.676590 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-var-lib-kubelet\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.676730 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.676617 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/eb9ab596-4aad-415a-8f6b-3a6340b43812-etc-selinux\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd"
Apr 21 07:51:42.676730 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.676639 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.677880 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.676641 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g9rm\" (UniqueName: \"kubernetes.io/projected/eb9ab596-4aad-415a-8f6b-3a6340b43812-kube-api-access-9g9rm\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd"
Apr 21 07:51:42.677880 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.677374 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 21 07:51:42.677880 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.677517 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7dd7ba4-8933-440b-9c8a-6710f4843c4d-host\") pod \"node-ca-v596t\" (UID: \"a7dd7ba4-8933-440b-9c8a-6710f4843c4d\") " pod="openshift-image-registry/node-ca-v596t"
Apr 21 07:51:42.677880 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.677547 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/28381821-cb18-40b6-a25f-b3a80e24f27a-tmp-dir\") pod \"node-resolver-hc6tl\" (UID: \"28381821-cb18-40b6-a25f-b3a80e24f27a\") " pod="openshift-dns/node-resolver-hc6tl"
Apr 21 07:51:42.677880 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.677597 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p"
Apr 21 07:51:42.677880 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.677629 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dd018993-e8b6-4e3f-b574-2a1a71e75ce7-agent-certs\") pod \"konnectivity-agent-gc4bz\" (UID: \"dd018993-e8b6-4e3f-b574-2a1a71e75ce7\") " pod="kube-system/konnectivity-agent-gc4bz"
Apr 21 07:51:42.677880 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.677701 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-systemd\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.677880 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.677731 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-tuned\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.677880 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.677757 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eb9ab596-4aad-415a-8f6b-3a6340b43812-registration-dir\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd"
Apr 21 07:51:42.677880 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.677793 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvzbg\" (UniqueName: \"kubernetes.io/projected/62e778a8-8270-4560-9d0d-41a95a3c9c5f-kube-api-access-gvzbg\") pod \"network-metrics-daemon-bbxng\" (UID: \"62e778a8-8270-4560-9d0d-41a95a3c9c5f\") " pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:51:42.678409 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.677957 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-kubernetes\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.678409 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.677989 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-sysctl-conf\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.678409 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678018 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb9ab596-4aad-415a-8f6b-3a6340b43812-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd"
Apr 21 07:51:42.678409 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678047 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/eb9ab596-4aad-415a-8f6b-3a6340b43812-device-dir\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd"
Apr 21 07:51:42.678409 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678073 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/eb9ab596-4aad-415a-8f6b-3a6340b43812-sys-fs\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd"
Apr 21 07:51:42.678409 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678218 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-system-cni-dir\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p"
Apr 21 07:51:42.678409 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678248 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 21 07:51:42.678409 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678265 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-os-release\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p"
Apr 21 07:51:42.678409 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678298 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-modprobe-d\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.678409 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678328 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-cni-binary-copy\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p"
Apr 21 07:51:42.678409 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678360 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbljk\" (UniqueName: \"kubernetes.io/projected/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-kube-api-access-bbljk\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p"
Apr 21 07:51:42.678409 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678389 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-host\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.679003 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678418 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a7dd7ba4-8933-440b-9c8a-6710f4843c4d-serviceca\") pod \"node-ca-v596t\" (UID: \"a7dd7ba4-8933-440b-9c8a-6710f4843c4d\") " pod="openshift-image-registry/node-ca-v596t"
Apr 21 07:51:42.679003 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678464 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv7vk\" (UniqueName: \"kubernetes.io/projected/28381821-cb18-40b6-a25f-b3a80e24f27a-kube-api-access-sv7vk\") pod \"node-resolver-hc6tl\" (UID: \"28381821-cb18-40b6-a25f-b3a80e24f27a\") " pod="openshift-dns/node-resolver-hc6tl"
Apr 21 07:51:42.679003 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678577 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 21 07:51:42.679003 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678581 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p"
Apr 21 07:51:42.679003 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678613 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-lib-modules\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.679003 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678641 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-tmp\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.679003 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678668 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv5fx\" (UniqueName: \"kubernetes.io/projected/a7dd7ba4-8933-440b-9c8a-6710f4843c4d-kube-api-access-pv5fx\") pod \"node-ca-v596t\" (UID: \"a7dd7ba4-8933-440b-9c8a-6710f4843c4d\") " pod="openshift-image-registry/node-ca-v596t"
Apr 21 07:51:42.679003 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678748 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-vdrb4\""
Apr 21 07:51:42.679003 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678789 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/28381821-cb18-40b6-a25f-b3a80e24f27a-hosts-file\") pod \"node-resolver-hc6tl\" (UID: \"28381821-cb18-40b6-a25f-b3a80e24f27a\") " pod="openshift-dns/node-resolver-hc6tl"
Apr 21 07:51:42.679003 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678821 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs\") pod \"network-metrics-daemon-bbxng\" (UID: \"62e778a8-8270-4560-9d0d-41a95a3c9c5f\") " pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:51:42.679003 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678849 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-sysconfig\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.679003 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678896 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbsgj\" (UniqueName: \"kubernetes.io/projected/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-kube-api-access-qbsgj\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.679003 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678920 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-cnibin\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p"
Apr 21 07:51:42.679003 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678965 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dd018993-e8b6-4e3f-b574-2a1a71e75ce7-konnectivity-ca\") pod \"konnectivity-agent-gc4bz\" (UID: \"dd018993-e8b6-4e3f-b574-2a1a71e75ce7\") " pod="kube-system/konnectivity-agent-gc4bz"
Apr 21 07:51:42.679003 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.678986 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-sysctl-d\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.679617 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.679022 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-sys\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.679617 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.679076 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eb9ab596-4aad-415a-8f6b-3a6340b43812-socket-dir\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd"
Apr 21 07:51:42.679617 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.679471 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 21 07:51:42.679617 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.679605 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p"
Apr 21 07:51:42.679800 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.679734 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 21 07:51:42.679909 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.679886 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-6vth2\""
Apr 21 07:51:42.680775 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.680514 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 21 07:51:42.680775 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.680539 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 21 07:51:42.681011 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.680991 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 21 07:51:42.681648 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.681626 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 21 07:51:42.702438 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.702407 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 07:46:41 +0000 UTC" deadline="2028-01-27 09:22:21.488928082 +0000 UTC"
Apr 21 07:51:42.702438 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.702439 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15505h30m38.786492575s"
Apr 21 07:51:42.724775 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.724749 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 07:51:42.767267 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.767232 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 21 07:51:42.773676 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.773625 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal" event={"ID":"2e9938a109a193e2a24eba106814c8e0","Type":"ContainerStarted","Data":"a7f5e0c04fb97beeec1c12e6700f2ca3ccf4447541f554e921daca202cd74a0d"}
Apr 21 07:51:42.775142 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.775114 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-194.ec2.internal" event={"ID":"dc75abb31154ae5273a07d0d5f2959e8","Type":"ContainerStarted","Data":"2315c748baae5aa3fc4a35c462104f0ccab8b1ff0a052bf2de85ae0735b28ef4"}
Apr 21 07:51:42.780379 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780354 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-systemd-units\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.780508 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780392 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cc312109-c0ed-49ee-b44b-aebf49f43c92-ovn-node-metrics-cert\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.780508 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780425 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-cni-binary-copy\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p"
Apr 21 07:51:42.780508 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780453 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-kubelet\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.780508 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780480 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-node-log\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.780508 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780504 2570
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55tl6\" (UniqueName: \"kubernetes.io/projected/cc312109-c0ed-49ee-b44b-aebf49f43c92-kube-api-access-55tl6\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.780752 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780586 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-system-cni-dir\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.780752 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780612 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-host-run-netns\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.780752 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780639 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b142b7ae-a1de-435f-a326-eb9c1b40ba2e-iptables-alerter-script\") pod \"iptables-alerter-x9g68\" (UID: \"b142b7ae-a1de-435f-a326-eb9c1b40ba2e\") " pod="openshift-network-operator/iptables-alerter-x9g68" Apr 21 07:51:42.780752 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780675 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-host\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6" Apr 21 
07:51:42.780752 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780718 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a7dd7ba4-8933-440b-9c8a-6710f4843c4d-serviceca\") pod \"node-ca-v596t\" (UID: \"a7dd7ba4-8933-440b-9c8a-6710f4843c4d\") " pod="openshift-image-registry/node-ca-v596t" Apr 21 07:51:42.780752 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780744 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p" Apr 21 07:51:42.781052 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780761 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pv5fx\" (UniqueName: \"kubernetes.io/projected/a7dd7ba4-8933-440b-9c8a-6710f4843c4d-kube-api-access-pv5fx\") pod \"node-ca-v596t\" (UID: \"a7dd7ba4-8933-440b-9c8a-6710f4843c4d\") " pod="openshift-image-registry/node-ca-v596t" Apr 21 07:51:42.781052 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780778 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-run-openvswitch\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.781052 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780791 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-host\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ghbj6" Apr 21 07:51:42.781052 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780795 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-multus-cni-dir\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.781052 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780877 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c7b371b9-c08d-40a3-b1c7-71f402fdf061-multus-daemon-config\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.781052 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780911 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbsgj\" (UniqueName: \"kubernetes.io/projected/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-kube-api-access-qbsgj\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6" Apr 21 07:51:42.781052 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780936 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-cnibin\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p" Apr 21 07:51:42.781052 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780962 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-run-netns\") 
pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.781052 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.780988 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-run-ovn-kubernetes\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.781052 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781014 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c7b371b9-c08d-40a3-b1c7-71f402fdf061-cni-binary-copy\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.781052 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781054 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dd018993-e8b6-4e3f-b574-2a1a71e75ce7-konnectivity-ca\") pod \"konnectivity-agent-gc4bz\" (UID: \"dd018993-e8b6-4e3f-b574-2a1a71e75ce7\") " pod="kube-system/konnectivity-agent-gc4bz" Apr 21 07:51:42.781555 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781077 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-sys\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6" Apr 21 07:51:42.781555 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781103 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/eb9ab596-4aad-415a-8f6b-3a6340b43812-socket-dir\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd" Apr 21 07:51:42.781555 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781134 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p" Apr 21 07:51:42.781555 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781160 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-log-socket\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.781555 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781200 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-cni-bin\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.781555 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781197 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-cni-binary-copy\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p" Apr 21 07:51:42.781555 ip-10-0-137-194 kubenswrapper[2570]: I0421 
07:51:42.781265 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a7dd7ba4-8933-440b-9c8a-6710f4843c4d-serviceca\") pod \"node-ca-v596t\" (UID: \"a7dd7ba4-8933-440b-9c8a-6710f4843c4d\") " pod="openshift-image-registry/node-ca-v596t" Apr 21 07:51:42.781555 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781405 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-cnibin\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p" Apr 21 07:51:42.781555 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781455 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cc312109-c0ed-49ee-b44b-aebf49f43c92-env-overrides\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.781555 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781486 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-host-run-multus-certs\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.781555 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781529 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/eb9ab596-4aad-415a-8f6b-3a6340b43812-etc-selinux\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd" Apr 21 07:51:42.782091 
ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781600 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9g9rm\" (UniqueName: \"kubernetes.io/projected/eb9ab596-4aad-415a-8f6b-3a6340b43812-kube-api-access-9g9rm\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd" Apr 21 07:51:42.782091 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781625 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7dd7ba4-8933-440b-9c8a-6710f4843c4d-host\") pod \"node-ca-v596t\" (UID: \"a7dd7ba4-8933-440b-9c8a-6710f4843c4d\") " pod="openshift-image-registry/node-ca-v596t" Apr 21 07:51:42.782091 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781663 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/28381821-cb18-40b6-a25f-b3a80e24f27a-tmp-dir\") pod \"node-resolver-hc6tl\" (UID: \"28381821-cb18-40b6-a25f-b3a80e24f27a\") " pod="openshift-dns/node-resolver-hc6tl" Apr 21 07:51:42.782091 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781691 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql2vv\" (UniqueName: \"kubernetes.io/projected/b142b7ae-a1de-435f-a326-eb9c1b40ba2e-kube-api-access-ql2vv\") pod \"iptables-alerter-x9g68\" (UID: \"b142b7ae-a1de-435f-a326-eb9c1b40ba2e\") " pod="openshift-network-operator/iptables-alerter-x9g68" Apr 21 07:51:42.782091 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781727 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-systemd\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ghbj6" Apr 21 07:51:42.782091 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781751 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-tuned\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6" Apr 21 07:51:42.782091 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781783 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-os-release\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.782091 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781806 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/dd018993-e8b6-4e3f-b574-2a1a71e75ce7-konnectivity-ca\") pod \"konnectivity-agent-gc4bz\" (UID: \"dd018993-e8b6-4e3f-b574-2a1a71e75ce7\") " pod="kube-system/konnectivity-agent-gc4bz" Apr 21 07:51:42.782091 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781903 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-sysctl-conf\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6" Apr 21 07:51:42.782091 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781934 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-var-lib-openvswitch\") pod \"ovnkube-node-kshzg\" (UID: 
\"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.782091 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781961 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-cni-netd\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.782091 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781981 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-multus-socket-dir-parent\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.782091 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.781979 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p" Apr 21 07:51:42.782091 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782023 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7dd7ba4-8933-440b-9c8a-6710f4843c4d-host\") pod \"node-ca-v596t\" (UID: \"a7dd7ba4-8933-440b-9c8a-6710f4843c4d\") " pod="openshift-image-registry/node-ca-v596t" Apr 21 07:51:42.782693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782100 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 07:51:42.782693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782145 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-sys\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6" Apr 21 07:51:42.782693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782201 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eb9ab596-4aad-415a-8f6b-3a6340b43812-socket-dir\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd" Apr 21 07:51:42.782693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782211 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/eb9ab596-4aad-415a-8f6b-3a6340b43812-etc-selinux\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd" Apr 21 07:51:42.782693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782242 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-host-var-lib-kubelet\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.782693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782302 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-modprobe-d\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6" Apr 21 07:51:42.782693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782308 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-sysctl-conf\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6" Apr 21 07:51:42.782693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782323 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/28381821-cb18-40b6-a25f-b3a80e24f27a-tmp-dir\") pod \"node-resolver-hc6tl\" (UID: \"28381821-cb18-40b6-a25f-b3a80e24f27a\") " pod="openshift-dns/node-resolver-hc6tl" Apr 21 07:51:42.782693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782336 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-systemd\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6" Apr 21 07:51:42.782693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782346 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbljk\" (UniqueName: \"kubernetes.io/projected/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-kube-api-access-bbljk\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p" Apr 21 07:51:42.782693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782389 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"hostroot\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-hostroot\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.782693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782411 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-modprobe-d\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6" Apr 21 07:51:42.782693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782431 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sv7vk\" (UniqueName: \"kubernetes.io/projected/28381821-cb18-40b6-a25f-b3a80e24f27a-kube-api-access-sv7vk\") pod \"node-resolver-hc6tl\" (UID: \"28381821-cb18-40b6-a25f-b3a80e24f27a\") " pod="openshift-dns/node-resolver-hc6tl" Apr 21 07:51:42.782693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782461 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cc312109-c0ed-49ee-b44b-aebf49f43c92-ovnkube-script-lib\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.782693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782503 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-lib-modules\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6" Apr 21 07:51:42.782693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782525 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-tmp\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6" Apr 21 07:51:42.782693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782551 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/28381821-cb18-40b6-a25f-b3a80e24f27a-hosts-file\") pod \"node-resolver-hc6tl\" (UID: \"28381821-cb18-40b6-a25f-b3a80e24f27a\") " pod="openshift-dns/node-resolver-hc6tl" Apr 21 07:51:42.782693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782584 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs\") pod \"network-metrics-daemon-bbxng\" (UID: \"62e778a8-8270-4560-9d0d-41a95a3c9c5f\") " pod="openshift-multus/network-metrics-daemon-bbxng" Apr 21 07:51:42.783494 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782611 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-sysconfig\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6" Apr 21 07:51:42.783494 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782629 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-slash\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.783494 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782679 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-run-systemd\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.783494 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782696 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-etc-openvswitch\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.783494 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782722 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cc312109-c0ed-49ee-b44b-aebf49f43c92-ovnkube-config\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.783494 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782723 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/28381821-cb18-40b6-a25f-b3a80e24f27a-hosts-file\") pod \"node-resolver-hc6tl\" (UID: \"28381821-cb18-40b6-a25f-b3a80e24f27a\") " pod="openshift-dns/node-resolver-hc6tl"
Apr 21 07:51:42.783494 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782754 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p"
Apr 21 07:51:42.783494 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782780 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-sysconfig\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.783494 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:42.782804 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:42.783494 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782835 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-cnibin\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.783494 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782858 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-lib-modules\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.783494 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:42.782891 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs podName:62e778a8-8270-4560-9d0d-41a95a3c9c5f nodeName:}" failed. No retries permitted until 2026-04-21 07:51:43.282847349 +0000 UTC m=+3.121234705 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs") pod "network-metrics-daemon-bbxng" (UID: "62e778a8-8270-4560-9d0d-41a95a3c9c5f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:42.783494 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782933 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-sysctl-d\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.783494 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782963 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-run-ovn\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.783494 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.782988 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.783494 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783014 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-multus-conf-dir\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.783494 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783040 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8f97p\" (UniqueName: \"kubernetes.io/projected/c7b371b9-c08d-40a3-b1c7-71f402fdf061-kube-api-access-8f97p\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.784181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783066 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prkcp\" (UniqueName: \"kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp\") pod \"network-check-target-htrsr\" (UID: \"9cfb4a7f-bf8c-41ae-9276-00c43802c31a\") " pod="openshift-network-diagnostics/network-check-target-htrsr"
Apr 21 07:51:42.784181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783103 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-run\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.784181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783132 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-var-lib-kubelet\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.784181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783145 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-sysctl-d\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.784181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783180 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p"
Apr 21 07:51:42.784181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783187 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-var-lib-kubelet\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.784181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783220 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-host-run-k8s-cni-cncf-io\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.784181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783223 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-run\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.784181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783254 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-host-var-lib-cni-multus\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.784181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783277 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-etc-kubernetes\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.784181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783321 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p"
Apr 21 07:51:42.784181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783322 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b142b7ae-a1de-435f-a326-eb9c1b40ba2e-host-slash\") pod \"iptables-alerter-x9g68\" (UID: \"b142b7ae-a1de-435f-a326-eb9c1b40ba2e\") " pod="openshift-network-operator/iptables-alerter-x9g68"
Apr 21 07:51:42.784181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783367 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dd018993-e8b6-4e3f-b574-2a1a71e75ce7-agent-certs\") pod \"konnectivity-agent-gc4bz\" (UID: \"dd018993-e8b6-4e3f-b574-2a1a71e75ce7\") " pod="kube-system/konnectivity-agent-gc4bz"
Apr 21 07:51:42.784181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783391 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eb9ab596-4aad-415a-8f6b-3a6340b43812-registration-dir\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd"
Apr 21 07:51:42.784181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783415 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gvzbg\" (UniqueName: \"kubernetes.io/projected/62e778a8-8270-4560-9d0d-41a95a3c9c5f-kube-api-access-gvzbg\") pod \"network-metrics-daemon-bbxng\" (UID: \"62e778a8-8270-4560-9d0d-41a95a3c9c5f\") " pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:51:42.784181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783450 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-host-var-lib-cni-bin\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.784181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783479 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eb9ab596-4aad-415a-8f6b-3a6340b43812-registration-dir\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd"
Apr 21 07:51:42.784840 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783544 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-kubernetes\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.784840 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783546 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-kubernetes\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.784840 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783581 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb9ab596-4aad-415a-8f6b-3a6340b43812-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd"
Apr 21 07:51:42.784840 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783617 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/eb9ab596-4aad-415a-8f6b-3a6340b43812-device-dir\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd"
Apr 21 07:51:42.784840 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783656 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/eb9ab596-4aad-415a-8f6b-3a6340b43812-sys-fs\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd"
Apr 21 07:51:42.784840 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783668 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/eb9ab596-4aad-415a-8f6b-3a6340b43812-device-dir\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd"
Apr 21 07:51:42.784840 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783629 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eb9ab596-4aad-415a-8f6b-3a6340b43812-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd"
Apr 21 07:51:42.784840 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783682 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-system-cni-dir\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p"
Apr 21 07:51:42.784840 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783715 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-system-cni-dir\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p"
Apr 21 07:51:42.784840 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783718 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-os-release\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p"
Apr 21 07:51:42.784840 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783737 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/eb9ab596-4aad-415a-8f6b-3a6340b43812-sys-fs\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd"
Apr 21 07:51:42.784840 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.783768 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-os-release\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p"
Apr 21 07:51:42.785464 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.785268 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-etc-tuned\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.786422 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.786400 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-tmp\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.786514 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.786453 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/dd018993-e8b6-4e3f-b574-2a1a71e75ce7-agent-certs\") pod \"konnectivity-agent-gc4bz\" (UID: \"dd018993-e8b6-4e3f-b574-2a1a71e75ce7\") " pod="kube-system/konnectivity-agent-gc4bz"
Apr 21 07:51:42.793100 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.792983 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv5fx\" (UniqueName: \"kubernetes.io/projected/a7dd7ba4-8933-440b-9c8a-6710f4843c4d-kube-api-access-pv5fx\") pod \"node-ca-v596t\" (UID: \"a7dd7ba4-8933-440b-9c8a-6710f4843c4d\") " pod="openshift-image-registry/node-ca-v596t"
Apr 21 07:51:42.795314 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.794646 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbljk\" (UniqueName: \"kubernetes.io/projected/f1b4cd3f-6d6c-4602-aff3-3e7007b411bc-kube-api-access-bbljk\") pod \"multus-additional-cni-plugins-6p77p\" (UID: \"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc\") " pod="openshift-multus/multus-additional-cni-plugins-6p77p"
Apr 21 07:51:42.796698 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.796613 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbsgj\" (UniqueName: \"kubernetes.io/projected/4ec4ea1e-b376-4d45-b221-3c152bf5f6ec-kube-api-access-qbsgj\") pod \"tuned-ghbj6\" (UID: \"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec\") " pod="openshift-cluster-node-tuning-operator/tuned-ghbj6"
Apr 21 07:51:42.797407 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.797381 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g9rm\" (UniqueName: \"kubernetes.io/projected/eb9ab596-4aad-415a-8f6b-3a6340b43812-kube-api-access-9g9rm\") pod \"aws-ebs-csi-driver-node-2j5gd\" (UID: \"eb9ab596-4aad-415a-8f6b-3a6340b43812\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd"
Apr 21 07:51:42.799321 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.799299 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv7vk\" (UniqueName: \"kubernetes.io/projected/28381821-cb18-40b6-a25f-b3a80e24f27a-kube-api-access-sv7vk\") pod \"node-resolver-hc6tl\" (UID: \"28381821-cb18-40b6-a25f-b3a80e24f27a\") " pod="openshift-dns/node-resolver-hc6tl"
Apr 21 07:51:42.799401 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.799306 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvzbg\" (UniqueName: \"kubernetes.io/projected/62e778a8-8270-4560-9d0d-41a95a3c9c5f-kube-api-access-gvzbg\") pod \"network-metrics-daemon-bbxng\" (UID: \"62e778a8-8270-4560-9d0d-41a95a3c9c5f\") " pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:51:42.884569 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884536 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-systemd-units\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.884731 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884577 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cc312109-c0ed-49ee-b44b-aebf49f43c92-ovn-node-metrics-cert\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.884731 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884602 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-kubelet\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.884731 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884625 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-node-log\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.884731 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884649 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55tl6\" (UniqueName: \"kubernetes.io/projected/cc312109-c0ed-49ee-b44b-aebf49f43c92-kube-api-access-55tl6\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.884731 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884670 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-system-cni-dir\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.884731 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884664 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-systemd-units\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.884731 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884691 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-host-run-netns\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.884731 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884715 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b142b7ae-a1de-435f-a326-eb9c1b40ba2e-iptables-alerter-script\") pod \"iptables-alerter-x9g68\" (UID: \"b142b7ae-a1de-435f-a326-eb9c1b40ba2e\") " pod="openshift-network-operator/iptables-alerter-x9g68"
Apr 21 07:51:42.885132 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884733 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-node-log\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885132 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884750 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-host-run-netns\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.885132 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884741 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-run-openvswitch\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885132 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884770 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-system-cni-dir\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.885132 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884782 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-run-openvswitch\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885132 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884799 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-multus-cni-dir\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.885132 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884841 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c7b371b9-c08d-40a3-b1c7-71f402fdf061-multus-daemon-config\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.885132 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884849 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-multus-cni-dir\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.885132 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884883 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-kubelet\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885132 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884891 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-run-netns\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885132 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884915 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-run-netns\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885132 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884918 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-run-ovn-kubernetes\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885132 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884947 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-run-ovn-kubernetes\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885132 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884952 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c7b371b9-c08d-40a3-b1c7-71f402fdf061-cni-binary-copy\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.885132 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.884981 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-log-socket\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885132 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885002 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-cni-bin\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885132 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885051 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-cni-bin\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885132 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885051 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-log-socket\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885972 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885088 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cc312109-c0ed-49ee-b44b-aebf49f43c92-env-overrides\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885972 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885119 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-host-run-multus-certs\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.885972 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885150 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ql2vv\" (UniqueName: \"kubernetes.io/projected/b142b7ae-a1de-435f-a326-eb9c1b40ba2e-kube-api-access-ql2vv\") pod \"iptables-alerter-x9g68\" (UID: \"b142b7ae-a1de-435f-a326-eb9c1b40ba2e\") " pod="openshift-network-operator/iptables-alerter-x9g68"
Apr 21 07:51:42.885972 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885176 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-os-release\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.885972 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885200 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-var-lib-openvswitch\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885972 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885223 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-cni-netd\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885972 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885244 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-multus-socket-dir-parent\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.885972 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885270 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-host-var-lib-kubelet\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.885972 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885298 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-hostroot\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.885972 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885325 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cc312109-c0ed-49ee-b44b-aebf49f43c92-ovnkube-script-lib\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885972 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885364 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-slash\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885972 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885366 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b142b7ae-a1de-435f-a326-eb9c1b40ba2e-iptables-alerter-script\") pod \"iptables-alerter-x9g68\" (UID: \"b142b7ae-a1de-435f-a326-eb9c1b40ba2e\") " pod="openshift-network-operator/iptables-alerter-x9g68"
Apr 21 07:51:42.885972 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885388 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-run-systemd\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885972 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885415 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-etc-openvswitch\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885972 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885440 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-cni-netd\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885972 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885459 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cc312109-c0ed-49ee-b44b-aebf49f43c92-ovnkube-config\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.885972 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885467 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c7b371b9-c08d-40a3-b1c7-71f402fdf061-multus-daemon-config\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.885972 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885489 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-cnibin\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.886581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885491 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c7b371b9-c08d-40a3-b1c7-71f402fdf061-cni-binary-copy\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.886581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885514 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-run-ovn\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.886581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885553 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-hostroot\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc"
Apr 21 07:51:42.886581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885565 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-run-ovn\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:51:42.886581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885566 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-slash\") pod
\"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.886581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885572 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-run-systemd\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.886581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885604 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.886581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885638 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-etc-openvswitch\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.886581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885657 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-cnibin\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.886581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885656 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-multus-conf-dir\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.886581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885690 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-multus-conf-dir\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.886581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885687 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-multus-socket-dir-parent\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.886581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885487 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-host-run-multus-certs\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.886581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885691 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8f97p\" (UniqueName: \"kubernetes.io/projected/c7b371b9-c08d-40a3-b1c7-71f402fdf061-kube-api-access-8f97p\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.886581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885611 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-host-var-lib-kubelet\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.886581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885736 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prkcp\" (UniqueName: \"kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp\") pod \"network-check-target-htrsr\" (UID: \"9cfb4a7f-bf8c-41ae-9276-00c43802c31a\") " pod="openshift-network-diagnostics/network-check-target-htrsr" Apr 21 07:51:42.886581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885754 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-var-lib-openvswitch\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.886581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885768 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-host-run-k8s-cni-cncf-io\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.887196 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885782 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-os-release\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.887196 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885792 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cc312109-c0ed-49ee-b44b-aebf49f43c92-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.887196 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885795 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-host-var-lib-cni-multus\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.887196 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885827 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-etc-kubernetes\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.887196 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885901 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b142b7ae-a1de-435f-a326-eb9c1b40ba2e-host-slash\") pod \"iptables-alerter-x9g68\" (UID: \"b142b7ae-a1de-435f-a326-eb9c1b40ba2e\") " pod="openshift-network-operator/iptables-alerter-x9g68" Apr 21 07:51:42.887196 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885934 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-host-var-lib-cni-bin\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.887196 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885999 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-host-var-lib-cni-bin\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.887196 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.886015 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-host-run-k8s-cni-cncf-io\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.887196 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.886026 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/cc312109-c0ed-49ee-b44b-aebf49f43c92-ovnkube-config\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.887196 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.886042 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-etc-kubernetes\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.887196 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.885827 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c7b371b9-c08d-40a3-b1c7-71f402fdf061-host-var-lib-cni-multus\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.887196 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.886074 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/b142b7ae-a1de-435f-a326-eb9c1b40ba2e-host-slash\") pod \"iptables-alerter-x9g68\" (UID: \"b142b7ae-a1de-435f-a326-eb9c1b40ba2e\") " pod="openshift-network-operator/iptables-alerter-x9g68" Apr 21 07:51:42.887196 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.886253 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/cc312109-c0ed-49ee-b44b-aebf49f43c92-ovnkube-script-lib\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.887196 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.886297 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/cc312109-c0ed-49ee-b44b-aebf49f43c92-env-overrides\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.887709 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.887432 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/cc312109-c0ed-49ee-b44b-aebf49f43c92-ovn-node-metrics-cert\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.893955 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:42.893653 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:51:42.893955 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:42.893675 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:51:42.893955 ip-10-0-137-194 
kubenswrapper[2570]: E0421 07:51:42.893688 2570 projected.go:194] Error preparing data for projected volume kube-api-access-prkcp for pod openshift-network-diagnostics/network-check-target-htrsr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:51:42.893955 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:42.893778 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp podName:9cfb4a7f-bf8c-41ae-9276-00c43802c31a nodeName:}" failed. No retries permitted until 2026-04-21 07:51:43.393762327 +0000 UTC m=+3.232149659 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-prkcp" (UniqueName: "kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp") pod "network-check-target-htrsr" (UID: "9cfb4a7f-bf8c-41ae-9276-00c43802c31a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:51:42.896281 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.896258 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55tl6\" (UniqueName: \"kubernetes.io/projected/cc312109-c0ed-49ee-b44b-aebf49f43c92-kube-api-access-55tl6\") pod \"ovnkube-node-kshzg\" (UID: \"cc312109-c0ed-49ee-b44b-aebf49f43c92\") " pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:42.896399 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.896302 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8f97p\" (UniqueName: \"kubernetes.io/projected/c7b371b9-c08d-40a3-b1c7-71f402fdf061-kube-api-access-8f97p\") pod \"multus-p59jc\" (UID: \"c7b371b9-c08d-40a3-b1c7-71f402fdf061\") " pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.896399 ip-10-0-137-194 
kubenswrapper[2570]: I0421 07:51:42.896317 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql2vv\" (UniqueName: \"kubernetes.io/projected/b142b7ae-a1de-435f-a326-eb9c1b40ba2e-kube-api-access-ql2vv\") pod \"iptables-alerter-x9g68\" (UID: \"b142b7ae-a1de-435f-a326-eb9c1b40ba2e\") " pod="openshift-network-operator/iptables-alerter-x9g68" Apr 21 07:51:42.969029 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.968938 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-v596t" Apr 21 07:51:42.976742 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.976721 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-hc6tl" Apr 21 07:51:42.990561 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.990535 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-p59jc" Apr 21 07:51:42.996214 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:42.996189 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6p77p" Apr 21 07:51:43.003837 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:43.003816 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-gc4bz" Apr 21 07:51:43.011496 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:43.011436 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ghbj6" Apr 21 07:51:43.019033 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:43.019011 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd" Apr 21 07:51:43.032602 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:43.032575 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-x9g68" Apr 21 07:51:43.037238 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:43.037221 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:51:43.288429 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:43.288360 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs\") pod \"network-metrics-daemon-bbxng\" (UID: \"62e778a8-8270-4560-9d0d-41a95a3c9c5f\") " pod="openshift-multus/network-metrics-daemon-bbxng" Apr 21 07:51:43.288565 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:43.288465 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:51:43.288565 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:43.288520 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs podName:62e778a8-8270-4560-9d0d-41a95a3c9c5f nodeName:}" failed. No retries permitted until 2026-04-21 07:51:44.288503767 +0000 UTC m=+4.126891111 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs") pod "network-metrics-daemon-bbxng" (UID: "62e778a8-8270-4560-9d0d-41a95a3c9c5f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 07:51:43.489748 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:43.489699 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prkcp\" (UniqueName: \"kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp\") pod \"network-check-target-htrsr\" (UID: \"9cfb4a7f-bf8c-41ae-9276-00c43802c31a\") " pod="openshift-network-diagnostics/network-check-target-htrsr" Apr 21 07:51:43.489920 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:43.489882 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 07:51:43.489920 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:43.489904 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 07:51:43.489920 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:43.489918 2570 projected.go:194] Error preparing data for projected volume kube-api-access-prkcp for pod openshift-network-diagnostics/network-check-target-htrsr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:51:43.490066 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:43.489980 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp podName:9cfb4a7f-bf8c-41ae-9276-00c43802c31a nodeName:}" failed. 
No retries permitted until 2026-04-21 07:51:44.489958997 +0000 UTC m=+4.328346324 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-prkcp" (UniqueName: "kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp") pod "network-check-target-htrsr" (UID: "9cfb4a7f-bf8c-41ae-9276-00c43802c31a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:51:43.674078 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:43.674053 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb9ab596_4aad_415a_8f6b_3a6340b43812.slice/crio-60972b2d8d2f97e0dd7d9de8691010affecb9146489c4966a79999b39829be18 WatchSource:0}: Error finding container 60972b2d8d2f97e0dd7d9de8691010affecb9146489c4966a79999b39829be18: Status 404 returned error can't find the container with id 60972b2d8d2f97e0dd7d9de8691010affecb9146489c4966a79999b39829be18 Apr 21 07:51:43.674894 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:43.674805 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ec4ea1e_b376_4d45_b221_3c152bf5f6ec.slice/crio-14f96530e05f1a91089a482ce1a94782a9b1b218369739a70262e540ca333f2f WatchSource:0}: Error finding container 14f96530e05f1a91089a482ce1a94782a9b1b218369739a70262e540ca333f2f: Status 404 returned error can't find the container with id 14f96530e05f1a91089a482ce1a94782a9b1b218369739a70262e540ca333f2f Apr 21 07:51:43.675850 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:43.675820 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd018993_e8b6_4e3f_b574_2a1a71e75ce7.slice/crio-5240178be33f93cdc08919638718fd5cf748ba6608ec72a81e8940eb0da20e6d WatchSource:0}: Error finding container 
5240178be33f93cdc08919638718fd5cf748ba6608ec72a81e8940eb0da20e6d: Status 404 returned error can't find the container with id 5240178be33f93cdc08919638718fd5cf748ba6608ec72a81e8940eb0da20e6d Apr 21 07:51:43.676749 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:43.676694 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1b4cd3f_6d6c_4602_aff3_3e7007b411bc.slice/crio-09e4db5d340ffaeb057041361a552b5e0eb8fff41ef4a58195d4bedb522a2271 WatchSource:0}: Error finding container 09e4db5d340ffaeb057041361a552b5e0eb8fff41ef4a58195d4bedb522a2271: Status 404 returned error can't find the container with id 09e4db5d340ffaeb057041361a552b5e0eb8fff41ef4a58195d4bedb522a2271 Apr 21 07:51:43.677782 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:43.677762 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb142b7ae_a1de_435f_a326_eb9c1b40ba2e.slice/crio-9f31d51deff631aa69cf36308d88f348b970dd12b3e3661334e713cc502a530f WatchSource:0}: Error finding container 9f31d51deff631aa69cf36308d88f348b970dd12b3e3661334e713cc502a530f: Status 404 returned error can't find the container with id 9f31d51deff631aa69cf36308d88f348b970dd12b3e3661334e713cc502a530f Apr 21 07:51:43.680078 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:43.680054 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28381821_cb18_40b6_a25f_b3a80e24f27a.slice/crio-9588f34b66d598b64dc856073824b3d1bbc056f9d3e79594cfbd59cf62d896e9 WatchSource:0}: Error finding container 9588f34b66d598b64dc856073824b3d1bbc056f9d3e79594cfbd59cf62d896e9: Status 404 returned error can't find the container with id 9588f34b66d598b64dc856073824b3d1bbc056f9d3e79594cfbd59cf62d896e9 Apr 21 07:51:43.681058 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:43.681032 2570 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc312109_c0ed_49ee_b44b_aebf49f43c92.slice/crio-1d69ddc3798951ba0671573396b1bb0d14e8ab7a12f429e755b634e4b1b21caa WatchSource:0}: Error finding container 1d69ddc3798951ba0671573396b1bb0d14e8ab7a12f429e755b634e4b1b21caa: Status 404 returned error can't find the container with id 1d69ddc3798951ba0671573396b1bb0d14e8ab7a12f429e755b634e4b1b21caa Apr 21 07:51:43.683983 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:43.683615 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7b371b9_c08d_40a3_b1c7_71f402fdf061.slice/crio-5ef97e9337caef114041b5841f5f8e9188587805f9ea5ce22348f3eef231934d WatchSource:0}: Error finding container 5ef97e9337caef114041b5841f5f8e9188587805f9ea5ce22348f3eef231934d: Status 404 returned error can't find the container with id 5ef97e9337caef114041b5841f5f8e9188587805f9ea5ce22348f3eef231934d Apr 21 07:51:43.684630 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:51:43.684505 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7dd7ba4_8933_440b_9c8a_6710f4843c4d.slice/crio-ec3c64255d9c91f1ded00f46d99df75cf60c86bfdf154201f652c4eff70dfef0 WatchSource:0}: Error finding container ec3c64255d9c91f1ded00f46d99df75cf60c86bfdf154201f652c4eff70dfef0: Status 404 returned error can't find the container with id ec3c64255d9c91f1ded00f46d99df75cf60c86bfdf154201f652c4eff70dfef0 Apr 21 07:51:43.702730 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:43.702699 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 07:46:41 +0000 UTC" deadline="2028-01-05 18:15:48.559157432 +0000 UTC" Apr 21 07:51:43.702812 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:43.702730 2570 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="14986h24m4.856430537s"
Apr 21 07:51:43.777225 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:43.777193 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd" event={"ID":"eb9ab596-4aad-415a-8f6b-3a6340b43812","Type":"ContainerStarted","Data":"60972b2d8d2f97e0dd7d9de8691010affecb9146489c4966a79999b39829be18"}
Apr 21 07:51:43.779665 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:43.779637 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v596t" event={"ID":"a7dd7ba4-8933-440b-9c8a-6710f4843c4d","Type":"ContainerStarted","Data":"ec3c64255d9c91f1ded00f46d99df75cf60c86bfdf154201f652c4eff70dfef0"}
Apr 21 07:51:43.780560 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:43.780539 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" event={"ID":"cc312109-c0ed-49ee-b44b-aebf49f43c92","Type":"ContainerStarted","Data":"1d69ddc3798951ba0671573396b1bb0d14e8ab7a12f429e755b634e4b1b21caa"}
Apr 21 07:51:43.781530 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:43.781507 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hc6tl" event={"ID":"28381821-cb18-40b6-a25f-b3a80e24f27a","Type":"ContainerStarted","Data":"9588f34b66d598b64dc856073824b3d1bbc056f9d3e79594cfbd59cf62d896e9"}
Apr 21 07:51:43.782455 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:43.782436 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6p77p" event={"ID":"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc","Type":"ContainerStarted","Data":"09e4db5d340ffaeb057041361a552b5e0eb8fff41ef4a58195d4bedb522a2271"}
Apr 21 07:51:43.783314 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:43.783288 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gc4bz" event={"ID":"dd018993-e8b6-4e3f-b574-2a1a71e75ce7","Type":"ContainerStarted","Data":"5240178be33f93cdc08919638718fd5cf748ba6608ec72a81e8940eb0da20e6d"}
Apr 21 07:51:43.784217 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:43.784196 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ghbj6" event={"ID":"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec","Type":"ContainerStarted","Data":"14f96530e05f1a91089a482ce1a94782a9b1b218369739a70262e540ca333f2f"}
Apr 21 07:51:43.785141 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:43.785123 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p59jc" event={"ID":"c7b371b9-c08d-40a3-b1c7-71f402fdf061","Type":"ContainerStarted","Data":"5ef97e9337caef114041b5841f5f8e9188587805f9ea5ce22348f3eef231934d"}
Apr 21 07:51:43.786220 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:43.786201 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-x9g68" event={"ID":"b142b7ae-a1de-435f-a326-eb9c1b40ba2e","Type":"ContainerStarted","Data":"9f31d51deff631aa69cf36308d88f348b970dd12b3e3661334e713cc502a530f"}
Apr 21 07:51:44.302484 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:44.299569 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs\") pod \"network-metrics-daemon-bbxng\" (UID: \"62e778a8-8270-4560-9d0d-41a95a3c9c5f\") " pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:51:44.302484 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:44.299739 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:44.302484 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:44.299806 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs podName:62e778a8-8270-4560-9d0d-41a95a3c9c5f nodeName:}" failed. No retries permitted until 2026-04-21 07:51:46.299788817 +0000 UTC m=+6.138176161 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs") pod "network-metrics-daemon-bbxng" (UID: "62e778a8-8270-4560-9d0d-41a95a3c9c5f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:44.501760 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:44.501549 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prkcp\" (UniqueName: \"kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp\") pod \"network-check-target-htrsr\" (UID: \"9cfb4a7f-bf8c-41ae-9276-00c43802c31a\") " pod="openshift-network-diagnostics/network-check-target-htrsr"
Apr 21 07:51:44.501760 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:44.501730 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 07:51:44.501760 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:44.501750 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 07:51:44.501760 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:44.501763 2570 projected.go:194] Error preparing data for projected volume kube-api-access-prkcp for pod openshift-network-diagnostics/network-check-target-htrsr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:44.502155 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:44.501821 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp podName:9cfb4a7f-bf8c-41ae-9276-00c43802c31a nodeName:}" failed. No retries permitted until 2026-04-21 07:51:46.501803564 +0000 UTC m=+6.340190896 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-prkcp" (UniqueName: "kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp") pod "network-check-target-htrsr" (UID: "9cfb4a7f-bf8c-41ae-9276-00c43802c31a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:44.773265 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:44.772516 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-htrsr"
Apr 21 07:51:44.773265 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:44.772641 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-htrsr" podUID="9cfb4a7f-bf8c-41ae-9276-00c43802c31a"
Apr 21 07:51:44.773265 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:44.772743 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:51:44.773265 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:44.772822 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbxng" podUID="62e778a8-8270-4560-9d0d-41a95a3c9c5f"
Apr 21 07:51:44.795739 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:44.795049 2570 generic.go:358] "Generic (PLEG): container finished" podID="2e9938a109a193e2a24eba106814c8e0" containerID="acfefaca721cc4d3409d902654df75151eec6f4ff57f5ec045490590a43276fa" exitCode=0
Apr 21 07:51:44.795739 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:44.795125 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal" event={"ID":"2e9938a109a193e2a24eba106814c8e0","Type":"ContainerDied","Data":"acfefaca721cc4d3409d902654df75151eec6f4ff57f5ec045490590a43276fa"}
Apr 21 07:51:44.810651 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:44.810129 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-194.ec2.internal" event={"ID":"dc75abb31154ae5273a07d0d5f2959e8","Type":"ContainerStarted","Data":"12dcd389566b606f43851c0035b01a482b797549db2a3e046acf66876ae41334"}
Apr 21 07:51:45.825849 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:45.825762 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal" event={"ID":"2e9938a109a193e2a24eba106814c8e0","Type":"ContainerStarted","Data":"1659958a1253ad31d0b3b42f4fbb86fcac53731516a2cfb7de71a2733361d043"}
Apr 21 07:51:45.840502 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:45.839780 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-194.ec2.internal" podStartSLOduration=3.839759688 podStartE2EDuration="3.839759688s" podCreationTimestamp="2026-04-21 07:51:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:51:44.822953994 +0000 UTC m=+4.661341342" watchObservedRunningTime="2026-04-21 07:51:45.839759688 +0000 UTC m=+5.678147040"
Apr 21 07:51:46.317951 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:46.317312 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs\") pod \"network-metrics-daemon-bbxng\" (UID: \"62e778a8-8270-4560-9d0d-41a95a3c9c5f\") " pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:51:46.317951 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:46.317492 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:46.317951 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:46.317561 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs podName:62e778a8-8270-4560-9d0d-41a95a3c9c5f nodeName:}" failed. No retries permitted until 2026-04-21 07:51:50.317539873 +0000 UTC m=+10.155927223 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs") pod "network-metrics-daemon-bbxng" (UID: "62e778a8-8270-4560-9d0d-41a95a3c9c5f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:46.519644 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:46.518578 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prkcp\" (UniqueName: \"kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp\") pod \"network-check-target-htrsr\" (UID: \"9cfb4a7f-bf8c-41ae-9276-00c43802c31a\") " pod="openshift-network-diagnostics/network-check-target-htrsr"
Apr 21 07:51:46.519644 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:46.518760 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 07:51:46.519644 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:46.518781 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 07:51:46.519644 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:46.518794 2570 projected.go:194] Error preparing data for projected volume kube-api-access-prkcp for pod openshift-network-diagnostics/network-check-target-htrsr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:46.519644 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:46.518854 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp podName:9cfb4a7f-bf8c-41ae-9276-00c43802c31a nodeName:}" failed. No retries permitted until 2026-04-21 07:51:50.51883566 +0000 UTC m=+10.357223010 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-prkcp" (UniqueName: "kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp") pod "network-check-target-htrsr" (UID: "9cfb4a7f-bf8c-41ae-9276-00c43802c31a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:46.770112 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:46.769594 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:51:46.770112 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:46.769622 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-htrsr"
Apr 21 07:51:46.770112 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:46.769744 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbxng" podUID="62e778a8-8270-4560-9d0d-41a95a3c9c5f"
Apr 21 07:51:46.770112 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:46.769858 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-htrsr" podUID="9cfb4a7f-bf8c-41ae-9276-00c43802c31a"
Apr 21 07:51:48.771943 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:48.771909 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:51:48.772443 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:48.771958 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-htrsr"
Apr 21 07:51:48.772443 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:48.772053 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbxng" podUID="62e778a8-8270-4560-9d0d-41a95a3c9c5f"
Apr 21 07:51:48.772443 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:48.772183 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-htrsr" podUID="9cfb4a7f-bf8c-41ae-9276-00c43802c31a"
Apr 21 07:51:50.352623 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:50.352584 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs\") pod \"network-metrics-daemon-bbxng\" (UID: \"62e778a8-8270-4560-9d0d-41a95a3c9c5f\") " pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:51:50.353086 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:50.352719 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:50.353086 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:50.352781 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs podName:62e778a8-8270-4560-9d0d-41a95a3c9c5f nodeName:}" failed. No retries permitted until 2026-04-21 07:51:58.352762179 +0000 UTC m=+18.191149524 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs") pod "network-metrics-daemon-bbxng" (UID: "62e778a8-8270-4560-9d0d-41a95a3c9c5f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:50.553722 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:50.553685 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prkcp\" (UniqueName: \"kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp\") pod \"network-check-target-htrsr\" (UID: \"9cfb4a7f-bf8c-41ae-9276-00c43802c31a\") " pod="openshift-network-diagnostics/network-check-target-htrsr"
Apr 21 07:51:50.553909 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:50.553841 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 07:51:50.553909 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:50.553877 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 07:51:50.553909 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:50.553891 2570 projected.go:194] Error preparing data for projected volume kube-api-access-prkcp for pod openshift-network-diagnostics/network-check-target-htrsr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:50.554095 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:50.553945 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp podName:9cfb4a7f-bf8c-41ae-9276-00c43802c31a nodeName:}" failed. No retries permitted until 2026-04-21 07:51:58.553927526 +0000 UTC m=+18.392314860 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-prkcp" (UniqueName: "kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp") pod "network-check-target-htrsr" (UID: "9cfb4a7f-bf8c-41ae-9276-00c43802c31a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:50.770650 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:50.770573 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:51:50.770807 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:50.770681 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbxng" podUID="62e778a8-8270-4560-9d0d-41a95a3c9c5f"
Apr 21 07:51:50.770807 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:50.770763 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-htrsr"
Apr 21 07:51:50.770936 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:50.770883 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-htrsr" podUID="9cfb4a7f-bf8c-41ae-9276-00c43802c31a"
Apr 21 07:51:52.769945 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:52.769853 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:51:52.769945 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:52.769896 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-htrsr"
Apr 21 07:51:52.770527 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:52.770031 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbxng" podUID="62e778a8-8270-4560-9d0d-41a95a3c9c5f"
Apr 21 07:51:52.770527 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:52.770494 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-htrsr" podUID="9cfb4a7f-bf8c-41ae-9276-00c43802c31a"
Apr 21 07:51:54.495572 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:54.495515 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-194.ec2.internal" podStartSLOduration=12.495500346 podStartE2EDuration="12.495500346s" podCreationTimestamp="2026-04-21 07:51:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:51:45.840482581 +0000 UTC m=+5.678869932" watchObservedRunningTime="2026-04-21 07:51:54.495500346 +0000 UTC m=+14.333887694"
Apr 21 07:51:54.496000 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:54.495704 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-pnsks"]
Apr 21 07:51:54.498632 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:54.498614 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:51:54.498758 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:54.498687 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pnsks" podUID="a242e9c5-b6e9-4008-8745-35d3c8424baf"
Apr 21 07:51:54.579450 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:54.579409 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret\") pod \"global-pull-secret-syncer-pnsks\" (UID: \"a242e9c5-b6e9-4008-8745-35d3c8424baf\") " pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:51:54.579614 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:54.579515 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a242e9c5-b6e9-4008-8745-35d3c8424baf-dbus\") pod \"global-pull-secret-syncer-pnsks\" (UID: \"a242e9c5-b6e9-4008-8745-35d3c8424baf\") " pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:51:54.579614 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:54.579552 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a242e9c5-b6e9-4008-8745-35d3c8424baf-kubelet-config\") pod \"global-pull-secret-syncer-pnsks\" (UID: \"a242e9c5-b6e9-4008-8745-35d3c8424baf\") " pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:51:54.680512 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:54.680465 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a242e9c5-b6e9-4008-8745-35d3c8424baf-dbus\") pod \"global-pull-secret-syncer-pnsks\" (UID: \"a242e9c5-b6e9-4008-8745-35d3c8424baf\") " pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:51:54.680512 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:54.680515 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a242e9c5-b6e9-4008-8745-35d3c8424baf-kubelet-config\") pod \"global-pull-secret-syncer-pnsks\" (UID: \"a242e9c5-b6e9-4008-8745-35d3c8424baf\") " pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:51:54.680729 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:54.680579 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret\") pod \"global-pull-secret-syncer-pnsks\" (UID: \"a242e9c5-b6e9-4008-8745-35d3c8424baf\") " pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:51:54.680729 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:54.680659 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a242e9c5-b6e9-4008-8745-35d3c8424baf-dbus\") pod \"global-pull-secret-syncer-pnsks\" (UID: \"a242e9c5-b6e9-4008-8745-35d3c8424baf\") " pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:51:54.680729 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:54.680664 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a242e9c5-b6e9-4008-8745-35d3c8424baf-kubelet-config\") pod \"global-pull-secret-syncer-pnsks\" (UID: \"a242e9c5-b6e9-4008-8745-35d3c8424baf\") " pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:51:54.680729 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:54.680697 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 07:51:54.680925 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:54.680757 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret podName:a242e9c5-b6e9-4008-8745-35d3c8424baf nodeName:}" failed. No retries permitted until 2026-04-21 07:51:55.180742787 +0000 UTC m=+15.019130113 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret") pod "global-pull-secret-syncer-pnsks" (UID: "a242e9c5-b6e9-4008-8745-35d3c8424baf") : object "kube-system"/"original-pull-secret" not registered
Apr 21 07:51:54.770137 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:54.770054 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-htrsr"
Apr 21 07:51:54.770287 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:54.770247 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-htrsr" podUID="9cfb4a7f-bf8c-41ae-9276-00c43802c31a"
Apr 21 07:51:54.770362 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:54.770326 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:51:54.770486 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:54.770458 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbxng" podUID="62e778a8-8270-4560-9d0d-41a95a3c9c5f"
Apr 21 07:51:55.184795 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:55.184761 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret\") pod \"global-pull-secret-syncer-pnsks\" (UID: \"a242e9c5-b6e9-4008-8745-35d3c8424baf\") " pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:51:55.184986 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:55.184933 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 07:51:55.185051 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:55.185002 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret podName:a242e9c5-b6e9-4008-8745-35d3c8424baf nodeName:}" failed. No retries permitted until 2026-04-21 07:51:56.184981931 +0000 UTC m=+16.023369264 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret") pod "global-pull-secret-syncer-pnsks" (UID: "a242e9c5-b6e9-4008-8745-35d3c8424baf") : object "kube-system"/"original-pull-secret" not registered
Apr 21 07:51:55.770156 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:55.770125 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:51:55.770546 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:55.770250 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pnsks" podUID="a242e9c5-b6e9-4008-8745-35d3c8424baf"
Apr 21 07:51:56.192630 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:56.192587 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret\") pod \"global-pull-secret-syncer-pnsks\" (UID: \"a242e9c5-b6e9-4008-8745-35d3c8424baf\") " pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:51:56.192799 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:56.192749 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 07:51:56.192953 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:56.192829 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret podName:a242e9c5-b6e9-4008-8745-35d3c8424baf nodeName:}" failed. No retries permitted until 2026-04-21 07:51:58.192809213 +0000 UTC m=+18.031196560 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret") pod "global-pull-secret-syncer-pnsks" (UID: "a242e9c5-b6e9-4008-8745-35d3c8424baf") : object "kube-system"/"original-pull-secret" not registered
Apr 21 07:51:56.770073 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:56.770034 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-htrsr"
Apr 21 07:51:56.770239 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:56.770148 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-htrsr" podUID="9cfb4a7f-bf8c-41ae-9276-00c43802c31a"
Apr 21 07:51:56.770562 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:56.770234 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:51:56.770562 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:56.770357 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbxng" podUID="62e778a8-8270-4560-9d0d-41a95a3c9c5f"
Apr 21 07:51:57.769598 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:57.769560 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:51:57.769764 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:57.769670 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pnsks" podUID="a242e9c5-b6e9-4008-8745-35d3c8424baf"
Apr 21 07:51:58.207089 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:58.207044 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret\") pod \"global-pull-secret-syncer-pnsks\" (UID: \"a242e9c5-b6e9-4008-8745-35d3c8424baf\") " pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:51:58.207495 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:58.207192 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 07:51:58.207495 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:58.207256 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret podName:a242e9c5-b6e9-4008-8745-35d3c8424baf nodeName:}" failed. No retries permitted until 2026-04-21 07:52:02.207242064 +0000 UTC m=+22.045629390 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret") pod "global-pull-secret-syncer-pnsks" (UID: "a242e9c5-b6e9-4008-8745-35d3c8424baf") : object "kube-system"/"original-pull-secret" not registered
Apr 21 07:51:58.408896 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:58.408843 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs\") pod \"network-metrics-daemon-bbxng\" (UID: \"62e778a8-8270-4560-9d0d-41a95a3c9c5f\") " pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:51:58.409062 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:58.409004 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:58.409114 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:58.409071 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs podName:62e778a8-8270-4560-9d0d-41a95a3c9c5f nodeName:}" failed. No retries permitted until 2026-04-21 07:52:14.409055805 +0000 UTC m=+34.247443132 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs") pod "network-metrics-daemon-bbxng" (UID: "62e778a8-8270-4560-9d0d-41a95a3c9c5f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:51:58.610258 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:58.610223 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prkcp\" (UniqueName: \"kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp\") pod \"network-check-target-htrsr\" (UID: \"9cfb4a7f-bf8c-41ae-9276-00c43802c31a\") " pod="openshift-network-diagnostics/network-check-target-htrsr"
Apr 21 07:51:58.610422 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:58.610387 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 07:51:58.610422 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:58.610415 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 07:51:58.610496 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:58.610429 2570 projected.go:194] Error preparing data for projected volume kube-api-access-prkcp for pod openshift-network-diagnostics/network-check-target-htrsr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:51:58.610496 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:58.610492 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp podName:9cfb4a7f-bf8c-41ae-9276-00c43802c31a nodeName:}" failed.
No retries permitted until 2026-04-21 07:52:14.610472509 +0000 UTC m=+34.448859853 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-prkcp" (UniqueName: "kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp") pod "network-check-target-htrsr" (UID: "9cfb4a7f-bf8c-41ae-9276-00c43802c31a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 07:51:58.769796 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:58.769753 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-htrsr" Apr 21 07:51:58.769984 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:58.769880 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-htrsr" podUID="9cfb4a7f-bf8c-41ae-9276-00c43802c31a" Apr 21 07:51:58.769984 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:58.769928 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbxng" Apr 21 07:51:58.770095 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:58.770050 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bbxng" podUID="62e778a8-8270-4560-9d0d-41a95a3c9c5f" Apr 21 07:51:59.769776 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:51:59.769732 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pnsks" Apr 21 07:51:59.770201 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:51:59.769850 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pnsks" podUID="a242e9c5-b6e9-4008-8745-35d3c8424baf" Apr 21 07:52:00.770499 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:00.770477 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbxng" Apr 21 07:52:00.770906 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:00.770585 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbxng" podUID="62e778a8-8270-4560-9d0d-41a95a3c9c5f" Apr 21 07:52:00.770906 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:00.770656 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-htrsr" Apr 21 07:52:00.770906 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:00.770742 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-htrsr" podUID="9cfb4a7f-bf8c-41ae-9276-00c43802c31a" Apr 21 07:52:01.769966 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.769638 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pnsks" Apr 21 07:52:01.770099 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:01.770046 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pnsks" podUID="a242e9c5-b6e9-4008-8745-35d3c8424baf" Apr 21 07:52:01.852078 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.852052 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kshzg_cc312109-c0ed-49ee-b44b-aebf49f43c92/ovn-acl-logging/0.log" Apr 21 07:52:01.852858 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.852377 2570 generic.go:358] "Generic (PLEG): container finished" podID="cc312109-c0ed-49ee-b44b-aebf49f43c92" containerID="ac166c60b5b3f11e3e85674e810250109fc08fff01c600f4b941410f2eaca05c" exitCode=1 Apr 21 07:52:01.852858 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.852427 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" event={"ID":"cc312109-c0ed-49ee-b44b-aebf49f43c92","Type":"ContainerStarted","Data":"46272c863cfd6a92368fcb62ef8ef32f1fb8b7a1f6fb5b545cfe4bcd85cb5df2"} Apr 21 07:52:01.852858 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.852458 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" event={"ID":"cc312109-c0ed-49ee-b44b-aebf49f43c92","Type":"ContainerStarted","Data":"827a7aec9a2f45be85b3f0ea96f22a72ddd4f79722c94c1c6263530fd7910048"} Apr 21 07:52:01.852858 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.852471 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" event={"ID":"cc312109-c0ed-49ee-b44b-aebf49f43c92","Type":"ContainerStarted","Data":"87c527fabebebd56f6e6e9cdaddef88cfd641f77dc47a7b9de58f75b4148fc03"} Apr 21 07:52:01.852858 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.852483 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" event={"ID":"cc312109-c0ed-49ee-b44b-aebf49f43c92","Type":"ContainerStarted","Data":"02350d994b36a4624ea8e3ea7a753e558f9be38718d31315add964deeb2593fd"} Apr 21 07:52:01.852858 
ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.852493 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" event={"ID":"cc312109-c0ed-49ee-b44b-aebf49f43c92","Type":"ContainerDied","Data":"ac166c60b5b3f11e3e85674e810250109fc08fff01c600f4b941410f2eaca05c"} Apr 21 07:52:01.852858 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.852506 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" event={"ID":"cc312109-c0ed-49ee-b44b-aebf49f43c92","Type":"ContainerStarted","Data":"69f9d82a5834f16ecd4edce9bdd260009effbd1c68f664248916913fcd56bee8"} Apr 21 07:52:01.853961 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.853924 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-hc6tl" event={"ID":"28381821-cb18-40b6-a25f-b3a80e24f27a","Type":"ContainerStarted","Data":"ba79a8f0ca6bcf2aa209d05fbc82509a90127d308b96f81a5abb4cab27493746"} Apr 21 07:52:01.855501 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.855475 2570 generic.go:358] "Generic (PLEG): container finished" podID="f1b4cd3f-6d6c-4602-aff3-3e7007b411bc" containerID="ecb3ed424e62242eda0b5f28538b01d0316c1fa4c61a67e3ab63474949cc3d83" exitCode=0 Apr 21 07:52:01.855616 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.855548 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6p77p" event={"ID":"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc","Type":"ContainerDied","Data":"ecb3ed424e62242eda0b5f28538b01d0316c1fa4c61a67e3ab63474949cc3d83"} Apr 21 07:52:01.856945 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.856920 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-gc4bz" event={"ID":"dd018993-e8b6-4e3f-b574-2a1a71e75ce7","Type":"ContainerStarted","Data":"f5d1ecae5951fb89be07b6e460a36a64eedf07c610db3e6c8ae08f7d09ee294e"} Apr 21 07:52:01.858299 ip-10-0-137-194 kubenswrapper[2570]: I0421 
07:52:01.858268 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ghbj6" event={"ID":"4ec4ea1e-b376-4d45-b221-3c152bf5f6ec","Type":"ContainerStarted","Data":"d9489824b4b2569017ebf9e9ffb439a4d476b0241df1db5f9e2bc5c7d949fb9e"} Apr 21 07:52:01.859640 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.859617 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-p59jc" event={"ID":"c7b371b9-c08d-40a3-b1c7-71f402fdf061","Type":"ContainerStarted","Data":"a2cc94e070d99831be249eaac881fab7b5a33702de0eae5cbd58cc2fdd5cab71"} Apr 21 07:52:01.860994 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.860973 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd" event={"ID":"eb9ab596-4aad-415a-8f6b-3a6340b43812","Type":"ContainerStarted","Data":"c259947b48ca5c11edf9492f447585b1c06872f570df0c8948d5037ca129faa4"} Apr 21 07:52:01.862229 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.862205 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-v596t" event={"ID":"a7dd7ba4-8933-440b-9c8a-6710f4843c4d","Type":"ContainerStarted","Data":"b3034f6e3059aa3ee95a5b8c3d46d1ca44f8d0e5d5972a20d71931636b298e47"} Apr 21 07:52:01.867480 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.867444 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-hc6tl" podStartSLOduration=4.794247043 podStartE2EDuration="21.867433826s" podCreationTimestamp="2026-04-21 07:51:40 +0000 UTC" firstStartedPulling="2026-04-21 07:51:43.681926704 +0000 UTC m=+3.520314033" lastFinishedPulling="2026-04-21 07:52:00.755113477 +0000 UTC m=+20.593500816" observedRunningTime="2026-04-21 07:52:01.866964087 +0000 UTC m=+21.705351438" watchObservedRunningTime="2026-04-21 07:52:01.867433826 +0000 UTC m=+21.705821174" Apr 21 07:52:01.879288 ip-10-0-137-194 kubenswrapper[2570]: I0421 
07:52:01.879244 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-gc4bz" podStartSLOduration=4.82544212 podStartE2EDuration="21.879235702s" podCreationTimestamp="2026-04-21 07:51:40 +0000 UTC" firstStartedPulling="2026-04-21 07:51:43.678263194 +0000 UTC m=+3.516650536" lastFinishedPulling="2026-04-21 07:52:00.732056786 +0000 UTC m=+20.570444118" observedRunningTime="2026-04-21 07:52:01.878996656 +0000 UTC m=+21.717384031" watchObservedRunningTime="2026-04-21 07:52:01.879235702 +0000 UTC m=+21.717623050" Apr 21 07:52:01.909311 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.909264 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ghbj6" podStartSLOduration=4.8310574509999995 podStartE2EDuration="21.909249003s" podCreationTimestamp="2026-04-21 07:51:40 +0000 UTC" firstStartedPulling="2026-04-21 07:51:43.676950553 +0000 UTC m=+3.515337887" lastFinishedPulling="2026-04-21 07:52:00.755142095 +0000 UTC m=+20.593529439" observedRunningTime="2026-04-21 07:52:01.892815464 +0000 UTC m=+21.731202814" watchObservedRunningTime="2026-04-21 07:52:01.909249003 +0000 UTC m=+21.747636352" Apr 21 07:52:01.909498 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.909471 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-p59jc" podStartSLOduration=4.820369746 podStartE2EDuration="21.909462112s" podCreationTimestamp="2026-04-21 07:51:40 +0000 UTC" firstStartedPulling="2026-04-21 07:51:43.685625936 +0000 UTC m=+3.524013277" lastFinishedPulling="2026-04-21 07:52:00.774718314 +0000 UTC m=+20.613105643" observedRunningTime="2026-04-21 07:52:01.909379819 +0000 UTC m=+21.747767155" watchObservedRunningTime="2026-04-21 07:52:01.909462112 +0000 UTC m=+21.747849462" Apr 21 07:52:01.921539 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:01.921494 2570 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-image-registry/node-ca-v596t" podStartSLOduration=4.875661433 podStartE2EDuration="21.921481348s" podCreationTimestamp="2026-04-21 07:51:40 +0000 UTC" firstStartedPulling="2026-04-21 07:51:43.68615812 +0000 UTC m=+3.524545456" lastFinishedPulling="2026-04-21 07:52:00.731978038 +0000 UTC m=+20.570365371" observedRunningTime="2026-04-21 07:52:01.921086109 +0000 UTC m=+21.759473458" watchObservedRunningTime="2026-04-21 07:52:01.921481348 +0000 UTC m=+21.759868712" Apr 21 07:52:02.241896 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:02.241859 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret\") pod \"global-pull-secret-syncer-pnsks\" (UID: \"a242e9c5-b6e9-4008-8745-35d3c8424baf\") " pod="kube-system/global-pull-secret-syncer-pnsks" Apr 21 07:52:02.242017 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:02.242000 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 21 07:52:02.242088 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:02.242076 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret podName:a242e9c5-b6e9-4008-8745-35d3c8424baf nodeName:}" failed. No retries permitted until 2026-04-21 07:52:10.242062012 +0000 UTC m=+30.080449339 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret") pod "global-pull-secret-syncer-pnsks" (UID: "a242e9c5-b6e9-4008-8745-35d3c8424baf") : object "kube-system"/"original-pull-secret" not registered Apr 21 07:52:02.375275 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:02.375247 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 07:52:02.741704 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:02.741615 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T07:52:02.37527091Z","UUID":"44ec4fe0-f6da-4084-91ea-c8952c9bd184","Handler":null,"Name":"","Endpoint":""} Apr 21 07:52:02.744090 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:02.744058 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 07:52:02.744209 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:02.744102 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 07:52:02.769748 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:02.769723 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-htrsr" Apr 21 07:52:02.769901 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:02.769814 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-htrsr" podUID="9cfb4a7f-bf8c-41ae-9276-00c43802c31a" Apr 21 07:52:02.769975 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:02.769920 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbxng" Apr 21 07:52:02.770057 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:02.770034 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbxng" podUID="62e778a8-8270-4560-9d0d-41a95a3c9c5f" Apr 21 07:52:02.865612 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:02.865576 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-x9g68" event={"ID":"b142b7ae-a1de-435f-a326-eb9c1b40ba2e","Type":"ContainerStarted","Data":"9ba9deea8ee539626ff0f5802921469719d3b7ae34316ff285d86ac10abe37d5"} Apr 21 07:52:02.867740 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:02.867704 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd" event={"ID":"eb9ab596-4aad-415a-8f6b-3a6340b43812","Type":"ContainerStarted","Data":"b7b1db8842ab37abe567c19e4f5990526c2ca510370889e34cc50dab2a13f746"} Apr 21 07:52:03.769927 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:03.769714 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-pnsks" Apr 21 07:52:03.770067 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:03.770041 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pnsks" podUID="a242e9c5-b6e9-4008-8745-35d3c8424baf" Apr 21 07:52:03.871276 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:03.871188 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd" event={"ID":"eb9ab596-4aad-415a-8f6b-3a6340b43812","Type":"ContainerStarted","Data":"9f9883ed6a9d3a5a439090cdd61644b7aee3056826e6892b39ecc9d46c00932a"} Apr 21 07:52:03.874357 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:03.874332 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kshzg_cc312109-c0ed-49ee-b44b-aebf49f43c92/ovn-acl-logging/0.log" Apr 21 07:52:03.874819 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:03.874793 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" event={"ID":"cc312109-c0ed-49ee-b44b-aebf49f43c92","Type":"ContainerStarted","Data":"8487ad605fc3af1daeb630933fe77eac16e1ea58cfe92be52958cb9faf779ff4"} Apr 21 07:52:03.885522 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:03.885476 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-x9g68" podStartSLOduration=6.834011294 podStartE2EDuration="23.885460429s" podCreationTimestamp="2026-04-21 07:51:40 +0000 UTC" firstStartedPulling="2026-04-21 07:51:43.680904614 +0000 UTC m=+3.519291942" lastFinishedPulling="2026-04-21 07:52:00.732353733 +0000 UTC m=+20.570741077" 
observedRunningTime="2026-04-21 07:52:02.879213366 +0000 UTC m=+22.717600716" watchObservedRunningTime="2026-04-21 07:52:03.885460429 +0000 UTC m=+23.723847779" Apr 21 07:52:03.885807 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:03.885783 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2j5gd" podStartSLOduration=3.9392295109999997 podStartE2EDuration="23.885779444s" podCreationTimestamp="2026-04-21 07:51:40 +0000 UTC" firstStartedPulling="2026-04-21 07:51:43.676681673 +0000 UTC m=+3.515069002" lastFinishedPulling="2026-04-21 07:52:03.623231602 +0000 UTC m=+23.461618935" observedRunningTime="2026-04-21 07:52:03.885658214 +0000 UTC m=+23.724045563" watchObservedRunningTime="2026-04-21 07:52:03.885779444 +0000 UTC m=+23.724166793" Apr 21 07:52:04.769344 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:04.769313 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbxng" Apr 21 07:52:04.769344 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:04.769343 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-htrsr" Apr 21 07:52:04.769582 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:04.769455 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-bbxng" podUID="62e778a8-8270-4560-9d0d-41a95a3c9c5f" Apr 21 07:52:04.769641 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:04.769580 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-htrsr" podUID="9cfb4a7f-bf8c-41ae-9276-00c43802c31a" Apr 21 07:52:05.492641 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:05.492607 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-gc4bz" Apr 21 07:52:05.493388 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:05.493371 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-gc4bz" Apr 21 07:52:05.769919 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:05.769832 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pnsks" Apr 21 07:52:05.770078 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:05.769966 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pnsks" podUID="a242e9c5-b6e9-4008-8745-35d3c8424baf" Apr 21 07:52:05.878776 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:05.878747 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-gc4bz" Apr 21 07:52:05.879289 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:05.879268 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-gc4bz" Apr 21 07:52:06.772755 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:06.772597 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbxng" Apr 21 07:52:06.773171 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:06.772597 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-htrsr" Apr 21 07:52:06.773171 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:06.772887 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbxng" podUID="62e778a8-8270-4560-9d0d-41a95a3c9c5f" Apr 21 07:52:06.773171 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:06.772920 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-htrsr" podUID="9cfb4a7f-bf8c-41ae-9276-00c43802c31a" Apr 21 07:52:06.882615 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:06.882573 2570 generic.go:358] "Generic (PLEG): container finished" podID="f1b4cd3f-6d6c-4602-aff3-3e7007b411bc" containerID="7e9d8c1b5fd1fa5c61cb8413b325818dae8a7d1a9ee2e97c758a167f7da12f93" exitCode=0 Apr 21 07:52:06.882758 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:06.882639 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6p77p" event={"ID":"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc","Type":"ContainerDied","Data":"7e9d8c1b5fd1fa5c61cb8413b325818dae8a7d1a9ee2e97c758a167f7da12f93"} Apr 21 07:52:06.885821 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:06.885805 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kshzg_cc312109-c0ed-49ee-b44b-aebf49f43c92/ovn-acl-logging/0.log" Apr 21 07:52:06.886138 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:06.886118 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" event={"ID":"cc312109-c0ed-49ee-b44b-aebf49f43c92","Type":"ContainerStarted","Data":"db10105b2aef88144a502ec7b48a6896612cdfa34f42a943cbdc5812b4a0b76f"} Apr 21 07:52:06.886576 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:06.886541 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:52:06.886726 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:06.886585 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:52:06.886726 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:06.886598 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:52:06.886726 ip-10-0-137-194 
kubenswrapper[2570]: I0421 07:52:06.886727 2570 scope.go:117] "RemoveContainer" containerID="ac166c60b5b3f11e3e85674e810250109fc08fff01c600f4b941410f2eaca05c"
Apr 21 07:52:06.902047 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:06.902027 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:52:06.903317 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:06.903294 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kshzg"
Apr 21 07:52:07.769217 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:07.769132 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:52:07.769377 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:07.769262 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pnsks" podUID="a242e9c5-b6e9-4008-8745-35d3c8424baf"
Apr 21 07:52:07.891804 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:07.891780 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kshzg_cc312109-c0ed-49ee-b44b-aebf49f43c92/ovn-acl-logging/0.log"
Apr 21 07:52:07.892251 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:07.892135 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" event={"ID":"cc312109-c0ed-49ee-b44b-aebf49f43c92","Type":"ContainerStarted","Data":"b596dae3dcb52b3c2b55b3b8ff86edf505e8ffc30bbbf5b5013a28d66432c3b5"}
Apr 21 07:52:07.893910 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:07.893886 2570 generic.go:358] "Generic (PLEG): container finished" podID="f1b4cd3f-6d6c-4602-aff3-3e7007b411bc" containerID="d92f1860f3a71456255df97c2ca5be9ad76353634682b9d1e1c2afd057b9387b" exitCode=0
Apr 21 07:52:07.894028 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:07.893964 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6p77p" event={"ID":"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc","Type":"ContainerDied","Data":"d92f1860f3a71456255df97c2ca5be9ad76353634682b9d1e1c2afd057b9387b"}
Apr 21 07:52:07.917266 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:07.917224 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" podStartSLOduration=10.783238633 podStartE2EDuration="27.917210939s" podCreationTimestamp="2026-04-21 07:51:40 +0000 UTC" firstStartedPulling="2026-04-21 07:51:43.685498935 +0000 UTC m=+3.523886265" lastFinishedPulling="2026-04-21 07:52:00.819471226 +0000 UTC m=+20.657858571" observedRunningTime="2026-04-21 07:52:07.916800263 +0000 UTC m=+27.755187646" watchObservedRunningTime="2026-04-21 07:52:07.917210939 +0000 UTC m=+27.755598332"
Apr 21 07:52:08.520129 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:08.519972 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pnsks"]
Apr 21 07:52:08.520255 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:08.520223 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:52:08.520356 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:08.520330 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pnsks" podUID="a242e9c5-b6e9-4008-8745-35d3c8424baf"
Apr 21 07:52:08.523279 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:08.523257 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bbxng"]
Apr 21 07:52:08.523391 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:08.523377 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:52:08.523506 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:08.523481 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbxng" podUID="62e778a8-8270-4560-9d0d-41a95a3c9c5f"
Apr 21 07:52:08.523839 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:08.523818 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-htrsr"]
Apr 21 07:52:08.523965 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:08.523925 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-htrsr"
Apr 21 07:52:08.524028 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:08.524004 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-htrsr" podUID="9cfb4a7f-bf8c-41ae-9276-00c43802c31a"
Apr 21 07:52:08.898218 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:08.898186 2570 generic.go:358] "Generic (PLEG): container finished" podID="f1b4cd3f-6d6c-4602-aff3-3e7007b411bc" containerID="6496dcba8ff9fd8ab8ef217861e2f554c33139040441b3533ca14488fee09c55" exitCode=0
Apr 21 07:52:08.898668 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:08.898275 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6p77p" event={"ID":"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc","Type":"ContainerDied","Data":"6496dcba8ff9fd8ab8ef217861e2f554c33139040441b3533ca14488fee09c55"}
Apr 21 07:52:09.770154 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:09.770124 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-htrsr"
Apr 21 07:52:09.770411 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:09.770124 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:52:09.770411 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:09.770251 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-htrsr" podUID="9cfb4a7f-bf8c-41ae-9276-00c43802c31a"
Apr 21 07:52:09.770411 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:09.770331 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pnsks" podUID="a242e9c5-b6e9-4008-8745-35d3c8424baf"
Apr 21 07:52:10.301931 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:10.301677 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret\") pod \"global-pull-secret-syncer-pnsks\" (UID: \"a242e9c5-b6e9-4008-8745-35d3c8424baf\") " pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:52:10.301931 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:10.301829 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 21 07:52:10.302690 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:10.301928 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret podName:a242e9c5-b6e9-4008-8745-35d3c8424baf nodeName:}" failed. No retries permitted until 2026-04-21 07:52:26.301904463 +0000 UTC m=+46.140291791 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret") pod "global-pull-secret-syncer-pnsks" (UID: "a242e9c5-b6e9-4008-8745-35d3c8424baf") : object "kube-system"/"original-pull-secret" not registered
Apr 21 07:52:10.770433 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:10.770404 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:52:10.770597 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:10.770524 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbxng" podUID="62e778a8-8270-4560-9d0d-41a95a3c9c5f"
Apr 21 07:52:11.770088 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:11.770034 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:52:11.770571 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:11.770034 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-htrsr"
Apr 21 07:52:11.770571 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:11.770192 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pnsks" podUID="a242e9c5-b6e9-4008-8745-35d3c8424baf"
Apr 21 07:52:11.770571 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:11.770246 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-htrsr" podUID="9cfb4a7f-bf8c-41ae-9276-00c43802c31a"
Apr 21 07:52:12.769830 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:12.769793 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:52:12.770038 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:12.769963 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-bbxng" podUID="62e778a8-8270-4560-9d0d-41a95a3c9c5f"
Apr 21 07:52:13.769710 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:13.769623 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-htrsr"
Apr 21 07:52:13.769710 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:13.769625 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:52:13.770325 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:13.769751 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-htrsr" podUID="9cfb4a7f-bf8c-41ae-9276-00c43802c31a"
Apr 21 07:52:13.770325 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:13.769935 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pnsks" podUID="a242e9c5-b6e9-4008-8745-35d3c8424baf"
Apr 21 07:52:14.009858 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.009830 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-194.ec2.internal" event="NodeReady"
Apr 21 07:52:14.010050 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.009993 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 21 07:52:14.047605 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.047535 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7rk6g"]
Apr 21 07:52:14.079452 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.079426 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-pfqbt"]
Apr 21 07:52:14.079612 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.079588 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7rk6g"
Apr 21 07:52:14.082202 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.082036 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 21 07:52:14.082202 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.082064 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 21 07:52:14.082202 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.082036 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4xjgf\""
Apr 21 07:52:14.105232 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.105208 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7rk6g"]
Apr 21 07:52:14.105232 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.105234 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pfqbt"]
Apr 21 07:52:14.105390 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.105334 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pfqbt"
Apr 21 07:52:14.107680 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.107656 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 21 07:52:14.107680 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.107672 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 21 07:52:14.107822 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.107696 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pcpsl\""
Apr 21 07:52:14.107822 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.107656 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 21 07:52:14.230176 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.230137 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g"
Apr 21 07:52:14.230374 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.230226 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6329b105-bf72-40c1-ab25-2ba6f2aea17c-tmp-dir\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g"
Apr 21 07:52:14.230374 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.230340 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6329b105-bf72-40c1-ab25-2ba6f2aea17c-config-volume\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g"
Apr 21 07:52:14.230374 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.230372 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert\") pod \"ingress-canary-pfqbt\" (UID: \"1bcd6517-d770-467f-8536-8c7f4cdc772e\") " pod="openshift-ingress-canary/ingress-canary-pfqbt"
Apr 21 07:52:14.230501 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.230397 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-976qd\" (UniqueName: \"kubernetes.io/projected/6329b105-bf72-40c1-ab25-2ba6f2aea17c-kube-api-access-976qd\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g"
Apr 21 07:52:14.230501 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.230493 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8kmr\" (UniqueName: \"kubernetes.io/projected/1bcd6517-d770-467f-8536-8c7f4cdc772e-kube-api-access-t8kmr\") pod \"ingress-canary-pfqbt\" (UID: \"1bcd6517-d770-467f-8536-8c7f4cdc772e\") " pod="openshift-ingress-canary/ingress-canary-pfqbt"
Apr 21 07:52:14.331477 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.331403 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8kmr\" (UniqueName: \"kubernetes.io/projected/1bcd6517-d770-467f-8536-8c7f4cdc772e-kube-api-access-t8kmr\") pod \"ingress-canary-pfqbt\" (UID: \"1bcd6517-d770-467f-8536-8c7f4cdc772e\") " pod="openshift-ingress-canary/ingress-canary-pfqbt"
Apr 21 07:52:14.331477 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.331454 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g"
Apr 21 07:52:14.331477 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.331472 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6329b105-bf72-40c1-ab25-2ba6f2aea17c-tmp-dir\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g"
Apr 21 07:52:14.331736 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.331516 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6329b105-bf72-40c1-ab25-2ba6f2aea17c-config-volume\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g"
Apr 21 07:52:14.331736 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.331532 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert\") pod \"ingress-canary-pfqbt\" (UID: \"1bcd6517-d770-467f-8536-8c7f4cdc772e\") " pod="openshift-ingress-canary/ingress-canary-pfqbt"
Apr 21 07:52:14.331736 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.331550 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-976qd\" (UniqueName: \"kubernetes.io/projected/6329b105-bf72-40c1-ab25-2ba6f2aea17c-kube-api-access-976qd\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g"
Apr 21 07:52:14.331736 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:14.331569 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:52:14.331736 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:14.331655 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls podName:6329b105-bf72-40c1-ab25-2ba6f2aea17c nodeName:}" failed. No retries permitted until 2026-04-21 07:52:14.831633429 +0000 UTC m=+34.670020768 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls") pod "dns-default-7rk6g" (UID: "6329b105-bf72-40c1-ab25-2ba6f2aea17c") : secret "dns-default-metrics-tls" not found
Apr 21 07:52:14.331736 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:14.331659 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:52:14.331967 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:14.331745 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert podName:1bcd6517-d770-467f-8536-8c7f4cdc772e nodeName:}" failed. No retries permitted until 2026-04-21 07:52:14.831726981 +0000 UTC m=+34.670114322 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert") pod "ingress-canary-pfqbt" (UID: "1bcd6517-d770-467f-8536-8c7f4cdc772e") : secret "canary-serving-cert" not found
Apr 21 07:52:14.331967 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.331891 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6329b105-bf72-40c1-ab25-2ba6f2aea17c-tmp-dir\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g"
Apr 21 07:52:14.332081 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.332062 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6329b105-bf72-40c1-ab25-2ba6f2aea17c-config-volume\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g"
Apr 21 07:52:14.342191 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.342154 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-976qd\" (UniqueName: \"kubernetes.io/projected/6329b105-bf72-40c1-ab25-2ba6f2aea17c-kube-api-access-976qd\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g"
Apr 21 07:52:14.342335 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.342303 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8kmr\" (UniqueName: \"kubernetes.io/projected/1bcd6517-d770-467f-8536-8c7f4cdc772e-kube-api-access-t8kmr\") pod \"ingress-canary-pfqbt\" (UID: \"1bcd6517-d770-467f-8536-8c7f4cdc772e\") " pod="openshift-ingress-canary/ingress-canary-pfqbt"
Apr 21 07:52:14.432575 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.432541 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs\") pod \"network-metrics-daemon-bbxng\" (UID: \"62e778a8-8270-4560-9d0d-41a95a3c9c5f\") " pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:52:14.432767 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:14.432686 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:52:14.432767 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:14.432749 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs podName:62e778a8-8270-4560-9d0d-41a95a3c9c5f nodeName:}" failed. No retries permitted until 2026-04-21 07:52:46.432733619 +0000 UTC m=+66.271120948 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs") pod "network-metrics-daemon-bbxng" (UID: "62e778a8-8270-4560-9d0d-41a95a3c9c5f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 07:52:14.633649 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.633607 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prkcp\" (UniqueName: \"kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp\") pod \"network-check-target-htrsr\" (UID: \"9cfb4a7f-bf8c-41ae-9276-00c43802c31a\") " pod="openshift-network-diagnostics/network-check-target-htrsr"
Apr 21 07:52:14.633812 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:14.633767 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 07:52:14.633812 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:14.633789 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 07:52:14.633812 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:14.633799 2570 projected.go:194] Error preparing data for projected volume kube-api-access-prkcp for pod openshift-network-diagnostics/network-check-target-htrsr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:52:14.633934 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:14.633850 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp podName:9cfb4a7f-bf8c-41ae-9276-00c43802c31a nodeName:}" failed. No retries permitted until 2026-04-21 07:52:46.633836297 +0000 UTC m=+66.472223624 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-prkcp" (UniqueName: "kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp") pod "network-check-target-htrsr" (UID: "9cfb4a7f-bf8c-41ae-9276-00c43802c31a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 07:52:14.769935 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.769901 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:52:14.773002 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.772953 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 21 07:52:14.773002 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.772997 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9jd8h\""
Apr 21 07:52:14.836319 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.836296 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert\") pod \"ingress-canary-pfqbt\" (UID: \"1bcd6517-d770-467f-8536-8c7f4cdc772e\") " pod="openshift-ingress-canary/ingress-canary-pfqbt"
Apr 21 07:52:14.836416 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.836351 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g"
Apr 21 07:52:14.836454 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:14.836432 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:52:14.836489 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:14.836453 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:52:14.836520 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:14.836490 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert podName:1bcd6517-d770-467f-8536-8c7f4cdc772e nodeName:}" failed. No retries permitted until 2026-04-21 07:52:15.836474998 +0000 UTC m=+35.674862329 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert") pod "ingress-canary-pfqbt" (UID: "1bcd6517-d770-467f-8536-8c7f4cdc772e") : secret "canary-serving-cert" not found
Apr 21 07:52:14.836520 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:14.836503 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls podName:6329b105-bf72-40c1-ab25-2ba6f2aea17c nodeName:}" failed. No retries permitted until 2026-04-21 07:52:15.836497388 +0000 UTC m=+35.674884715 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls") pod "dns-default-7rk6g" (UID: "6329b105-bf72-40c1-ab25-2ba6f2aea17c") : secret "dns-default-metrics-tls" not found
Apr 21 07:52:14.912339 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:14.912107 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6p77p" event={"ID":"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc","Type":"ContainerStarted","Data":"b5fe6419c2a26c2c9e7024d5fbc9f1d552c88e1fac5686c8cc59e7a27aec1352"}
Apr 21 07:52:15.770008 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:15.769974 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-htrsr"
Apr 21 07:52:15.770430 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:15.770011 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pnsks"
Apr 21 07:52:15.772652 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:15.772636 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 07:52:15.773543 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:15.773526 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 21 07:52:15.773609 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:15.773527 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-f27q5\""
Apr 21 07:52:15.773609 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:15.773527 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 21 07:52:15.844472 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:15.844441 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert\") pod \"ingress-canary-pfqbt\" (UID: \"1bcd6517-d770-467f-8536-8c7f4cdc772e\") " pod="openshift-ingress-canary/ingress-canary-pfqbt"
Apr 21 07:52:15.844641 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:15.844495 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g"
Apr 21 07:52:15.844641 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:15.844589 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:52:15.844641 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:15.844592 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:52:15.844737 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:15.844653 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls podName:6329b105-bf72-40c1-ab25-2ba6f2aea17c nodeName:}" failed. No retries permitted until 2026-04-21 07:52:17.844639269 +0000 UTC m=+37.683026596 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls") pod "dns-default-7rk6g" (UID: "6329b105-bf72-40c1-ab25-2ba6f2aea17c") : secret "dns-default-metrics-tls" not found
Apr 21 07:52:15.844737 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:15.844668 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert podName:1bcd6517-d770-467f-8536-8c7f4cdc772e nodeName:}" failed. No retries permitted until 2026-04-21 07:52:17.844659985 +0000 UTC m=+37.683047312 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert") pod "ingress-canary-pfqbt" (UID: "1bcd6517-d770-467f-8536-8c7f4cdc772e") : secret "canary-serving-cert" not found
Apr 21 07:52:15.916255 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:15.916219 2570 generic.go:358] "Generic (PLEG): container finished" podID="f1b4cd3f-6d6c-4602-aff3-3e7007b411bc" containerID="b5fe6419c2a26c2c9e7024d5fbc9f1d552c88e1fac5686c8cc59e7a27aec1352" exitCode=0
Apr 21 07:52:15.916255 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:15.916260 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6p77p" event={"ID":"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc","Type":"ContainerDied","Data":"b5fe6419c2a26c2c9e7024d5fbc9f1d552c88e1fac5686c8cc59e7a27aec1352"}
Apr 21 07:52:16.921003 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:16.920967 2570 generic.go:358] "Generic (PLEG): container finished" podID="f1b4cd3f-6d6c-4602-aff3-3e7007b411bc" containerID="68729e1d86b8a1704d63706d681151e0e887841e94e67aa73dce2ddcc42b5492" exitCode=0
Apr 21 07:52:16.921358 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:16.921012 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6p77p" event={"ID":"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc","Type":"ContainerDied","Data":"68729e1d86b8a1704d63706d681151e0e887841e94e67aa73dce2ddcc42b5492"}
Apr 21 07:52:17.857403 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:17.857363 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g"
Apr 21 07:52:17.857584 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:17.857430 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert\") pod \"ingress-canary-pfqbt\" (UID: \"1bcd6517-d770-467f-8536-8c7f4cdc772e\") " pod="openshift-ingress-canary/ingress-canary-pfqbt"
Apr 21 07:52:17.857584 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:17.857521 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 21 07:52:17.857673 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:17.857596 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert podName:1bcd6517-d770-467f-8536-8c7f4cdc772e nodeName:}" failed. No retries permitted until 2026-04-21 07:52:21.857581086 +0000 UTC m=+41.695968420 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert") pod "ingress-canary-pfqbt" (UID: "1bcd6517-d770-467f-8536-8c7f4cdc772e") : secret "canary-serving-cert" not found
Apr 21 07:52:17.857673 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:17.857521 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 21 07:52:17.857673 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:17.857666 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls podName:6329b105-bf72-40c1-ab25-2ba6f2aea17c nodeName:}" failed. No retries permitted until 2026-04-21 07:52:21.857652477 +0000 UTC m=+41.696039804 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls") pod "dns-default-7rk6g" (UID: "6329b105-bf72-40c1-ab25-2ba6f2aea17c") : secret "dns-default-metrics-tls" not found
Apr 21 07:52:17.926157 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:17.926127 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6p77p" event={"ID":"f1b4cd3f-6d6c-4602-aff3-3e7007b411bc","Type":"ContainerStarted","Data":"e10027d39af225c44279bc5d13e65844188c075d21bc17304e0a2eec41ec82f0"}
Apr 21 07:52:17.947228 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:17.947181 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6p77p" podStartSLOduration=6.927221905 podStartE2EDuration="37.947169024s" podCreationTimestamp="2026-04-21 07:51:40 +0000 UTC" firstStartedPulling="2026-04-21 07:51:43.678528794 +0000 UTC m=+3.516916122" lastFinishedPulling="2026-04-21 07:52:14.698475897 +0000 UTC m=+34.536863241" observedRunningTime="2026-04-21 07:52:17.946026724 +0000 UTC m=+37.784414073" watchObservedRunningTime="2026-04-21 07:52:17.947169024 +0000 UTC m=+37.785556372"
Apr 21 07:52:21.884759 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:21.884700 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g"
Apr 21 07:52:21.885162 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:21.884779 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert\") pod \"ingress-canary-pfqbt\" (UID: \"1bcd6517-d770-467f-8536-8c7f4cdc772e\") " 
pod="openshift-ingress-canary/ingress-canary-pfqbt" Apr 21 07:52:21.885162 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:21.884891 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:52:21.885162 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:21.884893 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:52:21.885162 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:21.884958 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert podName:1bcd6517-d770-467f-8536-8c7f4cdc772e nodeName:}" failed. No retries permitted until 2026-04-21 07:52:29.884941514 +0000 UTC m=+49.723328841 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert") pod "ingress-canary-pfqbt" (UID: "1bcd6517-d770-467f-8536-8c7f4cdc772e") : secret "canary-serving-cert" not found Apr 21 07:52:21.885162 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:21.884978 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls podName:6329b105-bf72-40c1-ab25-2ba6f2aea17c nodeName:}" failed. No retries permitted until 2026-04-21 07:52:29.884963325 +0000 UTC m=+49.723350653 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls") pod "dns-default-7rk6g" (UID: "6329b105-bf72-40c1-ab25-2ba6f2aea17c") : secret "dns-default-metrics-tls" not found Apr 21 07:52:26.316354 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:26.316318 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret\") pod \"global-pull-secret-syncer-pnsks\" (UID: \"a242e9c5-b6e9-4008-8745-35d3c8424baf\") " pod="kube-system/global-pull-secret-syncer-pnsks" Apr 21 07:52:26.319369 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:26.319343 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a242e9c5-b6e9-4008-8745-35d3c8424baf-original-pull-secret\") pod \"global-pull-secret-syncer-pnsks\" (UID: \"a242e9c5-b6e9-4008-8745-35d3c8424baf\") " pod="kube-system/global-pull-secret-syncer-pnsks" Apr 21 07:52:26.584069 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:26.583991 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-pnsks" Apr 21 07:52:26.749170 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:26.749134 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pnsks"] Apr 21 07:52:26.752817 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:52:26.752788 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda242e9c5_b6e9_4008_8745_35d3c8424baf.slice/crio-6caf1dca161f08f47b4bf4037b511e396022bc2b3dda98e47d07dcc4bb6182ef WatchSource:0}: Error finding container 6caf1dca161f08f47b4bf4037b511e396022bc2b3dda98e47d07dcc4bb6182ef: Status 404 returned error can't find the container with id 6caf1dca161f08f47b4bf4037b511e396022bc2b3dda98e47d07dcc4bb6182ef Apr 21 07:52:26.944204 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:26.944164 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pnsks" event={"ID":"a242e9c5-b6e9-4008-8745-35d3c8424baf","Type":"ContainerStarted","Data":"6caf1dca161f08f47b4bf4037b511e396022bc2b3dda98e47d07dcc4bb6182ef"} Apr 21 07:52:29.945368 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:29.945326 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g" Apr 21 07:52:29.946066 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:29.945409 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert\") pod \"ingress-canary-pfqbt\" (UID: \"1bcd6517-d770-467f-8536-8c7f4cdc772e\") " pod="openshift-ingress-canary/ingress-canary-pfqbt" Apr 21 07:52:29.946066 ip-10-0-137-194 kubenswrapper[2570]: E0421 
07:52:29.945495 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:52:29.946066 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:29.945531 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:52:29.946066 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:29.945574 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls podName:6329b105-bf72-40c1-ab25-2ba6f2aea17c nodeName:}" failed. No retries permitted until 2026-04-21 07:52:45.945554451 +0000 UTC m=+65.783941785 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls") pod "dns-default-7rk6g" (UID: "6329b105-bf72-40c1-ab25-2ba6f2aea17c") : secret "dns-default-metrics-tls" not found Apr 21 07:52:29.946066 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:29.945599 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert podName:1bcd6517-d770-467f-8536-8c7f4cdc772e nodeName:}" failed. No retries permitted until 2026-04-21 07:52:45.945584343 +0000 UTC m=+65.783971669 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert") pod "ingress-canary-pfqbt" (UID: "1bcd6517-d770-467f-8536-8c7f4cdc772e") : secret "canary-serving-cert" not found Apr 21 07:52:31.954924 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:31.954888 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pnsks" event={"ID":"a242e9c5-b6e9-4008-8745-35d3c8424baf","Type":"ContainerStarted","Data":"56552325690d77bbf690a034158c04bb3097c73f343978eb917f81f0272bc9f8"} Apr 21 07:52:31.968877 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:31.968824 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-pnsks" podStartSLOduration=33.520709092 podStartE2EDuration="37.968810317s" podCreationTimestamp="2026-04-21 07:51:54 +0000 UTC" firstStartedPulling="2026-04-21 07:52:26.754516168 +0000 UTC m=+46.592903495" lastFinishedPulling="2026-04-21 07:52:31.202617393 +0000 UTC m=+51.041004720" observedRunningTime="2026-04-21 07:52:31.968032671 +0000 UTC m=+51.806420013" watchObservedRunningTime="2026-04-21 07:52:31.968810317 +0000 UTC m=+51.807197665" Apr 21 07:52:38.908252 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:38.908225 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kshzg" Apr 21 07:52:45.959191 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:45.959153 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g" Apr 21 07:52:45.959658 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:45.959213 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert\") pod \"ingress-canary-pfqbt\" (UID: \"1bcd6517-d770-467f-8536-8c7f4cdc772e\") " pod="openshift-ingress-canary/ingress-canary-pfqbt" Apr 21 07:52:45.959658 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:45.959297 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:52:45.959658 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:45.959301 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:52:45.959658 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:45.959350 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert podName:1bcd6517-d770-467f-8536-8c7f4cdc772e nodeName:}" failed. No retries permitted until 2026-04-21 07:53:17.959335326 +0000 UTC m=+97.797722652 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert") pod "ingress-canary-pfqbt" (UID: "1bcd6517-d770-467f-8536-8c7f4cdc772e") : secret "canary-serving-cert" not found Apr 21 07:52:45.959658 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:45.959363 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls podName:6329b105-bf72-40c1-ab25-2ba6f2aea17c nodeName:}" failed. No retries permitted until 2026-04-21 07:53:17.959355832 +0000 UTC m=+97.797743159 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls") pod "dns-default-7rk6g" (UID: "6329b105-bf72-40c1-ab25-2ba6f2aea17c") : secret "dns-default-metrics-tls" not found Apr 21 07:52:46.462323 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:46.462282 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs\") pod \"network-metrics-daemon-bbxng\" (UID: \"62e778a8-8270-4560-9d0d-41a95a3c9c5f\") " pod="openshift-multus/network-metrics-daemon-bbxng" Apr 21 07:52:46.464720 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:46.464702 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 07:52:46.472628 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:46.472609 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 07:52:46.472696 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:52:46.472669 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs podName:62e778a8-8270-4560-9d0d-41a95a3c9c5f nodeName:}" failed. No retries permitted until 2026-04-21 07:53:50.472654519 +0000 UTC m=+130.311041846 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs") pod "network-metrics-daemon-bbxng" (UID: "62e778a8-8270-4560-9d0d-41a95a3c9c5f") : secret "metrics-daemon-secret" not found Apr 21 07:52:46.663801 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:46.663765 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prkcp\" (UniqueName: \"kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp\") pod \"network-check-target-htrsr\" (UID: \"9cfb4a7f-bf8c-41ae-9276-00c43802c31a\") " pod="openshift-network-diagnostics/network-check-target-htrsr" Apr 21 07:52:46.666036 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:46.666019 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 07:52:46.676727 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:46.676705 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 07:52:46.687944 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:46.687920 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prkcp\" (UniqueName: \"kubernetes.io/projected/9cfb4a7f-bf8c-41ae-9276-00c43802c31a-kube-api-access-prkcp\") pod \"network-check-target-htrsr\" (UID: \"9cfb4a7f-bf8c-41ae-9276-00c43802c31a\") " pod="openshift-network-diagnostics/network-check-target-htrsr" Apr 21 07:52:46.982140 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:46.982115 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-f27q5\"" Apr 21 07:52:46.989931 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:46.989912 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-htrsr" Apr 21 07:52:47.120163 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:47.120133 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-htrsr"] Apr 21 07:52:47.124035 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:52:47.123997 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cfb4a7f_bf8c_41ae_9276_00c43802c31a.slice/crio-8f3e2e2544a20c38b7c8197052538eecfb398afd3afba2acacdb8f4c6c5112f4 WatchSource:0}: Error finding container 8f3e2e2544a20c38b7c8197052538eecfb398afd3afba2acacdb8f4c6c5112f4: Status 404 returned error can't find the container with id 8f3e2e2544a20c38b7c8197052538eecfb398afd3afba2acacdb8f4c6c5112f4 Apr 21 07:52:47.982752 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:47.982709 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-htrsr" event={"ID":"9cfb4a7f-bf8c-41ae-9276-00c43802c31a","Type":"ContainerStarted","Data":"8f3e2e2544a20c38b7c8197052538eecfb398afd3afba2acacdb8f4c6c5112f4"} Apr 21 07:52:49.987476 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:49.987384 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-htrsr" event={"ID":"9cfb4a7f-bf8c-41ae-9276-00c43802c31a","Type":"ContainerStarted","Data":"705cbb3a08049b2d6d6acb0ef0813692ba8a3f6c8e85873dc9bf40138735a711"} Apr 21 07:52:49.987805 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:49.987491 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-htrsr" Apr 21 07:52:50.002590 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:52:50.002540 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-htrsr" 
podStartSLOduration=67.445519994 podStartE2EDuration="1m10.00252565s" podCreationTimestamp="2026-04-21 07:51:40 +0000 UTC" firstStartedPulling="2026-04-21 07:52:47.125448141 +0000 UTC m=+66.963835468" lastFinishedPulling="2026-04-21 07:52:49.682453791 +0000 UTC m=+69.520841124" observedRunningTime="2026-04-21 07:52:50.001558794 +0000 UTC m=+69.839946143" watchObservedRunningTime="2026-04-21 07:52:50.00252565 +0000 UTC m=+69.840912999" Apr 21 07:53:17.969116 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:17.968997 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g" Apr 21 07:53:17.969116 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:17.969045 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert\") pod \"ingress-canary-pfqbt\" (UID: \"1bcd6517-d770-467f-8536-8c7f4cdc772e\") " pod="openshift-ingress-canary/ingress-canary-pfqbt" Apr 21 07:53:17.969621 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:17.969136 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 07:53:17.969621 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:17.969143 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 07:53:17.969621 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:17.969189 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert podName:1bcd6517-d770-467f-8536-8c7f4cdc772e nodeName:}" failed. 
No retries permitted until 2026-04-21 07:54:21.969175836 +0000 UTC m=+161.807563164 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert") pod "ingress-canary-pfqbt" (UID: "1bcd6517-d770-467f-8536-8c7f4cdc772e") : secret "canary-serving-cert" not found Apr 21 07:53:17.969621 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:17.969215 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls podName:6329b105-bf72-40c1-ab25-2ba6f2aea17c nodeName:}" failed. No retries permitted until 2026-04-21 07:54:21.969198169 +0000 UTC m=+161.807585505 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls") pod "dns-default-7rk6g" (UID: "6329b105-bf72-40c1-ab25-2ba6f2aea17c") : secret "dns-default-metrics-tls" not found Apr 21 07:53:20.992433 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:20.992403 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-htrsr" Apr 21 07:53:22.687584 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.687550 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-d5f96b85b-ghcsf"] Apr 21 07:53:22.690566 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.690550 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:22.694049 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.694029 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 21 07:53:22.694632 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.694613 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 21 07:53:22.694698 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.694617 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 21 07:53:22.694932 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.694912 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 21 07:53:22.695059 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.694948 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-9tzkg\"" Apr 21 07:53:22.695059 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.694956 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 21 07:53:22.695059 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.694948 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 21 07:53:22.706855 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.706836 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-d5f96b85b-ghcsf"] Apr 21 07:53:22.803763 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.803730 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-default-certificate\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:22.803763 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.803759 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5x5m\" (UniqueName: \"kubernetes.io/projected/9f41707d-3f57-4540-b483-7dd01a7d4ef0-kube-api-access-p5x5m\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:22.803991 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.803783 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-stats-auth\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:22.803991 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.803804 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:22.803991 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.803848 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 
07:53:22.905080 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.905048 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-stats-auth\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:22.905166 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.905088 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:22.905283 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:22.905272 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle podName:9f41707d-3f57-4540-b483-7dd01a7d4ef0 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:23.405256189 +0000 UTC m=+103.243643516 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle") pod "router-default-d5f96b85b-ghcsf" (UID: "9f41707d-3f57-4540-b483-7dd01a7d4ef0") : configmap references non-existent config key: service-ca.crt Apr 21 07:53:22.905358 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.905342 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:22.905393 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.905385 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-default-certificate\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:22.905439 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.905402 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5x5m\" (UniqueName: \"kubernetes.io/projected/9f41707d-3f57-4540-b483-7dd01a7d4ef0-kube-api-access-p5x5m\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:22.905510 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:22.905482 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 07:53:22.905590 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:22.905577 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs podName:9f41707d-3f57-4540-b483-7dd01a7d4ef0 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:23.405551781 +0000 UTC m=+103.243939112 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs") pod "router-default-d5f96b85b-ghcsf" (UID: "9f41707d-3f57-4540-b483-7dd01a7d4ef0") : secret "router-metrics-certs-default" not found Apr 21 07:53:22.907615 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.907592 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-default-certificate\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:22.907704 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.907636 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-stats-auth\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:22.913877 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:22.913839 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5x5m\" (UniqueName: \"kubernetes.io/projected/9f41707d-3f57-4540-b483-7dd01a7d4ef0-kube-api-access-p5x5m\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:23.409474 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:23.409421 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:23.409747 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:23.409493 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:23.409747 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:23.409609 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle podName:9f41707d-3f57-4540-b483-7dd01a7d4ef0 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:24.409590009 +0000 UTC m=+104.247977340 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle") pod "router-default-d5f96b85b-ghcsf" (UID: "9f41707d-3f57-4540-b483-7dd01a7d4ef0") : configmap references non-existent config key: service-ca.crt Apr 21 07:53:23.409747 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:23.409616 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 07:53:23.409747 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:23.409651 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs podName:9f41707d-3f57-4540-b483-7dd01a7d4ef0 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:24.409643969 +0000 UTC m=+104.248031296 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs") pod "router-default-d5f96b85b-ghcsf" (UID: "9f41707d-3f57-4540-b483-7dd01a7d4ef0") : secret "router-metrics-certs-default" not found Apr 21 07:53:24.415900 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:24.415832 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:24.416385 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:24.416023 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:24.416385 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:24.416101 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle podName:9f41707d-3f57-4540-b483-7dd01a7d4ef0 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:26.416086287 +0000 UTC m=+106.254473614 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle") pod "router-default-d5f96b85b-ghcsf" (UID: "9f41707d-3f57-4540-b483-7dd01a7d4ef0") : configmap references non-existent config key: service-ca.crt Apr 21 07:53:24.416385 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:24.416134 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 07:53:24.416385 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:24.416177 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs podName:9f41707d-3f57-4540-b483-7dd01a7d4ef0 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:26.416165726 +0000 UTC m=+106.254553053 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs") pod "router-default-d5f96b85b-ghcsf" (UID: "9f41707d-3f57-4540-b483-7dd01a7d4ef0") : secret "router-metrics-certs-default" not found Apr 21 07:53:26.429876 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:26.429830 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:26.430281 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:26.429911 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " 
pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:26.430281 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:26.430025 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle podName:9f41707d-3f57-4540-b483-7dd01a7d4ef0 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:30.42999805 +0000 UTC m=+110.268385380 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle") pod "router-default-d5f96b85b-ghcsf" (UID: "9f41707d-3f57-4540-b483-7dd01a7d4ef0") : configmap references non-existent config key: service-ca.crt Apr 21 07:53:26.430281 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:26.430029 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 07:53:26.430281 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:26.430076 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs podName:9f41707d-3f57-4540-b483-7dd01a7d4ef0 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:30.430066631 +0000 UTC m=+110.268453959 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs") pod "router-default-d5f96b85b-ghcsf" (UID: "9f41707d-3f57-4540-b483-7dd01a7d4ef0") : secret "router-metrics-certs-default" not found Apr 21 07:53:27.654117 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:27.654083 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr"] Apr 21 07:53:27.657046 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:27.657027 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr" Apr 21 07:53:27.659475 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:27.659449 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 21 07:53:27.659581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:27.659474 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 21 07:53:27.660403 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:27.660385 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-k5gkc\"" Apr 21 07:53:27.660492 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:27.660438 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 21 07:53:27.666557 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:27.666538 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr"] Apr 21 07:53:27.738495 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:27.738463 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3ee1e09-3f66-4942-b704-81077b9efa31-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xnjzr\" (UID: \"e3ee1e09-3f66-4942-b704-81077b9efa31\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr" Apr 21 07:53:27.738495 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:27.738497 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgv69\" (UniqueName: 
\"kubernetes.io/projected/e3ee1e09-3f66-4942-b704-81077b9efa31-kube-api-access-rgv69\") pod \"cluster-samples-operator-6dc5bdb6b4-xnjzr\" (UID: \"e3ee1e09-3f66-4942-b704-81077b9efa31\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr" Apr 21 07:53:27.838831 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:27.838785 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3ee1e09-3f66-4942-b704-81077b9efa31-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xnjzr\" (UID: \"e3ee1e09-3f66-4942-b704-81077b9efa31\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr" Apr 21 07:53:27.838831 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:27.838832 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgv69\" (UniqueName: \"kubernetes.io/projected/e3ee1e09-3f66-4942-b704-81077b9efa31-kube-api-access-rgv69\") pod \"cluster-samples-operator-6dc5bdb6b4-xnjzr\" (UID: \"e3ee1e09-3f66-4942-b704-81077b9efa31\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr" Apr 21 07:53:27.839030 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:27.838949 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 07:53:27.839030 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:27.839013 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3ee1e09-3f66-4942-b704-81077b9efa31-samples-operator-tls podName:e3ee1e09-3f66-4942-b704-81077b9efa31 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:28.338996208 +0000 UTC m=+108.177383535 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e3ee1e09-3f66-4942-b704-81077b9efa31-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xnjzr" (UID: "e3ee1e09-3f66-4942-b704-81077b9efa31") : secret "samples-operator-tls" not found Apr 21 07:53:27.847140 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:27.847104 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgv69\" (UniqueName: \"kubernetes.io/projected/e3ee1e09-3f66-4942-b704-81077b9efa31-kube-api-access-rgv69\") pod \"cluster-samples-operator-6dc5bdb6b4-xnjzr\" (UID: \"e3ee1e09-3f66-4942-b704-81077b9efa31\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr" Apr 21 07:53:28.233287 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:28.233264 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hc6tl_28381821-cb18-40b6-a25f-b3a80e24f27a/dns-node-resolver/0.log" Apr 21 07:53:28.343170 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:28.343134 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3ee1e09-3f66-4942-b704-81077b9efa31-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xnjzr\" (UID: \"e3ee1e09-3f66-4942-b704-81077b9efa31\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr" Apr 21 07:53:28.343314 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:28.343250 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 07:53:28.343314 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:28.343313 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3ee1e09-3f66-4942-b704-81077b9efa31-samples-operator-tls podName:e3ee1e09-3f66-4942-b704-81077b9efa31 nodeName:}" failed. 
No retries permitted until 2026-04-21 07:53:29.343298782 +0000 UTC m=+109.181686108 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e3ee1e09-3f66-4942-b704-81077b9efa31-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xnjzr" (UID: "e3ee1e09-3f66-4942-b704-81077b9efa31") : secret "samples-operator-tls" not found Apr 21 07:53:29.232499 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.232470 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-v596t_a7dd7ba4-8933-440b-9c8a-6710f4843c4d/node-ca/0.log" Apr 21 07:53:29.350213 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.350182 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3ee1e09-3f66-4942-b704-81077b9efa31-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xnjzr\" (UID: \"e3ee1e09-3f66-4942-b704-81077b9efa31\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr" Apr 21 07:53:29.350323 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:29.350286 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 07:53:29.350363 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:29.350339 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3ee1e09-3f66-4942-b704-81077b9efa31-samples-operator-tls podName:e3ee1e09-3f66-4942-b704-81077b9efa31 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:31.350325434 +0000 UTC m=+111.188712761 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e3ee1e09-3f66-4942-b704-81077b9efa31-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xnjzr" (UID: "e3ee1e09-3f66-4942-b704-81077b9efa31") : secret "samples-operator-tls" not found Apr 21 07:53:29.656619 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.656585 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-kgsrp"] Apr 21 07:53:29.659359 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.659344 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 07:53:29.662520 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.662498 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-qw9bs\"" Apr 21 07:53:29.662644 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.662542 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 21 07:53:29.663172 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.663156 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 21 07:53:29.663227 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.663158 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 21 07:53:29.663405 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.663391 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 21 07:53:29.667184 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.667161 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console-operator/console-operator-9d4b6777b-kgsrp"] Apr 21 07:53:29.667571 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.667551 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 21 07:53:29.753262 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.753221 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be0cbccd-e9f7-4b37-b3e3-e6ef1514b734-config\") pod \"console-operator-9d4b6777b-kgsrp\" (UID: \"be0cbccd-e9f7-4b37-b3e3-e6ef1514b734\") " pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 07:53:29.753262 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.753263 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be0cbccd-e9f7-4b37-b3e3-e6ef1514b734-serving-cert\") pod \"console-operator-9d4b6777b-kgsrp\" (UID: \"be0cbccd-e9f7-4b37-b3e3-e6ef1514b734\") " pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 07:53:29.753467 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.753282 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be0cbccd-e9f7-4b37-b3e3-e6ef1514b734-trusted-ca\") pod \"console-operator-9d4b6777b-kgsrp\" (UID: \"be0cbccd-e9f7-4b37-b3e3-e6ef1514b734\") " pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 07:53:29.753467 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.753331 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jmc2\" (UniqueName: \"kubernetes.io/projected/be0cbccd-e9f7-4b37-b3e3-e6ef1514b734-kube-api-access-6jmc2\") pod \"console-operator-9d4b6777b-kgsrp\" (UID: 
\"be0cbccd-e9f7-4b37-b3e3-e6ef1514b734\") " pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 07:53:29.854284 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.854254 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be0cbccd-e9f7-4b37-b3e3-e6ef1514b734-config\") pod \"console-operator-9d4b6777b-kgsrp\" (UID: \"be0cbccd-e9f7-4b37-b3e3-e6ef1514b734\") " pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 07:53:29.854351 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.854292 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be0cbccd-e9f7-4b37-b3e3-e6ef1514b734-serving-cert\") pod \"console-operator-9d4b6777b-kgsrp\" (UID: \"be0cbccd-e9f7-4b37-b3e3-e6ef1514b734\") " pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 07:53:29.854351 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.854311 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be0cbccd-e9f7-4b37-b3e3-e6ef1514b734-trusted-ca\") pod \"console-operator-9d4b6777b-kgsrp\" (UID: \"be0cbccd-e9f7-4b37-b3e3-e6ef1514b734\") " pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 07:53:29.854458 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.854437 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jmc2\" (UniqueName: \"kubernetes.io/projected/be0cbccd-e9f7-4b37-b3e3-e6ef1514b734-kube-api-access-6jmc2\") pod \"console-operator-9d4b6777b-kgsrp\" (UID: \"be0cbccd-e9f7-4b37-b3e3-e6ef1514b734\") " pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 07:53:29.855016 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.854995 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/be0cbccd-e9f7-4b37-b3e3-e6ef1514b734-config\") pod \"console-operator-9d4b6777b-kgsrp\" (UID: \"be0cbccd-e9f7-4b37-b3e3-e6ef1514b734\") " pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 07:53:29.855118 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.855021 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/be0cbccd-e9f7-4b37-b3e3-e6ef1514b734-trusted-ca\") pod \"console-operator-9d4b6777b-kgsrp\" (UID: \"be0cbccd-e9f7-4b37-b3e3-e6ef1514b734\") " pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 07:53:29.856584 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.856565 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/be0cbccd-e9f7-4b37-b3e3-e6ef1514b734-serving-cert\") pod \"console-operator-9d4b6777b-kgsrp\" (UID: \"be0cbccd-e9f7-4b37-b3e3-e6ef1514b734\") " pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 07:53:29.861704 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.861683 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jmc2\" (UniqueName: \"kubernetes.io/projected/be0cbccd-e9f7-4b37-b3e3-e6ef1514b734-kube-api-access-6jmc2\") pod \"console-operator-9d4b6777b-kgsrp\" (UID: \"be0cbccd-e9f7-4b37-b3e3-e6ef1514b734\") " pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 07:53:29.969515 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:29.969444 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 07:53:30.081792 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:30.081644 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-kgsrp"] Apr 21 07:53:30.084191 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:53:30.084162 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe0cbccd_e9f7_4b37_b3e3_e6ef1514b734.slice/crio-8ecbb332a3cd559ca6d0ffeb1fe17830144da7a50fb1b8d93314b326e92343cb WatchSource:0}: Error finding container 8ecbb332a3cd559ca6d0ffeb1fe17830144da7a50fb1b8d93314b326e92343cb: Status 404 returned error can't find the container with id 8ecbb332a3cd559ca6d0ffeb1fe17830144da7a50fb1b8d93314b326e92343cb Apr 21 07:53:30.458521 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:30.458486 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:30.458942 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:30.458539 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:30.458942 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:30.458628 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 07:53:30.458942 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:30.458678 2570 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs podName:9f41707d-3f57-4540-b483-7dd01a7d4ef0 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:38.458665342 +0000 UTC m=+118.297052669 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs") pod "router-default-d5f96b85b-ghcsf" (UID: "9f41707d-3f57-4540-b483-7dd01a7d4ef0") : secret "router-metrics-certs-default" not found Apr 21 07:53:30.458942 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:30.458692 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle podName:9f41707d-3f57-4540-b483-7dd01a7d4ef0 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:38.458685845 +0000 UTC m=+118.297073172 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle") pod "router-default-d5f96b85b-ghcsf" (UID: "9f41707d-3f57-4540-b483-7dd01a7d4ef0") : configmap references non-existent config key: service-ca.crt Apr 21 07:53:31.068142 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:31.068103 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" event={"ID":"be0cbccd-e9f7-4b37-b3e3-e6ef1514b734","Type":"ContainerStarted","Data":"8ecbb332a3cd559ca6d0ffeb1fe17830144da7a50fb1b8d93314b326e92343cb"} Apr 21 07:53:31.365458 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:31.365426 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3ee1e09-3f66-4942-b704-81077b9efa31-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xnjzr\" (UID: 
\"e3ee1e09-3f66-4942-b704-81077b9efa31\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr" Apr 21 07:53:31.365648 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:31.365588 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 07:53:31.365692 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:31.365653 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3ee1e09-3f66-4942-b704-81077b9efa31-samples-operator-tls podName:e3ee1e09-3f66-4942-b704-81077b9efa31 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:35.365638376 +0000 UTC m=+115.204025703 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e3ee1e09-3f66-4942-b704-81077b9efa31-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xnjzr" (UID: "e3ee1e09-3f66-4942-b704-81077b9efa31") : secret "samples-operator-tls" not found Apr 21 07:53:32.650578 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:32.650546 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mdt7b"] Apr 21 07:53:32.653302 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:32.653286 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mdt7b" Apr 21 07:53:32.655595 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:32.655569 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 21 07:53:32.655595 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:32.655590 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 21 07:53:32.655777 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:32.655596 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 21 07:53:32.655777 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:32.655642 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 21 07:53:32.656551 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:32.656531 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-q9w2r\"" Apr 21 07:53:32.660544 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:32.660525 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mdt7b"] Apr 21 07:53:32.777635 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:32.777601 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcqcq\" (UniqueName: \"kubernetes.io/projected/0e1f90ef-6932-43c1-bb6a-d25451d794e8-kube-api-access-lcqcq\") pod \"kube-storage-version-migrator-operator-6769c5d45-mdt7b\" (UID: 
\"0e1f90ef-6932-43c1-bb6a-d25451d794e8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mdt7b" Apr 21 07:53:32.777794 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:32.777643 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e1f90ef-6932-43c1-bb6a-d25451d794e8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-mdt7b\" (UID: \"0e1f90ef-6932-43c1-bb6a-d25451d794e8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mdt7b" Apr 21 07:53:32.777794 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:32.777662 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e1f90ef-6932-43c1-bb6a-d25451d794e8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-mdt7b\" (UID: \"0e1f90ef-6932-43c1-bb6a-d25451d794e8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mdt7b" Apr 21 07:53:32.878295 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:32.878259 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e1f90ef-6932-43c1-bb6a-d25451d794e8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-mdt7b\" (UID: \"0e1f90ef-6932-43c1-bb6a-d25451d794e8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mdt7b" Apr 21 07:53:32.878295 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:32.878299 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e1f90ef-6932-43c1-bb6a-d25451d794e8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-mdt7b\" (UID: 
\"0e1f90ef-6932-43c1-bb6a-d25451d794e8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mdt7b" Apr 21 07:53:32.878566 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:32.878547 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcqcq\" (UniqueName: \"kubernetes.io/projected/0e1f90ef-6932-43c1-bb6a-d25451d794e8-kube-api-access-lcqcq\") pod \"kube-storage-version-migrator-operator-6769c5d45-mdt7b\" (UID: \"0e1f90ef-6932-43c1-bb6a-d25451d794e8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mdt7b" Apr 21 07:53:32.878916 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:32.878898 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e1f90ef-6932-43c1-bb6a-d25451d794e8-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-mdt7b\" (UID: \"0e1f90ef-6932-43c1-bb6a-d25451d794e8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mdt7b" Apr 21 07:53:32.880568 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:32.880548 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e1f90ef-6932-43c1-bb6a-d25451d794e8-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-mdt7b\" (UID: \"0e1f90ef-6932-43c1-bb6a-d25451d794e8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mdt7b" Apr 21 07:53:32.887039 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:32.887017 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcqcq\" (UniqueName: \"kubernetes.io/projected/0e1f90ef-6932-43c1-bb6a-d25451d794e8-kube-api-access-lcqcq\") pod \"kube-storage-version-migrator-operator-6769c5d45-mdt7b\" (UID: 
\"0e1f90ef-6932-43c1-bb6a-d25451d794e8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mdt7b" Apr 21 07:53:32.961852 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:32.961772 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mdt7b" Apr 21 07:53:33.067910 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:33.067881 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mdt7b"] Apr 21 07:53:33.070678 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:53:33.070643 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e1f90ef_6932_43c1_bb6a_d25451d794e8.slice/crio-9d913d892e26bb3fa409b6bf68ac80b01716fa7edccb62a0b25cfb88099242e7 WatchSource:0}: Error finding container 9d913d892e26bb3fa409b6bf68ac80b01716fa7edccb62a0b25cfb88099242e7: Status 404 returned error can't find the container with id 9d913d892e26bb3fa409b6bf68ac80b01716fa7edccb62a0b25cfb88099242e7 Apr 21 07:53:33.073181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:33.073165 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kgsrp_be0cbccd-e9f7-4b37-b3e3-e6ef1514b734/console-operator/0.log" Apr 21 07:53:33.073244 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:33.073200 2570 generic.go:358] "Generic (PLEG): container finished" podID="be0cbccd-e9f7-4b37-b3e3-e6ef1514b734" containerID="8cf606fb67d64e69d681587dc12fc9b09bbfddb184b0cda0979a8fe641ff348e" exitCode=255 Apr 21 07:53:33.073244 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:33.073225 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" 
event={"ID":"be0cbccd-e9f7-4b37-b3e3-e6ef1514b734","Type":"ContainerDied","Data":"8cf606fb67d64e69d681587dc12fc9b09bbfddb184b0cda0979a8fe641ff348e"} Apr 21 07:53:33.073545 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:33.073528 2570 scope.go:117] "RemoveContainer" containerID="8cf606fb67d64e69d681587dc12fc9b09bbfddb184b0cda0979a8fe641ff348e" Apr 21 07:53:34.076747 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:34.076711 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mdt7b" event={"ID":"0e1f90ef-6932-43c1-bb6a-d25451d794e8","Type":"ContainerStarted","Data":"9d913d892e26bb3fa409b6bf68ac80b01716fa7edccb62a0b25cfb88099242e7"} Apr 21 07:53:34.078092 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:34.078072 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kgsrp_be0cbccd-e9f7-4b37-b3e3-e6ef1514b734/console-operator/1.log" Apr 21 07:53:34.078500 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:34.078477 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kgsrp_be0cbccd-e9f7-4b37-b3e3-e6ef1514b734/console-operator/0.log" Apr 21 07:53:34.078587 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:34.078521 2570 generic.go:358] "Generic (PLEG): container finished" podID="be0cbccd-e9f7-4b37-b3e3-e6ef1514b734" containerID="7c12718b852187b33e5489876ca937ce83ae54be98bb4713364dc363dd4fef03" exitCode=255 Apr 21 07:53:34.078587 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:34.078560 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" event={"ID":"be0cbccd-e9f7-4b37-b3e3-e6ef1514b734","Type":"ContainerDied","Data":"7c12718b852187b33e5489876ca937ce83ae54be98bb4713364dc363dd4fef03"} Apr 21 07:53:34.078587 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:34.078585 2570 
scope.go:117] "RemoveContainer" containerID="8cf606fb67d64e69d681587dc12fc9b09bbfddb184b0cda0979a8fe641ff348e" Apr 21 07:53:34.078876 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:34.078850 2570 scope.go:117] "RemoveContainer" containerID="7c12718b852187b33e5489876ca937ce83ae54be98bb4713364dc363dd4fef03" Apr 21 07:53:34.079147 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:34.079120 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-kgsrp_openshift-console-operator(be0cbccd-e9f7-4b37-b3e3-e6ef1514b734)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" podUID="be0cbccd-e9f7-4b37-b3e3-e6ef1514b734" Apr 21 07:53:34.094578 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:34.094548 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-dkmsp"] Apr 21 07:53:34.097731 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:34.097714 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dkmsp" Apr 21 07:53:34.099949 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:34.099931 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-gpkdc\"" Apr 21 07:53:34.106001 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:34.105977 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-dkmsp"] Apr 21 07:53:34.191217 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:34.191183 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-445c4\" (UniqueName: \"kubernetes.io/projected/29603244-ac93-448d-b27a-816d739bf681-kube-api-access-445c4\") pod \"network-check-source-8894fc9bd-dkmsp\" (UID: \"29603244-ac93-448d-b27a-816d739bf681\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dkmsp" Apr 21 07:53:34.291929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:34.291895 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-445c4\" (UniqueName: \"kubernetes.io/projected/29603244-ac93-448d-b27a-816d739bf681-kube-api-access-445c4\") pod \"network-check-source-8894fc9bd-dkmsp\" (UID: \"29603244-ac93-448d-b27a-816d739bf681\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dkmsp" Apr 21 07:53:34.299806 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:34.299784 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-445c4\" (UniqueName: \"kubernetes.io/projected/29603244-ac93-448d-b27a-816d739bf681-kube-api-access-445c4\") pod \"network-check-source-8894fc9bd-dkmsp\" (UID: \"29603244-ac93-448d-b27a-816d739bf681\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dkmsp" Apr 21 07:53:34.408642 ip-10-0-137-194 kubenswrapper[2570]: I0421 
07:53:34.408604 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dkmsp" Apr 21 07:53:34.518666 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:34.518638 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-dkmsp"] Apr 21 07:53:34.522047 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:53:34.522018 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29603244_ac93_448d_b27a_816d739bf681.slice/crio-172b2c0d3156df5cc4c22f4e1f5da9fc92060bfe6c2d262ead6e7ac4bd599a54 WatchSource:0}: Error finding container 172b2c0d3156df5cc4c22f4e1f5da9fc92060bfe6c2d262ead6e7ac4bd599a54: Status 404 returned error can't find the container with id 172b2c0d3156df5cc4c22f4e1f5da9fc92060bfe6c2d262ead6e7ac4bd599a54 Apr 21 07:53:35.082275 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:35.082229 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dkmsp" event={"ID":"29603244-ac93-448d-b27a-816d739bf681","Type":"ContainerStarted","Data":"90bdf0cbed7dd4719467527e6717870332a64983f8e02cb1d3e2c0df56845cd8"} Apr 21 07:53:35.082275 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:35.082275 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dkmsp" event={"ID":"29603244-ac93-448d-b27a-816d739bf681","Type":"ContainerStarted","Data":"172b2c0d3156df5cc4c22f4e1f5da9fc92060bfe6c2d262ead6e7ac4bd599a54"} Apr 21 07:53:35.083726 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:35.083701 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kgsrp_be0cbccd-e9f7-4b37-b3e3-e6ef1514b734/console-operator/1.log" Apr 21 07:53:35.084059 ip-10-0-137-194 kubenswrapper[2570]: I0421 
07:53:35.084036 2570 scope.go:117] "RemoveContainer" containerID="7c12718b852187b33e5489876ca937ce83ae54be98bb4713364dc363dd4fef03" Apr 21 07:53:35.084238 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:35.084215 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-kgsrp_openshift-console-operator(be0cbccd-e9f7-4b37-b3e3-e6ef1514b734)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" podUID="be0cbccd-e9f7-4b37-b3e3-e6ef1514b734" Apr 21 07:53:35.096328 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:35.096277 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-dkmsp" podStartSLOduration=1.096258249 podStartE2EDuration="1.096258249s" podCreationTimestamp="2026-04-21 07:53:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:53:35.095111759 +0000 UTC m=+114.933499121" watchObservedRunningTime="2026-04-21 07:53:35.096258249 +0000 UTC m=+114.934645601" Apr 21 07:53:35.400384 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:35.400350 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3ee1e09-3f66-4942-b704-81077b9efa31-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xnjzr\" (UID: \"e3ee1e09-3f66-4942-b704-81077b9efa31\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr" Apr 21 07:53:35.400602 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:35.400524 2570 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 21 07:53:35.400661 ip-10-0-137-194 kubenswrapper[2570]: E0421 
07:53:35.400609 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3ee1e09-3f66-4942-b704-81077b9efa31-samples-operator-tls podName:e3ee1e09-3f66-4942-b704-81077b9efa31 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:43.400588229 +0000 UTC m=+123.238975559 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/e3ee1e09-3f66-4942-b704-81077b9efa31-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-xnjzr" (UID: "e3ee1e09-3f66-4942-b704-81077b9efa31") : secret "samples-operator-tls" not found Apr 21 07:53:36.087580 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:36.087540 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mdt7b" event={"ID":"0e1f90ef-6932-43c1-bb6a-d25451d794e8","Type":"ContainerStarted","Data":"ef362d15a15e481d2aafbfb939e7623717584dbe404e28ec5a40cb4979bce378"} Apr 21 07:53:36.102052 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:36.102004 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mdt7b" podStartSLOduration=1.999846433 podStartE2EDuration="4.101991121s" podCreationTimestamp="2026-04-21 07:53:32 +0000 UTC" firstStartedPulling="2026-04-21 07:53:33.072481492 +0000 UTC m=+112.910868822" lastFinishedPulling="2026-04-21 07:53:35.174626182 +0000 UTC m=+115.013013510" observedRunningTime="2026-04-21 07:53:36.101400454 +0000 UTC m=+115.939787804" watchObservedRunningTime="2026-04-21 07:53:36.101991121 +0000 UTC m=+115.940378469" Apr 21 07:53:38.524201 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:38.524146 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:38.524643 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:38.524239 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf" Apr 21 07:53:38.524643 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:38.524314 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle podName:9f41707d-3f57-4540-b483-7dd01a7d4ef0 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:54.524294757 +0000 UTC m=+134.362682084 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle") pod "router-default-d5f96b85b-ghcsf" (UID: "9f41707d-3f57-4540-b483-7dd01a7d4ef0") : configmap references non-existent config key: service-ca.crt Apr 21 07:53:38.524643 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:38.524382 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 21 07:53:38.524643 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:38.524442 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs podName:9f41707d-3f57-4540-b483-7dd01a7d4ef0 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:54.524425264 +0000 UTC m=+134.362812598 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs") pod "router-default-d5f96b85b-ghcsf" (UID: "9f41707d-3f57-4540-b483-7dd01a7d4ef0") : secret "router-metrics-certs-default" not found Apr 21 07:53:39.665649 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.665613 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-l68lw"] Apr 21 07:53:39.668834 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.668813 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-l68lw" Apr 21 07:53:39.671223 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.671201 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 07:53:39.671516 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.671501 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vjklh\"" Apr 21 07:53:39.671587 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.671514 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 07:53:39.671587 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.671502 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 07:53:39.672344 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.672325 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 07:53:39.682157 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.682138 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-l68lw"] Apr 21 07:53:39.733141 
ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.733081 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e65eff4-f7d9-4bb7-9121-580138a669e3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw" Apr 21 07:53:39.733272 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.733164 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6e65eff4-f7d9-4bb7-9121-580138a669e3-crio-socket\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw" Apr 21 07:53:39.733336 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.733280 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6e65eff4-f7d9-4bb7-9121-580138a669e3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw" Apr 21 07:53:39.733336 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.733307 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rbxc\" (UniqueName: \"kubernetes.io/projected/6e65eff4-f7d9-4bb7-9121-580138a669e3-kube-api-access-6rbxc\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw" Apr 21 07:53:39.733417 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.733372 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" 
(UniqueName: \"kubernetes.io/empty-dir/6e65eff4-f7d9-4bb7-9121-580138a669e3-data-volume\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw" Apr 21 07:53:39.834388 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.834353 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6e65eff4-f7d9-4bb7-9121-580138a669e3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw" Apr 21 07:53:39.834388 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.834385 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6rbxc\" (UniqueName: \"kubernetes.io/projected/6e65eff4-f7d9-4bb7-9121-580138a669e3-kube-api-access-6rbxc\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw" Apr 21 07:53:39.834588 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.834416 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/6e65eff4-f7d9-4bb7-9121-580138a669e3-data-volume\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw" Apr 21 07:53:39.834588 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.834446 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e65eff4-f7d9-4bb7-9121-580138a669e3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw" Apr 21 
07:53:39.834588 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:39.834522 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 07:53:39.834588 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:39.834589 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e65eff4-f7d9-4bb7-9121-580138a669e3-insights-runtime-extractor-tls podName:6e65eff4-f7d9-4bb7-9121-580138a669e3 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:40.334573688 +0000 UTC m=+120.172961016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/6e65eff4-f7d9-4bb7-9121-580138a669e3-insights-runtime-extractor-tls") pod "insights-runtime-extractor-l68lw" (UID: "6e65eff4-f7d9-4bb7-9121-580138a669e3") : secret "insights-runtime-extractor-tls" not found Apr 21 07:53:39.834798 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.834578 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6e65eff4-f7d9-4bb7-9121-580138a669e3-crio-socket\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw" Apr 21 07:53:39.834798 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.834639 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/6e65eff4-f7d9-4bb7-9121-580138a669e3-crio-socket\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw" Apr 21 07:53:39.834798 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.834726 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/6e65eff4-f7d9-4bb7-9121-580138a669e3-data-volume\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw" Apr 21 07:53:39.835064 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.835045 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/6e65eff4-f7d9-4bb7-9121-580138a669e3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw" Apr 21 07:53:39.844343 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.844323 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rbxc\" (UniqueName: \"kubernetes.io/projected/6e65eff4-f7d9-4bb7-9121-580138a669e3-kube-api-access-6rbxc\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw" Apr 21 07:53:39.970537 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.970447 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 07:53:39.970537 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.970485 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 07:53:39.970924 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:39.970906 2570 scope.go:117] "RemoveContainer" containerID="7c12718b852187b33e5489876ca937ce83ae54be98bb4713364dc363dd4fef03" Apr 21 07:53:39.971116 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:39.971095 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=console-operator pod=console-operator-9d4b6777b-kgsrp_openshift-console-operator(be0cbccd-e9f7-4b37-b3e3-e6ef1514b734)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" podUID="be0cbccd-e9f7-4b37-b3e3-e6ef1514b734" Apr 21 07:53:40.338182 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:40.338086 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e65eff4-f7d9-4bb7-9121-580138a669e3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw" Apr 21 07:53:40.338328 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:40.338236 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 07:53:40.338328 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:40.338304 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e65eff4-f7d9-4bb7-9121-580138a669e3-insights-runtime-extractor-tls podName:6e65eff4-f7d9-4bb7-9121-580138a669e3 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:41.338290103 +0000 UTC m=+121.176677434 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/6e65eff4-f7d9-4bb7-9121-580138a669e3-insights-runtime-extractor-tls") pod "insights-runtime-extractor-l68lw" (UID: "6e65eff4-f7d9-4bb7-9121-580138a669e3") : secret "insights-runtime-extractor-tls" not found Apr 21 07:53:41.346364 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:41.346312 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e65eff4-f7d9-4bb7-9121-580138a669e3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw" Apr 21 07:53:41.346713 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:41.346467 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 07:53:41.346713 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:41.346534 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e65eff4-f7d9-4bb7-9121-580138a669e3-insights-runtime-extractor-tls podName:6e65eff4-f7d9-4bb7-9121-580138a669e3 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:43.346518066 +0000 UTC m=+123.184905393 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/6e65eff4-f7d9-4bb7-9121-580138a669e3-insights-runtime-extractor-tls") pod "insights-runtime-extractor-l68lw" (UID: "6e65eff4-f7d9-4bb7-9121-580138a669e3") : secret "insights-runtime-extractor-tls" not found Apr 21 07:53:43.361368 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:43.361322 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e65eff4-f7d9-4bb7-9121-580138a669e3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw" Apr 21 07:53:43.361835 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:43.361465 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 21 07:53:43.361835 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:43.361533 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e65eff4-f7d9-4bb7-9121-580138a669e3-insights-runtime-extractor-tls podName:6e65eff4-f7d9-4bb7-9121-580138a669e3 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:47.361514867 +0000 UTC m=+127.199902195 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/6e65eff4-f7d9-4bb7-9121-580138a669e3-insights-runtime-extractor-tls") pod "insights-runtime-extractor-l68lw" (UID: "6e65eff4-f7d9-4bb7-9121-580138a669e3") : secret "insights-runtime-extractor-tls" not found Apr 21 07:53:43.462144 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:43.462103 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3ee1e09-3f66-4942-b704-81077b9efa31-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xnjzr\" (UID: \"e3ee1e09-3f66-4942-b704-81077b9efa31\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr" Apr 21 07:53:43.464650 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:43.464625 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e3ee1e09-3f66-4942-b704-81077b9efa31-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-xnjzr\" (UID: \"e3ee1e09-3f66-4942-b704-81077b9efa31\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr" Apr 21 07:53:43.568067 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:43.568041 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-k5gkc\"" Apr 21 07:53:43.576243 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:43.576219 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr" Apr 21 07:53:43.687590 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:43.687441 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr"] Apr 21 07:53:44.106826 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:44.106796 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr" event={"ID":"e3ee1e09-3f66-4942-b704-81077b9efa31","Type":"ContainerStarted","Data":"aa9cf334e97c715565fb7a54e8b0dd66040f31b0a2502cd39c29bb113a5ce224"} Apr 21 07:53:46.112460 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:46.112419 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr" event={"ID":"e3ee1e09-3f66-4942-b704-81077b9efa31","Type":"ContainerStarted","Data":"793ad6e56bc1cc8544f95586853044b003d885085913f2f5448580b3439b7409"} Apr 21 07:53:46.112798 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:46.112465 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr" event={"ID":"e3ee1e09-3f66-4942-b704-81077b9efa31","Type":"ContainerStarted","Data":"05ca0776044a4a9c21c8ba7506fd80a804533f3829fb2e30ec7cb7254e37e4b8"} Apr 21 07:53:46.129909 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:46.129841 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-xnjzr" podStartSLOduration=17.336722614 podStartE2EDuration="19.129829776s" podCreationTimestamp="2026-04-21 07:53:27 +0000 UTC" firstStartedPulling="2026-04-21 07:53:43.724622527 +0000 UTC m=+123.563009854" lastFinishedPulling="2026-04-21 07:53:45.517729686 +0000 UTC m=+125.356117016" observedRunningTime="2026-04-21 
07:53:46.128402613 +0000 UTC m=+125.966789962" watchObservedRunningTime="2026-04-21 07:53:46.129829776 +0000 UTC m=+125.968217124"
Apr 21 07:53:47.395226 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:47.395190 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e65eff4-f7d9-4bb7-9121-580138a669e3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw"
Apr 21 07:53:47.395575 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:47.395337 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 21 07:53:47.395575 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:47.395408 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6e65eff4-f7d9-4bb7-9121-580138a669e3-insights-runtime-extractor-tls podName:6e65eff4-f7d9-4bb7-9121-580138a669e3 nodeName:}" failed. No retries permitted until 2026-04-21 07:53:55.395392897 +0000 UTC m=+135.233780235 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/6e65eff4-f7d9-4bb7-9121-580138a669e3-insights-runtime-extractor-tls") pod "insights-runtime-extractor-l68lw" (UID: "6e65eff4-f7d9-4bb7-9121-580138a669e3") : secret "insights-runtime-extractor-tls" not found
Apr 21 07:53:50.519341 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:50.519304 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs\") pod \"network-metrics-daemon-bbxng\" (UID: \"62e778a8-8270-4560-9d0d-41a95a3c9c5f\") " pod="openshift-multus/network-metrics-daemon-bbxng"
Apr 21 07:53:50.519713 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:50.519439 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 07:53:50.519713 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:50.519500 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs podName:62e778a8-8270-4560-9d0d-41a95a3c9c5f nodeName:}" failed. No retries permitted until 2026-04-21 07:55:52.519485662 +0000 UTC m=+252.357872988 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs") pod "network-metrics-daemon-bbxng" (UID: "62e778a8-8270-4560-9d0d-41a95a3c9c5f") : secret "metrics-daemon-secret" not found
Apr 21 07:53:51.770240 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:51.770211 2570 scope.go:117] "RemoveContainer" containerID="7c12718b852187b33e5489876ca937ce83ae54be98bb4713364dc363dd4fef03"
Apr 21 07:53:52.130558 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:52.130533 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kgsrp_be0cbccd-e9f7-4b37-b3e3-e6ef1514b734/console-operator/2.log"
Apr 21 07:53:52.130904 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:52.130889 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kgsrp_be0cbccd-e9f7-4b37-b3e3-e6ef1514b734/console-operator/1.log"
Apr 21 07:53:52.130950 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:52.130925 2570 generic.go:358] "Generic (PLEG): container finished" podID="be0cbccd-e9f7-4b37-b3e3-e6ef1514b734" containerID="5a8ea2901e98d0c1f29aeba30dffa5cd77792b1f662c9a882f97814a5f6a89cf" exitCode=255
Apr 21 07:53:52.130992 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:52.130976 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" event={"ID":"be0cbccd-e9f7-4b37-b3e3-e6ef1514b734","Type":"ContainerDied","Data":"5a8ea2901e98d0c1f29aeba30dffa5cd77792b1f662c9a882f97814a5f6a89cf"}
Apr 21 07:53:52.131036 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:52.131015 2570 scope.go:117] "RemoveContainer" containerID="7c12718b852187b33e5489876ca937ce83ae54be98bb4713364dc363dd4fef03"
Apr 21 07:53:52.131378 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:52.131350 2570 scope.go:117] "RemoveContainer" containerID="5a8ea2901e98d0c1f29aeba30dffa5cd77792b1f662c9a882f97814a5f6a89cf"
Apr 21 07:53:52.131546 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:52.131528 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-kgsrp_openshift-console-operator(be0cbccd-e9f7-4b37-b3e3-e6ef1514b734)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" podUID="be0cbccd-e9f7-4b37-b3e3-e6ef1514b734"
Apr 21 07:53:53.134144 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:53.134119 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kgsrp_be0cbccd-e9f7-4b37-b3e3-e6ef1514b734/console-operator/2.log"
Apr 21 07:53:54.550198 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:54.550151 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf"
Apr 21 07:53:54.550690 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:54.550239 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf"
Apr 21 07:53:54.550773 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:54.550754 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f41707d-3f57-4540-b483-7dd01a7d4ef0-service-ca-bundle\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf"
Apr 21 07:53:54.552489 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:54.552472 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9f41707d-3f57-4540-b483-7dd01a7d4ef0-metrics-certs\") pod \"router-default-d5f96b85b-ghcsf\" (UID: \"9f41707d-3f57-4540-b483-7dd01a7d4ef0\") " pod="openshift-ingress/router-default-d5f96b85b-ghcsf"
Apr 21 07:53:54.801479 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:54.801395 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-9tzkg\""
Apr 21 07:53:54.809216 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:54.809197 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-d5f96b85b-ghcsf"
Apr 21 07:53:54.924115 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:54.924087 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-d5f96b85b-ghcsf"]
Apr 21 07:53:54.926937 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:53:54.926909 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f41707d_3f57_4540_b483_7dd01a7d4ef0.slice/crio-9600744808d50daa48a96fe0e9b1449cb17c70679b3915e8c7b82d8d258eb565 WatchSource:0}: Error finding container 9600744808d50daa48a96fe0e9b1449cb17c70679b3915e8c7b82d8d258eb565: Status 404 returned error can't find the container with id 9600744808d50daa48a96fe0e9b1449cb17c70679b3915e8c7b82d8d258eb565
Apr 21 07:53:55.140656 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:55.140621 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-d5f96b85b-ghcsf" event={"ID":"9f41707d-3f57-4540-b483-7dd01a7d4ef0","Type":"ContainerStarted","Data":"0e67e4135077cb28f82c6586beece83d53b29d310df847d62705d793d6ba60a7"}
Apr 21 07:53:55.140819 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:55.140665 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-d5f96b85b-ghcsf" event={"ID":"9f41707d-3f57-4540-b483-7dd01a7d4ef0","Type":"ContainerStarted","Data":"9600744808d50daa48a96fe0e9b1449cb17c70679b3915e8c7b82d8d258eb565"}
Apr 21 07:53:55.156730 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:55.156685 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-d5f96b85b-ghcsf" podStartSLOduration=33.156671506 podStartE2EDuration="33.156671506s" podCreationTimestamp="2026-04-21 07:53:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:53:55.156446912 +0000 UTC m=+134.994834260" watchObservedRunningTime="2026-04-21 07:53:55.156671506 +0000 UTC m=+134.995058854"
Apr 21 07:53:55.457547 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:55.457454 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e65eff4-f7d9-4bb7-9121-580138a669e3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw"
Apr 21 07:53:55.459663 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:55.459634 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/6e65eff4-f7d9-4bb7-9121-580138a669e3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-l68lw\" (UID: \"6e65eff4-f7d9-4bb7-9121-580138a669e3\") " pod="openshift-insights/insights-runtime-extractor-l68lw"
Apr 21 07:53:55.580068 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:55.580033 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-vjklh\""
Apr 21 07:53:55.588715 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:55.588690 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-l68lw"
Apr 21 07:53:55.702320 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:55.702287 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-l68lw"]
Apr 21 07:53:55.705772 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:53:55.705748 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e65eff4_f7d9_4bb7_9121_580138a669e3.slice/crio-a4e4357cd13abf6316469d178f9043f406c8e3e653fe0b6686a89c0ee8bb667b WatchSource:0}: Error finding container a4e4357cd13abf6316469d178f9043f406c8e3e653fe0b6686a89c0ee8bb667b: Status 404 returned error can't find the container with id a4e4357cd13abf6316469d178f9043f406c8e3e653fe0b6686a89c0ee8bb667b
Apr 21 07:53:55.809844 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:55.809809 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-d5f96b85b-ghcsf"
Apr 21 07:53:55.812199 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:55.812175 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-d5f96b85b-ghcsf"
Apr 21 07:53:56.144165 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:56.144132 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l68lw" event={"ID":"6e65eff4-f7d9-4bb7-9121-580138a669e3","Type":"ContainerStarted","Data":"2777044eea8ab99e4d87825a4f177b9dd86cb386856c93e7e9c2663f5063443d"}
Apr 21 07:53:56.144165 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:56.144168 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l68lw" event={"ID":"6e65eff4-f7d9-4bb7-9121-580138a669e3","Type":"ContainerStarted","Data":"a4e4357cd13abf6316469d178f9043f406c8e3e653fe0b6686a89c0ee8bb667b"}
Apr 21 07:53:56.144398 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:56.144337 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-d5f96b85b-ghcsf"
Apr 21 07:53:56.145667 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:56.145646 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-d5f96b85b-ghcsf"
Apr 21 07:53:57.148116 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.148067 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l68lw" event={"ID":"6e65eff4-f7d9-4bb7-9121-580138a669e3","Type":"ContainerStarted","Data":"2dbad1bc604e331d060e7fb64c7845d2a8757fc9a5ecfa022df24c4eb261b380"}
Apr 21 07:53:57.778194 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.778160 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-xj2ss"]
Apr 21 07:53:57.781523 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.781500 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-545b5c7bc8-c52cb"]
Apr 21 07:53:57.781681 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.781662 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xj2ss"
Apr 21 07:53:57.784101 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.784079 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 21 07:53:57.784218 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.784082 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-flcp4\""
Apr 21 07:53:57.784218 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.784206 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.784493 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.784390 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 21 07:53:57.786646 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.786615 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 21 07:53:57.786738 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.786677 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-cp6pq\""
Apr 21 07:53:57.786807 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.786776 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 21 07:53:57.786985 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.786938 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 21 07:53:57.791468 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.791444 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 21 07:53:57.794168 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.794148 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-xj2ss"]
Apr 21 07:53:57.800521 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.800502 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-545b5c7bc8-c52cb"]
Apr 21 07:53:57.844950 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.844922 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-545b5c7bc8-c52cb"]
Apr 21 07:53:57.845149 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:57.845124 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[bound-sa-token ca-trust-extracted image-registry-private-configuration installation-pull-secrets kube-api-access-89l2j registry-certificates registry-tls trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb" podUID="2d7f00bd-2448-47d2-9cff-0ff49baa640a"
Apr 21 07:53:57.875999 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.875959 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d7f00bd-2448-47d2-9cff-0ff49baa640a-trusted-ca\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.876186 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.876020 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f8912392-4510-4052-97fe-c1f6926ae955-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xj2ss\" (UID: \"f8912392-4510-4052-97fe-c1f6926ae955\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xj2ss"
Apr 21 07:53:57.876186 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.876056 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2d7f00bd-2448-47d2-9cff-0ff49baa640a-registry-certificates\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.876186 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.876085 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2d7f00bd-2448-47d2-9cff-0ff49baa640a-registry-tls\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.876186 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.876128 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2d7f00bd-2448-47d2-9cff-0ff49baa640a-ca-trust-extracted\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.876186 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.876156 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2d7f00bd-2448-47d2-9cff-0ff49baa640a-installation-pull-secrets\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.876444 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.876192 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89l2j\" (UniqueName: \"kubernetes.io/projected/2d7f00bd-2448-47d2-9cff-0ff49baa640a-kube-api-access-89l2j\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.876444 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.876217 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2d7f00bd-2448-47d2-9cff-0ff49baa640a-image-registry-private-configuration\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.876444 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.876251 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d7f00bd-2448-47d2-9cff-0ff49baa640a-bound-sa-token\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.876444 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.876290 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f8912392-4510-4052-97fe-c1f6926ae955-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-xj2ss\" (UID: \"f8912392-4510-4052-97fe-c1f6926ae955\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xj2ss"
Apr 21 07:53:57.877110 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.877090 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6459cccc96-5nt4w"]
Apr 21 07:53:57.880142 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.880124 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:57.891508 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.891466 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6459cccc96-5nt4w"]
Apr 21 07:53:57.977366 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.977328 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d7f00bd-2448-47d2-9cff-0ff49baa640a-trusted-ca\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.977541 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.977377 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-bound-sa-token\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:57.977541 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.977399 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnn4f\" (UniqueName: \"kubernetes.io/projected/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-kube-api-access-qnn4f\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:57.977541 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.977456 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f8912392-4510-4052-97fe-c1f6926ae955-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xj2ss\" (UID: \"f8912392-4510-4052-97fe-c1f6926ae955\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xj2ss"
Apr 21 07:53:57.977541 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.977492 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2d7f00bd-2448-47d2-9cff-0ff49baa640a-registry-certificates\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.977541 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.977522 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2d7f00bd-2448-47d2-9cff-0ff49baa640a-registry-tls\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.977800 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.977669 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-image-registry-private-configuration\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:57.977800 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.977724 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-installation-pull-secrets\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:57.977800 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.977761 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2d7f00bd-2448-47d2-9cff-0ff49baa640a-ca-trust-extracted\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.977800 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.977792 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2d7f00bd-2448-47d2-9cff-0ff49baa640a-installation-pull-secrets\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.978058 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.977819 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-ca-trust-extracted\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:57.978058 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.977884 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89l2j\" (UniqueName: \"kubernetes.io/projected/2d7f00bd-2448-47d2-9cff-0ff49baa640a-kube-api-access-89l2j\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.978058 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.977930 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2d7f00bd-2448-47d2-9cff-0ff49baa640a-image-registry-private-configuration\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.978058 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.977975 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-registry-tls\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:57.978058 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.978019 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d7f00bd-2448-47d2-9cff-0ff49baa640a-bound-sa-token\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.978297 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.978069 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f8912392-4510-4052-97fe-c1f6926ae955-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-xj2ss\" (UID: \"f8912392-4510-4052-97fe-c1f6926ae955\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xj2ss"
Apr 21 07:53:57.978297 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.978114 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-registry-certificates\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:57.978297 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.978139 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-trusted-ca\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:57.978297 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.978214 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2d7f00bd-2448-47d2-9cff-0ff49baa640a-ca-trust-extracted\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.978476 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.978336 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d7f00bd-2448-47d2-9cff-0ff49baa640a-trusted-ca\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.978721 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.978699 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/f8912392-4510-4052-97fe-c1f6926ae955-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-xj2ss\" (UID: \"f8912392-4510-4052-97fe-c1f6926ae955\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xj2ss"
Apr 21 07:53:57.979462 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.979439 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2d7f00bd-2448-47d2-9cff-0ff49baa640a-registry-certificates\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.980536 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.980512 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2d7f00bd-2448-47d2-9cff-0ff49baa640a-registry-tls\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.980933 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.980909 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2d7f00bd-2448-47d2-9cff-0ff49baa640a-image-registry-private-configuration\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.981159 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.981135 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2d7f00bd-2448-47d2-9cff-0ff49baa640a-installation-pull-secrets\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.981669 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.981636 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/f8912392-4510-4052-97fe-c1f6926ae955-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-xj2ss\" (UID: \"f8912392-4510-4052-97fe-c1f6926ae955\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-xj2ss"
Apr 21 07:53:57.986099 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.986076 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d7f00bd-2448-47d2-9cff-0ff49baa640a-bound-sa-token\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:57.986207 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:57.986190 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89l2j\" (UniqueName: \"kubernetes.io/projected/2d7f00bd-2448-47d2-9cff-0ff49baa640a-kube-api-access-89l2j\") pod \"image-registry-545b5c7bc8-c52cb\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb"
Apr 21 07:53:58.078908 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.078810 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-image-registry-private-configuration\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:58.078908 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.078846 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-installation-pull-secrets\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:58.078908 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.078883 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-ca-trust-extracted\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:58.079159 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.078923 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-registry-tls\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:58.079159 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.078981 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-registry-certificates\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:58.079159 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.079004 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-trusted-ca\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:58.079159 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.079052 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-bound-sa-token\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:58.079159 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.079079 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qnn4f\" (UniqueName: \"kubernetes.io/projected/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-kube-api-access-qnn4f\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:58.079431 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.079395 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-ca-trust-extracted\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:58.079918 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.079898 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-registry-certificates\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:58.080134 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.080085 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-trusted-ca\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w"
Apr 21 07:53:58.081803 ip-10-0-137-194
kubenswrapper[2570]: I0421 07:53:58.081783 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-image-registry-private-configuration\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w" Apr 21 07:53:58.081935 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.081913 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-registry-tls\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w" Apr 21 07:53:58.082067 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.082048 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-installation-pull-secrets\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w" Apr 21 07:53:58.089264 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.089208 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-bound-sa-token\") pod \"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w" Apr 21 07:53:58.089358 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.089313 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnn4f\" (UniqueName: \"kubernetes.io/projected/3d259e3f-23a3-4b2f-8d21-5ddf47a2d960-kube-api-access-qnn4f\") pod 
\"image-registry-6459cccc96-5nt4w\" (UID: \"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960\") " pod="openshift-image-registry/image-registry-6459cccc96-5nt4w" Apr 21 07:53:58.094196 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.094152 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xj2ss" Apr 21 07:53:58.153213 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.152912 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb" Apr 21 07:53:58.158484 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.158464 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb" Apr 21 07:53:58.191604 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.189989 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6459cccc96-5nt4w" Apr 21 07:53:58.255272 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.255212 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-xj2ss"] Apr 21 07:53:58.280033 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.280006 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89l2j\" (UniqueName: \"kubernetes.io/projected/2d7f00bd-2448-47d2-9cff-0ff49baa640a-kube-api-access-89l2j\") pod \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " Apr 21 07:53:58.280133 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.280051 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2d7f00bd-2448-47d2-9cff-0ff49baa640a-registry-tls\") pod \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\" (UID: 
\"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " Apr 21 07:53:58.280133 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.280100 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d7f00bd-2448-47d2-9cff-0ff49baa640a-trusted-ca\") pod \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " Apr 21 07:53:58.280133 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.280120 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2d7f00bd-2448-47d2-9cff-0ff49baa640a-image-registry-private-configuration\") pod \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " Apr 21 07:53:58.280290 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.280157 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2d7f00bd-2448-47d2-9cff-0ff49baa640a-installation-pull-secrets\") pod \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " Apr 21 07:53:58.280290 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.280187 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2d7f00bd-2448-47d2-9cff-0ff49baa640a-registry-certificates\") pod \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " Apr 21 07:53:58.280290 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.280255 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d7f00bd-2448-47d2-9cff-0ff49baa640a-bound-sa-token\") pod \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " Apr 21 07:53:58.280290 
ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.280287 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2d7f00bd-2448-47d2-9cff-0ff49baa640a-ca-trust-extracted\") pod \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\" (UID: \"2d7f00bd-2448-47d2-9cff-0ff49baa640a\") " Apr 21 07:53:58.281346 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.281126 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d7f00bd-2448-47d2-9cff-0ff49baa640a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "2d7f00bd-2448-47d2-9cff-0ff49baa640a" (UID: "2d7f00bd-2448-47d2-9cff-0ff49baa640a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:53:58.281829 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.281510 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d7f00bd-2448-47d2-9cff-0ff49baa640a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "2d7f00bd-2448-47d2-9cff-0ff49baa640a" (UID: "2d7f00bd-2448-47d2-9cff-0ff49baa640a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 07:53:58.281943 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.281706 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d7f00bd-2448-47d2-9cff-0ff49baa640a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "2d7f00bd-2448-47d2-9cff-0ff49baa640a" (UID: "2d7f00bd-2448-47d2-9cff-0ff49baa640a"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:53:58.283435 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.283412 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d7f00bd-2448-47d2-9cff-0ff49baa640a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "2d7f00bd-2448-47d2-9cff-0ff49baa640a" (UID: "2d7f00bd-2448-47d2-9cff-0ff49baa640a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:53:58.283521 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.283473 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d7f00bd-2448-47d2-9cff-0ff49baa640a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "2d7f00bd-2448-47d2-9cff-0ff49baa640a" (UID: "2d7f00bd-2448-47d2-9cff-0ff49baa640a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:53:58.287297 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.287090 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d7f00bd-2448-47d2-9cff-0ff49baa640a-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "2d7f00bd-2448-47d2-9cff-0ff49baa640a" (UID: "2d7f00bd-2448-47d2-9cff-0ff49baa640a"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:53:58.287297 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.287141 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d7f00bd-2448-47d2-9cff-0ff49baa640a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "2d7f00bd-2448-47d2-9cff-0ff49baa640a" (UID: "2d7f00bd-2448-47d2-9cff-0ff49baa640a"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:53:58.287422 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.287309 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d7f00bd-2448-47d2-9cff-0ff49baa640a-kube-api-access-89l2j" (OuterVolumeSpecName: "kube-api-access-89l2j") pod "2d7f00bd-2448-47d2-9cff-0ff49baa640a" (UID: "2d7f00bd-2448-47d2-9cff-0ff49baa640a"). InnerVolumeSpecName "kube-api-access-89l2j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:53:58.338639 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.338563 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6459cccc96-5nt4w"] Apr 21 07:53:58.342834 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:53:58.342799 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d259e3f_23a3_4b2f_8d21_5ddf47a2d960.slice/crio-9cf73c55ec616e45416f38236b62633f9abd3a3a04bcc92c8ca59a4f32d20a2a WatchSource:0}: Error finding container 9cf73c55ec616e45416f38236b62633f9abd3a3a04bcc92c8ca59a4f32d20a2a: Status 404 returned error can't find the container with id 9cf73c55ec616e45416f38236b62633f9abd3a3a04bcc92c8ca59a4f32d20a2a Apr 21 07:53:58.381521 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.381482 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2d7f00bd-2448-47d2-9cff-0ff49baa640a-bound-sa-token\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:53:58.381521 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.381509 2570 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/2d7f00bd-2448-47d2-9cff-0ff49baa640a-ca-trust-extracted\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:53:58.381521 ip-10-0-137-194 kubenswrapper[2570]: I0421 
07:53:58.381525 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-89l2j\" (UniqueName: \"kubernetes.io/projected/2d7f00bd-2448-47d2-9cff-0ff49baa640a-kube-api-access-89l2j\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:53:58.381701 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.381536 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/2d7f00bd-2448-47d2-9cff-0ff49baa640a-registry-tls\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:53:58.381701 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.381545 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2d7f00bd-2448-47d2-9cff-0ff49baa640a-trusted-ca\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:53:58.381701 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.381553 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/2d7f00bd-2448-47d2-9cff-0ff49baa640a-image-registry-private-configuration\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:53:58.381701 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.381562 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/2d7f00bd-2448-47d2-9cff-0ff49baa640a-installation-pull-secrets\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:53:58.381701 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:58.381571 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/2d7f00bd-2448-47d2-9cff-0ff49baa640a-registry-certificates\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:53:59.157685 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:59.157638 2570 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-l68lw" event={"ID":"6e65eff4-f7d9-4bb7-9121-580138a669e3","Type":"ContainerStarted","Data":"651ed314c0c6048a67938f78571736fc7d742594e6bc07e34211e51883c593b5"} Apr 21 07:53:59.159064 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:59.159041 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6459cccc96-5nt4w" event={"ID":"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960","Type":"ContainerStarted","Data":"eb85237b0b2e72ee7d8a7893713b7d0a7b6970b1c073dee416b337846da1cfb2"} Apr 21 07:53:59.159183 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:59.159070 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6459cccc96-5nt4w" event={"ID":"3d259e3f-23a3-4b2f-8d21-5ddf47a2d960","Type":"ContainerStarted","Data":"9cf73c55ec616e45416f38236b62633f9abd3a3a04bcc92c8ca59a4f32d20a2a"} Apr 21 07:53:59.159183 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:59.159146 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6459cccc96-5nt4w" Apr 21 07:53:59.160143 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:59.160118 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xj2ss" event={"ID":"f8912392-4510-4052-97fe-c1f6926ae955","Type":"ContainerStarted","Data":"4fb5155f10ecd14b54739a3ed24e7547045cf22050008488145c7e1bf35a5850"} Apr 21 07:53:59.160143 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:59.160130 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-545b5c7bc8-c52cb" Apr 21 07:53:59.173962 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:59.173924 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-l68lw" podStartSLOduration=17.749015022000002 podStartE2EDuration="20.173910918s" podCreationTimestamp="2026-04-21 07:53:39 +0000 UTC" firstStartedPulling="2026-04-21 07:53:55.758763237 +0000 UTC m=+135.597150567" lastFinishedPulling="2026-04-21 07:53:58.183659134 +0000 UTC m=+138.022046463" observedRunningTime="2026-04-21 07:53:59.173056303 +0000 UTC m=+139.011443652" watchObservedRunningTime="2026-04-21 07:53:59.173910918 +0000 UTC m=+139.012298267" Apr 21 07:53:59.188307 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:59.188237 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6459cccc96-5nt4w" podStartSLOduration=2.1882211480000002 podStartE2EDuration="2.188221148s" podCreationTimestamp="2026-04-21 07:53:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:53:59.188115076 +0000 UTC m=+139.026502426" watchObservedRunningTime="2026-04-21 07:53:59.188221148 +0000 UTC m=+139.026608498" Apr 21 07:53:59.212113 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:59.212073 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-545b5c7bc8-c52cb"] Apr 21 07:53:59.214118 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:59.214092 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-545b5c7bc8-c52cb"] Apr 21 07:53:59.970217 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:59.970184 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 
07:53:59.970217 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:59.970214 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 07:53:59.970561 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:53:59.970549 2570 scope.go:117] "RemoveContainer" containerID="5a8ea2901e98d0c1f29aeba30dffa5cd77792b1f662c9a882f97814a5f6a89cf" Apr 21 07:53:59.970732 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:53:59.970715 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-kgsrp_openshift-console-operator(be0cbccd-e9f7-4b37-b3e3-e6ef1514b734)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" podUID="be0cbccd-e9f7-4b37-b3e3-e6ef1514b734" Apr 21 07:54:00.165387 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:00.165353 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xj2ss" event={"ID":"f8912392-4510-4052-97fe-c1f6926ae955","Type":"ContainerStarted","Data":"5ce4b5d7bb3241cc478be1305cf7a674940cabfc4259eacf0598282e8d443dc4"} Apr 21 07:54:00.180042 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:00.179996 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-xj2ss" podStartSLOduration=2.03971038 podStartE2EDuration="3.179983169s" podCreationTimestamp="2026-04-21 07:53:57 +0000 UTC" firstStartedPulling="2026-04-21 07:53:58.271107936 +0000 UTC m=+138.109495277" lastFinishedPulling="2026-04-21 07:53:59.411380739 +0000 UTC m=+139.249768066" observedRunningTime="2026-04-21 07:54:00.179852202 +0000 UTC m=+140.018239550" watchObservedRunningTime="2026-04-21 07:54:00.179983169 +0000 UTC m=+140.018370517" Apr 21 07:54:00.772165 ip-10-0-137-194 
kubenswrapper[2570]: I0421 07:54:00.772130 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d7f00bd-2448-47d2-9cff-0ff49baa640a" path="/var/lib/kubelet/pods/2d7f00bd-2448-47d2-9cff-0ff49baa640a/volumes" Apr 21 07:54:05.639196 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.639164 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh"] Apr 21 07:54:05.644009 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.643987 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" Apr 21 07:54:05.646295 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.646273 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 07:54:05.647122 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.647099 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 07:54:05.647729 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.647411 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 21 07:54:05.647729 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.647582 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-tpjgh\"" Apr 21 07:54:05.647907 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.647740 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 21 07:54:05.647907 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.647803 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 
07:54:05.654124 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.654099 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh"] Apr 21 07:54:05.675136 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.675100 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hjljn"] Apr 21 07:54:05.678599 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.678576 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.680851 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.680831 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 07:54:05.681043 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.680983 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 07:54:05.681131 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.681066 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 07:54:05.681280 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.681206 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wdcf8\"" Apr 21 07:54:05.739697 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.739665 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/812db1cf-487b-4a90-a0c8-a72c028f5c45-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.739697 ip-10-0-137-194 kubenswrapper[2570]: 
I0421 07:54:05.739697 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7mgq\" (UniqueName: \"kubernetes.io/projected/812db1cf-487b-4a90-a0c8-a72c028f5c45-kube-api-access-t7mgq\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.739910 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.739773 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dlfj\" (UniqueName: \"kubernetes.io/projected/c7ea73b2-778a-49e2-ba0d-7d918ce7b31d-kube-api-access-8dlfj\") pod \"openshift-state-metrics-9d44df66c-xhclh\" (UID: \"c7ea73b2-778a-49e2-ba0d-7d918ce7b31d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" Apr 21 07:54:05.739910 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.739812 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7ea73b2-778a-49e2-ba0d-7d918ce7b31d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-xhclh\" (UID: \"c7ea73b2-778a-49e2-ba0d-7d918ce7b31d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" Apr 21 07:54:05.739910 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.739828 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/812db1cf-487b-4a90-a0c8-a72c028f5c45-node-exporter-textfile\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.739910 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.739851 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7ea73b2-778a-49e2-ba0d-7d918ce7b31d-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-xhclh\" (UID: \"c7ea73b2-778a-49e2-ba0d-7d918ce7b31d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" Apr 21 07:54:05.739910 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.739883 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/812db1cf-487b-4a90-a0c8-a72c028f5c45-metrics-client-ca\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.739910 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.739903 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7ea73b2-778a-49e2-ba0d-7d918ce7b31d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-xhclh\" (UID: \"c7ea73b2-778a-49e2-ba0d-7d918ce7b31d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" Apr 21 07:54:05.740128 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.739918 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/812db1cf-487b-4a90-a0c8-a72c028f5c45-sys\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.740128 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.739937 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/812db1cf-487b-4a90-a0c8-a72c028f5c45-node-exporter-tls\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " 
pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.740128 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.740007 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/812db1cf-487b-4a90-a0c8-a72c028f5c45-node-exporter-wtmp\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.740128 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.740051 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/812db1cf-487b-4a90-a0c8-a72c028f5c45-node-exporter-accelerators-collector-config\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.740128 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.740075 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/812db1cf-487b-4a90-a0c8-a72c028f5c45-root\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.840907 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.840849 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/812db1cf-487b-4a90-a0c8-a72c028f5c45-root\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.840907 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.840913 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/812db1cf-487b-4a90-a0c8-a72c028f5c45-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.841104 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.840931 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7mgq\" (UniqueName: \"kubernetes.io/projected/812db1cf-487b-4a90-a0c8-a72c028f5c45-kube-api-access-t7mgq\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.841104 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.840954 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/812db1cf-487b-4a90-a0c8-a72c028f5c45-root\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.841104 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.840978 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dlfj\" (UniqueName: \"kubernetes.io/projected/c7ea73b2-778a-49e2-ba0d-7d918ce7b31d-kube-api-access-8dlfj\") pod \"openshift-state-metrics-9d44df66c-xhclh\" (UID: \"c7ea73b2-778a-49e2-ba0d-7d918ce7b31d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" Apr 21 07:54:05.841104 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.841038 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7ea73b2-778a-49e2-ba0d-7d918ce7b31d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-xhclh\" (UID: \"c7ea73b2-778a-49e2-ba0d-7d918ce7b31d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" Apr 21 07:54:05.841104 
ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.841067 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/812db1cf-487b-4a90-a0c8-a72c028f5c45-node-exporter-textfile\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.841333 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.841103 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7ea73b2-778a-49e2-ba0d-7d918ce7b31d-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-xhclh\" (UID: \"c7ea73b2-778a-49e2-ba0d-7d918ce7b31d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" Apr 21 07:54:05.841333 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.841126 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/812db1cf-487b-4a90-a0c8-a72c028f5c45-metrics-client-ca\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.841333 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.841157 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7ea73b2-778a-49e2-ba0d-7d918ce7b31d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-xhclh\" (UID: \"c7ea73b2-778a-49e2-ba0d-7d918ce7b31d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" Apr 21 07:54:05.841333 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.841180 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/812db1cf-487b-4a90-a0c8-a72c028f5c45-sys\") pod \"node-exporter-hjljn\" (UID: 
\"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.841333 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.841217 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/812db1cf-487b-4a90-a0c8-a72c028f5c45-node-exporter-tls\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.841333 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.841248 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/812db1cf-487b-4a90-a0c8-a72c028f5c45-node-exporter-wtmp\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.841333 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:54:05.841291 2570 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 21 07:54:05.841684 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:54:05.841356 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7ea73b2-778a-49e2-ba0d-7d918ce7b31d-openshift-state-metrics-tls podName:c7ea73b2-778a-49e2-ba0d-7d918ce7b31d nodeName:}" failed. No retries permitted until 2026-04-21 07:54:06.341335402 +0000 UTC m=+146.179722743 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/c7ea73b2-778a-49e2-ba0d-7d918ce7b31d-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-xhclh" (UID: "c7ea73b2-778a-49e2-ba0d-7d918ce7b31d") : secret "openshift-state-metrics-tls" not found Apr 21 07:54:05.841684 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.841498 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/812db1cf-487b-4a90-a0c8-a72c028f5c45-node-exporter-textfile\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.841684 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.841290 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/812db1cf-487b-4a90-a0c8-a72c028f5c45-node-exporter-accelerators-collector-config\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.841684 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.841609 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/812db1cf-487b-4a90-a0c8-a72c028f5c45-sys\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.841929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.841708 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/812db1cf-487b-4a90-a0c8-a72c028f5c45-node-exporter-wtmp\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.841929 ip-10-0-137-194 
kubenswrapper[2570]: I0421 07:54:05.841891 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/812db1cf-487b-4a90-a0c8-a72c028f5c45-node-exporter-accelerators-collector-config\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.841929 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.841908 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/812db1cf-487b-4a90-a0c8-a72c028f5c45-metrics-client-ca\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.842322 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.842300 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7ea73b2-778a-49e2-ba0d-7d918ce7b31d-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-xhclh\" (UID: \"c7ea73b2-778a-49e2-ba0d-7d918ce7b31d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" Apr 21 07:54:05.843356 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.843335 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/812db1cf-487b-4a90-a0c8-a72c028f5c45-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.843533 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.843511 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/812db1cf-487b-4a90-a0c8-a72c028f5c45-node-exporter-tls\") pod \"node-exporter-hjljn\" (UID: 
\"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.843593 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.843558 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c7ea73b2-778a-49e2-ba0d-7d918ce7b31d-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-xhclh\" (UID: \"c7ea73b2-778a-49e2-ba0d-7d918ce7b31d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" Apr 21 07:54:05.851004 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.850979 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7mgq\" (UniqueName: \"kubernetes.io/projected/812db1cf-487b-4a90-a0c8-a72c028f5c45-kube-api-access-t7mgq\") pod \"node-exporter-hjljn\" (UID: \"812db1cf-487b-4a90-a0c8-a72c028f5c45\") " pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.851422 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.851406 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dlfj\" (UniqueName: \"kubernetes.io/projected/c7ea73b2-778a-49e2-ba0d-7d918ce7b31d-kube-api-access-8dlfj\") pod \"openshift-state-metrics-9d44df66c-xhclh\" (UID: \"c7ea73b2-778a-49e2-ba0d-7d918ce7b31d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" Apr 21 07:54:05.987261 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:05.987195 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-hjljn" Apr 21 07:54:05.994946 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:54:05.994917 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod812db1cf_487b_4a90_a0c8_a72c028f5c45.slice/crio-69a785293bf08680b28ce24c907283509f080c6bd2b97e12ba7e50f5c2a53b2b WatchSource:0}: Error finding container 69a785293bf08680b28ce24c907283509f080c6bd2b97e12ba7e50f5c2a53b2b: Status 404 returned error can't find the container with id 69a785293bf08680b28ce24c907283509f080c6bd2b97e12ba7e50f5c2a53b2b Apr 21 07:54:06.182023 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.181989 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hjljn" event={"ID":"812db1cf-487b-4a90-a0c8-a72c028f5c45","Type":"ContainerStarted","Data":"69a785293bf08680b28ce24c907283509f080c6bd2b97e12ba7e50f5c2a53b2b"} Apr 21 07:54:06.345236 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.345148 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7ea73b2-778a-49e2-ba0d-7d918ce7b31d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-xhclh\" (UID: \"c7ea73b2-778a-49e2-ba0d-7d918ce7b31d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" Apr 21 07:54:06.347918 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.347888 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7ea73b2-778a-49e2-ba0d-7d918ce7b31d-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-xhclh\" (UID: \"c7ea73b2-778a-49e2-ba0d-7d918ce7b31d\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" Apr 21 07:54:06.557497 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.557461 2570 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" Apr 21 07:54:06.690598 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.690521 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh"] Apr 21 07:54:06.693083 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:54:06.693051 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7ea73b2_778a_49e2_ba0d_7d918ce7b31d.slice/crio-a64ac48a8e61824ec9bc0c877ca0bb687a531591d29bb5b26b91c278f70bf48f WatchSource:0}: Error finding container a64ac48a8e61824ec9bc0c877ca0bb687a531591d29bb5b26b91c278f70bf48f: Status 404 returned error can't find the container with id a64ac48a8e61824ec9bc0c877ca0bb687a531591d29bb5b26b91c278f70bf48f Apr 21 07:54:06.725310 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.725280 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 07:54:06.737514 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.737493 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.740021 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.739974 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 07:54:06.740114 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.740014 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 07:54:06.740114 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.740027 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 07:54:06.740234 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.740209 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 07:54:06.740284 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.740256 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 07:54:06.740590 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.740538 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 07:54:06.741026 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.741004 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 07:54:06.741146 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.741127 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 07:54:06.741223 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.741209 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 07:54:06.741668 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.741648 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-xbxr6\"" Apr 21 07:54:06.742415 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.742368 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 07:54:06.750103 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.750079 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fppxp\" (UniqueName: \"kubernetes.io/projected/8afbf877-96ee-4486-8378-8972eb8c7ebd-kube-api-access-fppxp\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.750199 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.750106 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.750199 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.750152 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8afbf877-96ee-4486-8378-8972eb8c7ebd-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.750199 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.750182 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-web-config\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.750299 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.750222 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8afbf877-96ee-4486-8378-8972eb8c7ebd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.750299 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.750248 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.750299 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.750272 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8afbf877-96ee-4486-8378-8972eb8c7ebd-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.750299 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.750294 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-config-volume\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.750471 ip-10-0-137-194 kubenswrapper[2570]: 
I0421 07:54:06.750353 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8afbf877-96ee-4486-8378-8972eb8c7ebd-config-out\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.750471 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.750396 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8afbf877-96ee-4486-8378-8972eb8c7ebd-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.750471 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.750421 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.750471 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.750454 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.750587 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.750474 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.851444 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.851410 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-config-volume\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.851591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.851490 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8afbf877-96ee-4486-8378-8972eb8c7ebd-config-out\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.851591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.851536 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8afbf877-96ee-4486-8378-8972eb8c7ebd-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.851591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.851571 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.851758 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.851605 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.851758 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.851636 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.851758 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.851691 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fppxp\" (UniqueName: \"kubernetes.io/projected/8afbf877-96ee-4486-8378-8972eb8c7ebd-kube-api-access-fppxp\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.851758 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.851716 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.851758 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.851741 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8afbf877-96ee-4486-8378-8972eb8c7ebd-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 
07:54:06.852029 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.851768 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-web-config\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.852029 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.851820 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8afbf877-96ee-4486-8378-8972eb8c7ebd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.852029 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.851883 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.852029 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.851917 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8afbf877-96ee-4486-8378-8972eb8c7ebd-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.853648 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.852517 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8afbf877-96ee-4486-8378-8972eb8c7ebd-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: 
\"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.853648 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:54:06.853180 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8afbf877-96ee-4486-8378-8972eb8c7ebd-alertmanager-trusted-ca-bundle podName:8afbf877-96ee-4486-8378-8972eb8c7ebd nodeName:}" failed. No retries permitted until 2026-04-21 07:54:07.353156781 +0000 UTC m=+147.191544129 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/8afbf877-96ee-4486-8378-8972eb8c7ebd-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "8afbf877-96ee-4486-8378-8972eb8c7ebd") : configmap references non-existent config key: ca-bundle.crt Apr 21 07:54:06.854307 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.854244 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8afbf877-96ee-4486-8378-8972eb8c7ebd-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.856619 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.856568 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.858715 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.858691 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8afbf877-96ee-4486-8378-8972eb8c7ebd-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.858819 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.858764 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-web-config\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.859152 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.859112 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8afbf877-96ee-4486-8378-8972eb8c7ebd-config-out\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.859255 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.859226 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.859363 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.859305 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-config-volume\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.859438 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.859396 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.860371 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.860352 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.860441 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.860389 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:06.861399 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:06.861382 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fppxp\" (UniqueName: \"kubernetes.io/projected/8afbf877-96ee-4486-8378-8972eb8c7ebd-kube-api-access-fppxp\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:07.185954 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:07.185918 2570 generic.go:358] "Generic (PLEG): container finished" podID="812db1cf-487b-4a90-a0c8-a72c028f5c45" containerID="9a47073be6de9eb09a94ec44c2f07c0745835c2ebd3c308aecf744316eaf2c56" exitCode=0 Apr 21 07:54:07.186098 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:07.186002 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hjljn" event={"ID":"812db1cf-487b-4a90-a0c8-a72c028f5c45","Type":"ContainerDied","Data":"9a47073be6de9eb09a94ec44c2f07c0745835c2ebd3c308aecf744316eaf2c56"} Apr 21 07:54:07.187582 ip-10-0-137-194 
kubenswrapper[2570]: I0421 07:54:07.187562 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" event={"ID":"c7ea73b2-778a-49e2-ba0d-7d918ce7b31d","Type":"ContainerStarted","Data":"204c7971fe015f3ca6d82680d497e77c01928a329816034402cff57d8c384b22"} Apr 21 07:54:07.187698 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:07.187586 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" event={"ID":"c7ea73b2-778a-49e2-ba0d-7d918ce7b31d","Type":"ContainerStarted","Data":"8c9df5aff61e9390be3be9ea73d99ac2b0858deebd8513f18f7cc0bb58d98e5f"} Apr 21 07:54:07.187698 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:07.187595 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" event={"ID":"c7ea73b2-778a-49e2-ba0d-7d918ce7b31d","Type":"ContainerStarted","Data":"a64ac48a8e61824ec9bc0c877ca0bb687a531591d29bb5b26b91c278f70bf48f"} Apr 21 07:54:07.356042 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:07.356008 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8afbf877-96ee-4486-8378-8972eb8c7ebd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:07.356813 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:07.356792 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8afbf877-96ee-4486-8378-8972eb8c7ebd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:07.363998 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:07.363977 2570 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:54:07.487489 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:07.487458 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 07:54:07.490553 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:54:07.490525 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8afbf877_96ee_4486_8378_8972eb8c7ebd.slice/crio-a656690e65b49520a8ac3a95885f14255e9c4e5e69b10f2fe3b2593e1ed91c55 WatchSource:0}: Error finding container a656690e65b49520a8ac3a95885f14255e9c4e5e69b10f2fe3b2593e1ed91c55: Status 404 returned error can't find the container with id a656690e65b49520a8ac3a95885f14255e9c4e5e69b10f2fe3b2593e1ed91c55 Apr 21 07:54:08.194408 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:08.194311 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hjljn" event={"ID":"812db1cf-487b-4a90-a0c8-a72c028f5c45","Type":"ContainerStarted","Data":"1260e6ea4c5e77aa235598f67d03dde174098b768f4368ef1a714f9d98c0bcc7"} Apr 21 07:54:08.194408 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:08.194353 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hjljn" event={"ID":"812db1cf-487b-4a90-a0c8-a72c028f5c45","Type":"ContainerStarted","Data":"098ab466ea49b1c1edc1572c37eef132acb64265b5cea2766ad51f484e61d00f"} Apr 21 07:54:08.195755 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:08.195722 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8afbf877-96ee-4486-8378-8972eb8c7ebd","Type":"ContainerStarted","Data":"a656690e65b49520a8ac3a95885f14255e9c4e5e69b10f2fe3b2593e1ed91c55"} Apr 21 07:54:08.213434 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:08.213386 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/node-exporter-hjljn" podStartSLOduration=2.175322156 podStartE2EDuration="3.21337275s" podCreationTimestamp="2026-04-21 07:54:05 +0000 UTC" firstStartedPulling="2026-04-21 07:54:05.996460993 +0000 UTC m=+145.834848320" lastFinishedPulling="2026-04-21 07:54:07.034511586 +0000 UTC m=+146.872898914" observedRunningTime="2026-04-21 07:54:08.211076896 +0000 UTC m=+148.049464236" watchObservedRunningTime="2026-04-21 07:54:08.21337275 +0000 UTC m=+148.051760098" Apr 21 07:54:09.200446 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:09.200410 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" event={"ID":"c7ea73b2-778a-49e2-ba0d-7d918ce7b31d","Type":"ContainerStarted","Data":"5f33cd720f9e59c9f00db6ef6056d1fd4221cfcd0ecadbadce636ab2944c4093"} Apr 21 07:54:09.201590 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:09.201568 2570 generic.go:358] "Generic (PLEG): container finished" podID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerID="074840e062f372612950ec8255a4d7c1a35426d545e0f33a37c0dce5d711bf2f" exitCode=0 Apr 21 07:54:09.201705 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:09.201655 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8afbf877-96ee-4486-8378-8972eb8c7ebd","Type":"ContainerDied","Data":"074840e062f372612950ec8255a4d7c1a35426d545e0f33a37c0dce5d711bf2f"} Apr 21 07:54:09.217458 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:09.217420 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-xhclh" podStartSLOduration=2.957085706 podStartE2EDuration="4.217406841s" podCreationTimestamp="2026-04-21 07:54:05 +0000 UTC" firstStartedPulling="2026-04-21 07:54:07.042712348 +0000 UTC m=+146.881099678" lastFinishedPulling="2026-04-21 07:54:08.303033474 +0000 UTC m=+148.141420813" observedRunningTime="2026-04-21 
07:54:09.216552736 +0000 UTC m=+149.054940084" watchObservedRunningTime="2026-04-21 07:54:09.217406841 +0000 UTC m=+149.055794236" Apr 21 07:54:10.052234 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.052194 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-967d7fc7f-6g42n"] Apr 21 07:54:10.055568 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.055537 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.057670 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.057642 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 21 07:54:10.057800 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.057708 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-m4hls\"" Apr 21 07:54:10.058648 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.058624 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 21 07:54:10.058728 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.058671 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 21 07:54:10.058946 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.058928 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-fundiblduhthh\"" Apr 21 07:54:10.059019 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.058935 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 21 07:54:10.064536 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.064516 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/metrics-server-967d7fc7f-6g42n"] Apr 21 07:54:10.078187 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.078166 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/947c3943-080e-4c53-8507-bd9465492df2-secret-metrics-server-tls\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.078280 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.078217 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/947c3943-080e-4c53-8507-bd9465492df2-secret-metrics-server-client-certs\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.078330 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.078280 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/947c3943-080e-4c53-8507-bd9465492df2-audit-log\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.082658 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.078670 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/947c3943-080e-4c53-8507-bd9465492df2-metrics-server-audit-profiles\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.082658 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.078781 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/947c3943-080e-4c53-8507-bd9465492df2-client-ca-bundle\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.082658 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.078883 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/947c3943-080e-4c53-8507-bd9465492df2-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.082658 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.078948 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szjvl\" (UniqueName: \"kubernetes.io/projected/947c3943-080e-4c53-8507-bd9465492df2-kube-api-access-szjvl\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.179942 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.179897 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/947c3943-080e-4c53-8507-bd9465492df2-secret-metrics-server-tls\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.180101 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.179970 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: 
\"kubernetes.io/secret/947c3943-080e-4c53-8507-bd9465492df2-secret-metrics-server-client-certs\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.180101 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.179998 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/947c3943-080e-4c53-8507-bd9465492df2-audit-log\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.180228 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.180145 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/947c3943-080e-4c53-8507-bd9465492df2-metrics-server-audit-profiles\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.180228 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.180197 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/947c3943-080e-4c53-8507-bd9465492df2-client-ca-bundle\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.180325 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.180242 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/947c3943-080e-4c53-8507-bd9465492df2-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" 
Apr 21 07:54:10.180325 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.180282 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szjvl\" (UniqueName: \"kubernetes.io/projected/947c3943-080e-4c53-8507-bd9465492df2-kube-api-access-szjvl\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.180446 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.180421 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/947c3943-080e-4c53-8507-bd9465492df2-audit-log\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.181134 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.181108 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/947c3943-080e-4c53-8507-bd9465492df2-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.181260 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.181218 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/947c3943-080e-4c53-8507-bd9465492df2-metrics-server-audit-profiles\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.182921 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.182897 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/947c3943-080e-4c53-8507-bd9465492df2-secret-metrics-server-tls\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.183154 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.183128 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/947c3943-080e-4c53-8507-bd9465492df2-client-ca-bundle\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.183154 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.183141 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/947c3943-080e-4c53-8507-bd9465492df2-secret-metrics-server-client-certs\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.190557 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.190531 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szjvl\" (UniqueName: \"kubernetes.io/projected/947c3943-080e-4c53-8507-bd9465492df2-kube-api-access-szjvl\") pod \"metrics-server-967d7fc7f-6g42n\" (UID: \"947c3943-080e-4c53-8507-bd9465492df2\") " pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.367607 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.367570 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:10.425784 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.425748 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-zlxkn"] Apr 21 07:54:10.430183 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.430162 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zlxkn" Apr 21 07:54:10.432617 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.432591 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 21 07:54:10.432617 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.432606 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-jc2ds\"" Apr 21 07:54:10.438792 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.438764 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-zlxkn"] Apr 21 07:54:10.484342 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.484302 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d7157023-0d3e-48a7-b859-da313a4a8bb8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-zlxkn\" (UID: \"d7157023-0d3e-48a7-b859-da313a4a8bb8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zlxkn" Apr 21 07:54:10.585432 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.585393 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d7157023-0d3e-48a7-b859-da313a4a8bb8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-zlxkn\" (UID: \"d7157023-0d3e-48a7-b859-da313a4a8bb8\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zlxkn" Apr 21 07:54:10.585599 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:54:10.585548 2570 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 21 07:54:10.585679 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:54:10.585627 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7157023-0d3e-48a7-b859-da313a4a8bb8-monitoring-plugin-cert podName:d7157023-0d3e-48a7-b859-da313a4a8bb8 nodeName:}" failed. No retries permitted until 2026-04-21 07:54:11.08560072 +0000 UTC m=+150.923988067 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/d7157023-0d3e-48a7-b859-da313a4a8bb8-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-zlxkn" (UID: "d7157023-0d3e-48a7-b859-da313a4a8bb8") : secret "monitoring-plugin-cert" not found Apr 21 07:54:10.771701 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.771668 2570 scope.go:117] "RemoveContainer" containerID="5a8ea2901e98d0c1f29aeba30dffa5cd77792b1f662c9a882f97814a5f6a89cf" Apr 21 07:54:10.772490 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:54:10.772137 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-kgsrp_openshift-console-operator(be0cbccd-e9f7-4b37-b3e3-e6ef1514b734)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" podUID="be0cbccd-e9f7-4b37-b3e3-e6ef1514b734" Apr 21 07:54:10.801249 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:10.801227 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-967d7fc7f-6g42n"] Apr 21 07:54:10.803506 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:54:10.803482 2570 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod947c3943_080e_4c53_8507_bd9465492df2.slice/crio-34e3626719ebb1bdc602d14b4e6357063a72358b2d84902464af1857dec34a44 WatchSource:0}: Error finding container 34e3626719ebb1bdc602d14b4e6357063a72358b2d84902464af1857dec34a44: Status 404 returned error can't find the container with id 34e3626719ebb1bdc602d14b4e6357063a72358b2d84902464af1857dec34a44 Apr 21 07:54:11.091011 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:11.090977 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d7157023-0d3e-48a7-b859-da313a4a8bb8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-zlxkn\" (UID: \"d7157023-0d3e-48a7-b859-da313a4a8bb8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zlxkn" Apr 21 07:54:11.093262 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:11.093239 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d7157023-0d3e-48a7-b859-da313a4a8bb8-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-zlxkn\" (UID: \"d7157023-0d3e-48a7-b859-da313a4a8bb8\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zlxkn" Apr 21 07:54:11.210128 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:11.210049 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" event={"ID":"947c3943-080e-4c53-8507-bd9465492df2","Type":"ContainerStarted","Data":"34e3626719ebb1bdc602d14b4e6357063a72358b2d84902464af1857dec34a44"} Apr 21 07:54:11.212392 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:11.212370 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8afbf877-96ee-4486-8378-8972eb8c7ebd","Type":"ContainerStarted","Data":"8aaefcdda76482b91d34372ff4387e09cd87e3ee2690ba819e3266321cec22ec"} Apr 21 
07:54:11.212489 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:11.212398 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8afbf877-96ee-4486-8378-8972eb8c7ebd","Type":"ContainerStarted","Data":"b224a39c766b43852b524d197d32e5e5b4583396ace359c4299a82850b811ae2"} Apr 21 07:54:11.212489 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:11.212407 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8afbf877-96ee-4486-8378-8972eb8c7ebd","Type":"ContainerStarted","Data":"aa7ae5fe780bd89b1c84f207589162352028f380c1aec60dc5235c461b3cf5b1"} Apr 21 07:54:11.212489 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:11.212415 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8afbf877-96ee-4486-8378-8972eb8c7ebd","Type":"ContainerStarted","Data":"c6e7b32e2a7f6615a0a1a8c39d89c1ef7ae14422c2be8e193a9681f7338834e3"} Apr 21 07:54:11.212489 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:11.212424 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8afbf877-96ee-4486-8378-8972eb8c7ebd","Type":"ContainerStarted","Data":"ca9339aa392673ae12afc0b05cd0687a90f349d56e8fbcdca20f4bbbbce4b4f5"} Apr 21 07:54:11.341752 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:11.341719 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zlxkn" Apr 21 07:54:11.474998 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:11.474921 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-zlxkn"] Apr 21 07:54:11.605929 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:54:11.605886 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7157023_0d3e_48a7_b859_da313a4a8bb8.slice/crio-a959729b008400867aecae0dd2c2a047ec6eef427b79d0299f527e910d50a2a5 WatchSource:0}: Error finding container a959729b008400867aecae0dd2c2a047ec6eef427b79d0299f527e910d50a2a5: Status 404 returned error can't find the container with id a959729b008400867aecae0dd2c2a047ec6eef427b79d0299f527e910d50a2a5 Apr 21 07:54:12.221568 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:12.221527 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8afbf877-96ee-4486-8378-8972eb8c7ebd","Type":"ContainerStarted","Data":"7d3d76783df71f32032152e651ebfb879469a0797fdfb5601c4e0630f84eb2a2"} Apr 21 07:54:12.222760 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:12.222727 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zlxkn" event={"ID":"d7157023-0d3e-48a7-b859-da313a4a8bb8","Type":"ContainerStarted","Data":"a959729b008400867aecae0dd2c2a047ec6eef427b79d0299f527e910d50a2a5"} Apr 21 07:54:12.248693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:12.248351 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.089607874 podStartE2EDuration="6.248335394s" podCreationTimestamp="2026-04-21 07:54:06 +0000 UTC" firstStartedPulling="2026-04-21 07:54:07.492376603 +0000 UTC m=+147.330763930" lastFinishedPulling="2026-04-21 07:54:11.651104109 +0000 UTC 
m=+151.489491450" observedRunningTime="2026-04-21 07:54:12.247996658 +0000 UTC m=+152.086384074" watchObservedRunningTime="2026-04-21 07:54:12.248335394 +0000 UTC m=+152.086722748" Apr 21 07:54:13.227102 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:13.227052 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" event={"ID":"947c3943-080e-4c53-8507-bd9465492df2","Type":"ContainerStarted","Data":"dad64ba996295e2c4cdab5a471ad2e6420df747f6d036f20c23e97dd910765bb"} Apr 21 07:54:13.243697 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:13.243575 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" podStartSLOduration=1.729987994 podStartE2EDuration="3.243554942s" podCreationTimestamp="2026-04-21 07:54:10 +0000 UTC" firstStartedPulling="2026-04-21 07:54:10.805328041 +0000 UTC m=+150.643715368" lastFinishedPulling="2026-04-21 07:54:12.318894986 +0000 UTC m=+152.157282316" observedRunningTime="2026-04-21 07:54:13.243459023 +0000 UTC m=+153.081846374" watchObservedRunningTime="2026-04-21 07:54:13.243554942 +0000 UTC m=+153.081942294" Apr 21 07:54:14.233297 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:14.233263 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zlxkn" event={"ID":"d7157023-0d3e-48a7-b859-da313a4a8bb8","Type":"ContainerStarted","Data":"ae38d26a9d57cf991ff0bd35dd264dc94fc20ae7e0694c1da757b567270246f1"} Apr 21 07:54:14.248345 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:14.248294 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zlxkn" podStartSLOduration=2.274143001 podStartE2EDuration="4.2482814s" podCreationTimestamp="2026-04-21 07:54:10 +0000 UTC" firstStartedPulling="2026-04-21 07:54:11.608533347 +0000 UTC m=+151.446920679" lastFinishedPulling="2026-04-21 07:54:13.582671751 
+0000 UTC m=+153.421059078" observedRunningTime="2026-04-21 07:54:14.246847952 +0000 UTC m=+154.085235303" watchObservedRunningTime="2026-04-21 07:54:14.2482814 +0000 UTC m=+154.086668790" Apr 21 07:54:15.236433 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:15.236401 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zlxkn" Apr 21 07:54:15.241125 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:15.241102 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zlxkn" Apr 21 07:54:17.090340 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:54:17.090292 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-7rk6g" podUID="6329b105-bf72-40c1-ab25-2ba6f2aea17c" Apr 21 07:54:17.114618 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:54:17.114573 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-pfqbt" podUID="1bcd6517-d770-467f-8536-8c7f4cdc772e" Apr 21 07:54:17.241793 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:17.241769 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7rk6g" Apr 21 07:54:17.780740 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:54:17.780697 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-bbxng" podUID="62e778a8-8270-4560-9d0d-41a95a3c9c5f" Apr 21 07:54:18.194495 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:18.194463 2570 patch_prober.go:28] interesting pod/image-registry-6459cccc96-5nt4w container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 07:54:18.194832 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:18.194514 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6459cccc96-5nt4w" podUID="3d259e3f-23a3-4b2f-8d21-5ddf47a2d960" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 07:54:20.169746 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:20.169708 2570 patch_prober.go:28] interesting pod/image-registry-6459cccc96-5nt4w container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 07:54:20.170181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:20.169764 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6459cccc96-5nt4w" podUID="3d259e3f-23a3-4b2f-8d21-5ddf47a2d960" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 07:54:21.982467 ip-10-0-137-194 
kubenswrapper[2570]: I0421 07:54:21.982386 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g" Apr 21 07:54:21.982467 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:21.982437 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert\") pod \"ingress-canary-pfqbt\" (UID: \"1bcd6517-d770-467f-8536-8c7f4cdc772e\") " pod="openshift-ingress-canary/ingress-canary-pfqbt" Apr 21 07:54:21.984689 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:21.984669 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6329b105-bf72-40c1-ab25-2ba6f2aea17c-metrics-tls\") pod \"dns-default-7rk6g\" (UID: \"6329b105-bf72-40c1-ab25-2ba6f2aea17c\") " pod="openshift-dns/dns-default-7rk6g" Apr 21 07:54:21.984825 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:21.984803 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1bcd6517-d770-467f-8536-8c7f4cdc772e-cert\") pod \"ingress-canary-pfqbt\" (UID: \"1bcd6517-d770-467f-8536-8c7f4cdc772e\") " pod="openshift-ingress-canary/ingress-canary-pfqbt" Apr 21 07:54:22.044526 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:22.044497 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4xjgf\"" Apr 21 07:54:22.053307 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:22.053286 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7rk6g" Apr 21 07:54:22.169670 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:22.169628 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7rk6g"] Apr 21 07:54:22.172534 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:54:22.172502 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6329b105_bf72_40c1_ab25_2ba6f2aea17c.slice/crio-f95b045842904daac45f70ee3b04f41950abb63badb81b5a9985c2ff8da5fb16 WatchSource:0}: Error finding container f95b045842904daac45f70ee3b04f41950abb63badb81b5a9985c2ff8da5fb16: Status 404 returned error can't find the container with id f95b045842904daac45f70ee3b04f41950abb63badb81b5a9985c2ff8da5fb16 Apr 21 07:54:22.256405 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:22.256327 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7rk6g" event={"ID":"6329b105-bf72-40c1-ab25-2ba6f2aea17c","Type":"ContainerStarted","Data":"f95b045842904daac45f70ee3b04f41950abb63badb81b5a9985c2ff8da5fb16"} Apr 21 07:54:24.263531 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:24.263498 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7rk6g" event={"ID":"6329b105-bf72-40c1-ab25-2ba6f2aea17c","Type":"ContainerStarted","Data":"81b0b8237854979c1c134e31cda47a12fe7ff9392f3bb9fb6befdb76e09a2de9"} Apr 21 07:54:24.263531 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:24.263533 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7rk6g" event={"ID":"6329b105-bf72-40c1-ab25-2ba6f2aea17c","Type":"ContainerStarted","Data":"24934605537ca9b38f6e786732a054f012030e72fcf7bfcfe1f619dfcdadf8bf"} Apr 21 07:54:24.263968 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:24.263629 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-7rk6g" Apr 21 07:54:24.282066 
ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:24.282023 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7rk6g" podStartSLOduration=129.039581514 podStartE2EDuration="2m10.282012782s" podCreationTimestamp="2026-04-21 07:52:14 +0000 UTC" firstStartedPulling="2026-04-21 07:54:22.174408014 +0000 UTC m=+162.012795341" lastFinishedPulling="2026-04-21 07:54:23.416839268 +0000 UTC m=+163.255226609" observedRunningTime="2026-04-21 07:54:24.281507154 +0000 UTC m=+164.119894502" watchObservedRunningTime="2026-04-21 07:54:24.282012782 +0000 UTC m=+164.120400130" Apr 21 07:54:25.769820 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:25.769793 2570 scope.go:117] "RemoveContainer" containerID="5a8ea2901e98d0c1f29aeba30dffa5cd77792b1f662c9a882f97814a5f6a89cf" Apr 21 07:54:26.271132 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:26.271103 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kgsrp_be0cbccd-e9f7-4b37-b3e3-e6ef1514b734/console-operator/2.log" Apr 21 07:54:26.271308 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:26.271215 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" event={"ID":"be0cbccd-e9f7-4b37-b3e3-e6ef1514b734","Type":"ContainerStarted","Data":"ec3b6b50f5ba347fba9023ba4f3bbfd7e4deb298f693fbf9291ac969ddd8ad42"} Apr 21 07:54:26.271504 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:26.271477 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 07:54:26.287177 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:26.287126 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" podStartSLOduration=55.265257667 podStartE2EDuration="57.287110952s" podCreationTimestamp="2026-04-21 07:53:29 +0000 
UTC" firstStartedPulling="2026-04-21 07:53:30.085916726 +0000 UTC m=+109.924304052" lastFinishedPulling="2026-04-21 07:53:32.107770006 +0000 UTC m=+111.946157337" observedRunningTime="2026-04-21 07:54:26.28707218 +0000 UTC m=+166.125459554" watchObservedRunningTime="2026-04-21 07:54:26.287110952 +0000 UTC m=+166.125498301" Apr 21 07:54:26.304416 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:26.304393 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-kgsrp" Apr 21 07:54:26.478431 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:26.478401 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-8tdj4"] Apr 21 07:54:26.481786 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:26.481751 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-8tdj4" Apr 21 07:54:26.484114 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:26.484086 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 21 07:54:26.484226 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:26.484086 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 21 07:54:26.484226 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:26.484135 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-rl7wm\"" Apr 21 07:54:26.491797 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:26.491762 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-8tdj4"] Apr 21 07:54:26.624122 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:26.624087 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx58c\" (UniqueName: 
\"kubernetes.io/projected/d42b0066-31da-406b-a83f-01a4f0ced05a-kube-api-access-tx58c\") pod \"downloads-6bcc868b7-8tdj4\" (UID: \"d42b0066-31da-406b-a83f-01a4f0ced05a\") " pod="openshift-console/downloads-6bcc868b7-8tdj4" Apr 21 07:54:26.725594 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:26.725558 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx58c\" (UniqueName: \"kubernetes.io/projected/d42b0066-31da-406b-a83f-01a4f0ced05a-kube-api-access-tx58c\") pod \"downloads-6bcc868b7-8tdj4\" (UID: \"d42b0066-31da-406b-a83f-01a4f0ced05a\") " pod="openshift-console/downloads-6bcc868b7-8tdj4" Apr 21 07:54:26.736175 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:26.736144 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx58c\" (UniqueName: \"kubernetes.io/projected/d42b0066-31da-406b-a83f-01a4f0ced05a-kube-api-access-tx58c\") pod \"downloads-6bcc868b7-8tdj4\" (UID: \"d42b0066-31da-406b-a83f-01a4f0ced05a\") " pod="openshift-console/downloads-6bcc868b7-8tdj4" Apr 21 07:54:26.796706 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:26.796658 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-8tdj4" Apr 21 07:54:26.912945 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:26.912816 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-8tdj4"] Apr 21 07:54:26.915806 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:54:26.915772 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd42b0066_31da_406b_a83f_01a4f0ced05a.slice/crio-183372c8570bf9b618b06e53d837011b60078bc4f10af99d944a2aecc2e658d0 WatchSource:0}: Error finding container 183372c8570bf9b618b06e53d837011b60078bc4f10af99d944a2aecc2e658d0: Status 404 returned error can't find the container with id 183372c8570bf9b618b06e53d837011b60078bc4f10af99d944a2aecc2e658d0 Apr 21 07:54:27.274942 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:27.274820 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-8tdj4" event={"ID":"d42b0066-31da-406b-a83f-01a4f0ced05a","Type":"ContainerStarted","Data":"183372c8570bf9b618b06e53d837011b60078bc4f10af99d944a2aecc2e658d0"} Apr 21 07:54:27.770178 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:27.770146 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pfqbt" Apr 21 07:54:27.772484 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:27.772454 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pcpsl\"" Apr 21 07:54:27.781131 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:27.781111 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-pfqbt" Apr 21 07:54:27.933925 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:27.933713 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-pfqbt"] Apr 21 07:54:27.939012 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:54:27.938734 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bcd6517_d770_467f_8536_8c7f4cdc772e.slice/crio-2d9ac5d28688dcf51fbc9fb8f7811cbd1fb50da7afc8972f31a62e53ba588846 WatchSource:0}: Error finding container 2d9ac5d28688dcf51fbc9fb8f7811cbd1fb50da7afc8972f31a62e53ba588846: Status 404 returned error can't find the container with id 2d9ac5d28688dcf51fbc9fb8f7811cbd1fb50da7afc8972f31a62e53ba588846 Apr 21 07:54:28.195468 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:28.195434 2570 patch_prober.go:28] interesting pod/image-registry-6459cccc96-5nt4w container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 21 07:54:28.195652 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:28.195494 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6459cccc96-5nt4w" podUID="3d259e3f-23a3-4b2f-8d21-5ddf47a2d960" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 07:54:28.279014 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:28.278978 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pfqbt" event={"ID":"1bcd6517-d770-467f-8536-8c7f4cdc772e","Type":"ContainerStarted","Data":"2d9ac5d28688dcf51fbc9fb8f7811cbd1fb50da7afc8972f31a62e53ba588846"} Apr 21 07:54:30.170633 ip-10-0-137-194 kubenswrapper[2570]: 
I0421 07:54:30.170602 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6459cccc96-5nt4w" Apr 21 07:54:30.286325 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:30.286289 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-pfqbt" event={"ID":"1bcd6517-d770-467f-8536-8c7f4cdc772e","Type":"ContainerStarted","Data":"6953b75fd9caab219dc1632dbabee73aaac9ed7acdaefec643b4bb12e1ff0819"} Apr 21 07:54:30.300804 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:30.300756 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-pfqbt" podStartSLOduration=134.540432359 podStartE2EDuration="2m16.300742909s" podCreationTimestamp="2026-04-21 07:52:14 +0000 UTC" firstStartedPulling="2026-04-21 07:54:27.941845418 +0000 UTC m=+167.780232751" lastFinishedPulling="2026-04-21 07:54:29.702155955 +0000 UTC m=+169.540543301" observedRunningTime="2026-04-21 07:54:30.300465966 +0000 UTC m=+170.138853314" watchObservedRunningTime="2026-04-21 07:54:30.300742909 +0000 UTC m=+170.139130257" Apr 21 07:54:30.368583 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:30.368550 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:30.368734 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:30.368620 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n" Apr 21 07:54:31.769896 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:31.769838 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbxng" Apr 21 07:54:34.269577 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:34.269544 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7rk6g" Apr 21 07:54:35.807692 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:35.807660 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66774759c9-86fzv"] Apr 21 07:54:35.814731 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:35.814707 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:35.817339 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:35.817311 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 21 07:54:35.817487 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:35.817428 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 21 07:54:35.817563 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:35.817527 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-htmn6\"" Apr 21 07:54:35.818238 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:35.818218 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 21 07:54:35.818418 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:35.818399 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 21 07:54:35.818530 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:35.818473 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 21 07:54:35.823118 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:35.822985 2570 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66774759c9-86fzv"] Apr 21 07:54:35.914834 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:35.914737 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f9ce75b-eb86-4861-bb34-14be36dc43b1-service-ca\") pod \"console-66774759c9-86fzv\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") " pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:35.915018 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:35.914926 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f9ce75b-eb86-4861-bb34-14be36dc43b1-console-config\") pod \"console-66774759c9-86fzv\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") " pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:35.915088 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:35.915042 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f9ce75b-eb86-4861-bb34-14be36dc43b1-console-oauth-config\") pod \"console-66774759c9-86fzv\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") " pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:35.915210 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:35.915184 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f9ce75b-eb86-4861-bb34-14be36dc43b1-oauth-serving-cert\") pod \"console-66774759c9-86fzv\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") " pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:35.915295 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:35.915257 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f9ce75b-eb86-4861-bb34-14be36dc43b1-console-serving-cert\") pod \"console-66774759c9-86fzv\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") " pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:35.915347 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:35.915318 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxth7\" (UniqueName: \"kubernetes.io/projected/5f9ce75b-eb86-4861-bb34-14be36dc43b1-kube-api-access-xxth7\") pod \"console-66774759c9-86fzv\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") " pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:36.016195 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:36.016160 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxth7\" (UniqueName: \"kubernetes.io/projected/5f9ce75b-eb86-4861-bb34-14be36dc43b1-kube-api-access-xxth7\") pod \"console-66774759c9-86fzv\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") " pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:36.016195 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:36.016204 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f9ce75b-eb86-4861-bb34-14be36dc43b1-service-ca\") pod \"console-66774759c9-86fzv\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") " pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:36.016377 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:36.016228 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f9ce75b-eb86-4861-bb34-14be36dc43b1-console-config\") pod \"console-66774759c9-86fzv\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") " pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:36.016377 ip-10-0-137-194 kubenswrapper[2570]: I0421 
07:54:36.016276 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f9ce75b-eb86-4861-bb34-14be36dc43b1-console-oauth-config\") pod \"console-66774759c9-86fzv\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") " pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:36.016377 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:36.016313 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f9ce75b-eb86-4861-bb34-14be36dc43b1-oauth-serving-cert\") pod \"console-66774759c9-86fzv\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") " pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:36.016377 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:36.016359 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f9ce75b-eb86-4861-bb34-14be36dc43b1-console-serving-cert\") pod \"console-66774759c9-86fzv\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") " pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:36.017022 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:36.016994 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f9ce75b-eb86-4861-bb34-14be36dc43b1-service-ca\") pod \"console-66774759c9-86fzv\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") " pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:36.017135 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:36.017023 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f9ce75b-eb86-4861-bb34-14be36dc43b1-console-config\") pod \"console-66774759c9-86fzv\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") " pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:36.017135 
ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:36.017103 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f9ce75b-eb86-4861-bb34-14be36dc43b1-oauth-serving-cert\") pod \"console-66774759c9-86fzv\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") " pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:36.018701 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:36.018685 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f9ce75b-eb86-4861-bb34-14be36dc43b1-console-serving-cert\") pod \"console-66774759c9-86fzv\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") " pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:36.018749 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:36.018701 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f9ce75b-eb86-4861-bb34-14be36dc43b1-console-oauth-config\") pod \"console-66774759c9-86fzv\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") " pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:36.025064 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:36.025044 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxth7\" (UniqueName: \"kubernetes.io/projected/5f9ce75b-eb86-4861-bb34-14be36dc43b1-kube-api-access-xxth7\") pod \"console-66774759c9-86fzv\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") " pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:36.127026 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:36.126978 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-66774759c9-86fzv" Apr 21 07:54:36.260121 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:36.260093 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66774759c9-86fzv"] Apr 21 07:54:36.262526 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:54:36.262494 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f9ce75b_eb86_4861_bb34_14be36dc43b1.slice/crio-9fbec44ec346591c8cce0c5f843ef43e7a7196c7ea151ab57fdcc1c85e3c7991 WatchSource:0}: Error finding container 9fbec44ec346591c8cce0c5f843ef43e7a7196c7ea151ab57fdcc1c85e3c7991: Status 404 returned error can't find the container with id 9fbec44ec346591c8cce0c5f843ef43e7a7196c7ea151ab57fdcc1c85e3c7991 Apr 21 07:54:36.307258 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:36.307225 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66774759c9-86fzv" event={"ID":"5f9ce75b-eb86-4861-bb34-14be36dc43b1","Type":"ContainerStarted","Data":"9fbec44ec346591c8cce0c5f843ef43e7a7196c7ea151ab57fdcc1c85e3c7991"} Apr 21 07:54:39.792944 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:39.792908 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-578766656d-jdz8k"] Apr 21 07:54:39.799419 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:39.799398 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-578766656d-jdz8k" Apr 21 07:54:39.804444 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:39.804418 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-578766656d-jdz8k"] Apr 21 07:54:39.808389 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:39.808360 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 21 07:54:39.953398 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:39.953356 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-console-serving-cert\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k" Apr 21 07:54:39.953572 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:39.953434 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-oauth-serving-cert\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k" Apr 21 07:54:39.953572 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:39.953454 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-console-oauth-config\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k" Apr 21 07:54:39.953572 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:39.953492 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-console-config\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k" Apr 21 07:54:39.953572 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:39.953543 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nmrw\" (UniqueName: \"kubernetes.io/projected/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-kube-api-access-5nmrw\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k" Apr 21 07:54:39.953746 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:39.953580 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-service-ca\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k" Apr 21 07:54:39.953746 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:39.953595 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-trusted-ca-bundle\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k" Apr 21 07:54:40.055089 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:40.054979 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-oauth-serving-cert\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k" Apr 21 07:54:40.055089 ip-10-0-137-194 kubenswrapper[2570]: I0421 
07:54:40.055030 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-console-oauth-config\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k"
Apr 21 07:54:40.055089 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:40.055075 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-console-config\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k"
Apr 21 07:54:40.055350 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:40.055095 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nmrw\" (UniqueName: \"kubernetes.io/projected/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-kube-api-access-5nmrw\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k"
Apr 21 07:54:40.055350 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:40.055129 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-service-ca\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k"
Apr 21 07:54:40.055350 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:40.055251 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-trusted-ca-bundle\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k"
Apr 21 07:54:40.055509 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:40.055354 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-console-serving-cert\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k"
Apr 21 07:54:40.056540 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:40.056372 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-console-config\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k"
Apr 21 07:54:40.056749 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:40.056712 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-service-ca\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k"
Apr 21 07:54:40.056890 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:40.056833 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-trusted-ca-bundle\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k"
Apr 21 07:54:40.057177 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:40.057080 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-oauth-serving-cert\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k"
Apr 21 07:54:40.062232 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:40.058017 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-console-oauth-config\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k"
Apr 21 07:54:40.062232 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:40.059675 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-console-serving-cert\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k"
Apr 21 07:54:40.064435 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:40.064409 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nmrw\" (UniqueName: \"kubernetes.io/projected/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-kube-api-access-5nmrw\") pod \"console-578766656d-jdz8k\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " pod="openshift-console/console-578766656d-jdz8k"
Apr 21 07:54:40.112501 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:40.112460 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-578766656d-jdz8k"
Apr 21 07:54:44.107431 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:44.107401 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8afbf877-96ee-4486-8378-8972eb8c7ebd/init-config-reloader/0.log"
Apr 21 07:54:44.307716 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:44.307672 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8afbf877-96ee-4486-8378-8972eb8c7ebd/alertmanager/0.log"
Apr 21 07:54:44.507358 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:44.507279 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8afbf877-96ee-4486-8378-8972eb8c7ebd/config-reloader/0.log"
Apr 21 07:54:44.707565 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:44.707537 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8afbf877-96ee-4486-8378-8972eb8c7ebd/kube-rbac-proxy-web/0.log"
Apr 21 07:54:44.907080 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:44.907054 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8afbf877-96ee-4486-8378-8972eb8c7ebd/kube-rbac-proxy/0.log"
Apr 21 07:54:45.107181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:45.107152 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8afbf877-96ee-4486-8378-8972eb8c7ebd/kube-rbac-proxy-metric/0.log"
Apr 21 07:54:45.307693 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:45.307602 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_8afbf877-96ee-4486-8378-8972eb8c7ebd/prom-label-proxy/0.log"
Apr 21 07:54:45.861739 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:45.861708 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-578766656d-jdz8k"]
Apr 21 07:54:45.864473 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:54:45.864448 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdf2acca_2cd9_4648_aa3f_eefd7bf76627.slice/crio-2ce63f8e679bbe3e1f6b8d4eba4931504a68f951fec6fbe96d9ea6dd733ad979 WatchSource:0}: Error finding container 2ce63f8e679bbe3e1f6b8d4eba4931504a68f951fec6fbe96d9ea6dd733ad979: Status 404 returned error can't find the container with id 2ce63f8e679bbe3e1f6b8d4eba4931504a68f951fec6fbe96d9ea6dd733ad979
Apr 21 07:54:46.307120 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:46.307090 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-967d7fc7f-6g42n_947c3943-080e-4c53-8507-bd9465492df2/metrics-server/0.log"
Apr 21 07:54:46.340836 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:46.340797 2570 generic.go:358] "Generic (PLEG): container finished" podID="0e1f90ef-6932-43c1-bb6a-d25451d794e8" containerID="ef362d15a15e481d2aafbfb939e7623717584dbe404e28ec5a40cb4979bce378" exitCode=0
Apr 21 07:54:46.341309 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:46.340898 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mdt7b" event={"ID":"0e1f90ef-6932-43c1-bb6a-d25451d794e8","Type":"ContainerDied","Data":"ef362d15a15e481d2aafbfb939e7623717584dbe404e28ec5a40cb4979bce378"}
Apr 21 07:54:46.341309 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:46.341272 2570 scope.go:117] "RemoveContainer" containerID="ef362d15a15e481d2aafbfb939e7623717584dbe404e28ec5a40cb4979bce378"
Apr 21 07:54:46.342674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:46.342646 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-578766656d-jdz8k" event={"ID":"cdf2acca-2cd9-4648-aa3f-eefd7bf76627","Type":"ContainerStarted","Data":"83bd1cabf01585cd329340189e794a1ea223e7abeed040b342022ca7bce08e1e"}
Apr 21 07:54:46.342781 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:46.342680 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-578766656d-jdz8k" event={"ID":"cdf2acca-2cd9-4648-aa3f-eefd7bf76627","Type":"ContainerStarted","Data":"2ce63f8e679bbe3e1f6b8d4eba4931504a68f951fec6fbe96d9ea6dd733ad979"}
Apr 21 07:54:46.344383 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:46.344277 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-8tdj4" event={"ID":"d42b0066-31da-406b-a83f-01a4f0ced05a","Type":"ContainerStarted","Data":"04d40f9b9dbed9f5bd3ae2b124f489d90e26ed44bfc5b8734633399e5dc3caec"}
Apr 21 07:54:46.344447 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:46.344409 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-8tdj4"
Apr 21 07:54:46.345857 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:46.345827 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66774759c9-86fzv" event={"ID":"5f9ce75b-eb86-4861-bb34-14be36dc43b1","Type":"ContainerStarted","Data":"8efca234824dea57cafda23d6853a0d512b4ae5d940e851666e58267f4c385d3"}
Apr 21 07:54:46.354780 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:46.354729 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-8tdj4"
Apr 21 07:54:46.377684 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:46.377608 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66774759c9-86fzv" podStartSLOduration=1.893264362 podStartE2EDuration="11.377588953s" podCreationTimestamp="2026-04-21 07:54:35 +0000 UTC" firstStartedPulling="2026-04-21 07:54:36.264842507 +0000 UTC m=+176.103229850" lastFinishedPulling="2026-04-21 07:54:45.749167099 +0000 UTC m=+185.587554441" observedRunningTime="2026-04-21 07:54:46.376397259 +0000 UTC m=+186.214784608" watchObservedRunningTime="2026-04-21 07:54:46.377588953 +0000 UTC m=+186.215976304"
Apr 21 07:54:46.405073 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:46.404536 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-8tdj4" podStartSLOduration=1.534536401 podStartE2EDuration="20.404516755s" podCreationTimestamp="2026-04-21 07:54:26 +0000 UTC" firstStartedPulling="2026-04-21 07:54:26.917662346 +0000 UTC m=+166.756049679" lastFinishedPulling="2026-04-21 07:54:45.787642704 +0000 UTC m=+185.626030033" observedRunningTime="2026-04-21 07:54:46.403044377 +0000 UTC m=+186.241431740" watchObservedRunningTime="2026-04-21 07:54:46.404516755 +0000 UTC m=+186.242904105"
Apr 21 07:54:46.421877 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:46.421779 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-578766656d-jdz8k" podStartSLOduration=7.4217645789999995 podStartE2EDuration="7.421764579s" podCreationTimestamp="2026-04-21 07:54:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:54:46.4197814 +0000 UTC m=+186.258168750" watchObservedRunningTime="2026-04-21 07:54:46.421764579 +0000 UTC m=+186.260151927"
Apr 21 07:54:46.507415 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:46.507383 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-zlxkn_d7157023-0d3e-48a7-b859-da313a4a8bb8/monitoring-plugin/0.log"
Apr 21 07:54:47.307023 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:47.306992 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hjljn_812db1cf-487b-4a90-a0c8-a72c028f5c45/init-textfile/0.log"
Apr 21 07:54:47.351584 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:47.351543 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mdt7b" event={"ID":"0e1f90ef-6932-43c1-bb6a-d25451d794e8","Type":"ContainerStarted","Data":"2b1370e73df8047b50503e56dcf6056c8e116c10ba668ca821bc77206a871774"}
Apr 21 07:54:47.507832 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:47.507792 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hjljn_812db1cf-487b-4a90-a0c8-a72c028f5c45/node-exporter/0.log"
Apr 21 07:54:47.707191 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:47.707157 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hjljn_812db1cf-487b-4a90-a0c8-a72c028f5c45/kube-rbac-proxy/0.log"
Apr 21 07:54:48.507548 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:48.507514 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-xhclh_c7ea73b2-778a-49e2-ba0d-7d918ce7b31d/kube-rbac-proxy-main/0.log"
Apr 21 07:54:48.711291 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:48.711247 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-xhclh_c7ea73b2-778a-49e2-ba0d-7d918ce7b31d/kube-rbac-proxy-self/0.log"
Apr 21 07:54:48.907239 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:48.907195 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-xhclh_c7ea73b2-778a-49e2-ba0d-7d918ce7b31d/openshift-state-metrics/0.log"
Apr 21 07:54:50.112759 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:50.112718 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-578766656d-jdz8k"
Apr 21 07:54:50.113239 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:50.112775 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-578766656d-jdz8k"
Apr 21 07:54:50.118270 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:50.118246 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-578766656d-jdz8k"
Apr 21 07:54:50.365726 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:50.365642 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-578766656d-jdz8k"
Apr 21 07:54:50.374063 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:50.374038 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n"
Apr 21 07:54:50.378245 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:50.378223 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-967d7fc7f-6g42n"
Apr 21 07:54:50.413958 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:50.413920 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66774759c9-86fzv"]
Apr 21 07:54:52.906502 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:52.906469 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-xj2ss_f8912392-4510-4052-97fe-c1f6926ae955/networking-console-plugin/0.log"
Apr 21 07:54:53.107255 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:53.107212 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kgsrp_be0cbccd-e9f7-4b37-b3e3-e6ef1514b734/console-operator/2.log"
Apr 21 07:54:53.309482 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:53.309397 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kgsrp_be0cbccd-e9f7-4b37-b3e3-e6ef1514b734/console-operator/3.log"
Apr 21 07:54:53.506813 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:53.506777 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-578766656d-jdz8k_cdf2acca-2cd9-4648-aa3f-eefd7bf76627/console/0.log"
Apr 21 07:54:53.707368 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:53.707340 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66774759c9-86fzv_5f9ce75b-eb86-4861-bb34-14be36dc43b1/console/0.log"
Apr 21 07:54:53.908425 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:53.908397 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-8tdj4_d42b0066-31da-406b-a83f-01a4f0ced05a/download-server/0.log"
Apr 21 07:54:54.107205 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:54.107180 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7rk6g_6329b105-bf72-40c1-ab25-2ba6f2aea17c/dns/0.log"
Apr 21 07:54:54.306667 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:54.306637 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7rk6g_6329b105-bf72-40c1-ab25-2ba6f2aea17c/kube-rbac-proxy/0.log"
Apr 21 07:54:55.506723 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:55.506697 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hc6tl_28381821-cb18-40b6-a25f-b3a80e24f27a/dns-node-resolver/0.log"
Apr 21 07:54:55.907798 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:55.907768 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-d5f96b85b-ghcsf_9f41707d-3f57-4540-b483-7dd01a7d4ef0/router/0.log"
Apr 21 07:54:56.127582 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:56.127550 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66774759c9-86fzv"
Apr 21 07:54:56.506018 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:54:56.505990 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-pfqbt_1bcd6517-d770-467f-8536-8c7f4cdc772e/serve-healthcheck-canary/0.log"
Apr 21 07:55:15.439823 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.439754 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-66774759c9-86fzv" podUID="5f9ce75b-eb86-4861-bb34-14be36dc43b1" containerName="console" containerID="cri-o://8efca234824dea57cafda23d6853a0d512b4ae5d940e851666e58267f4c385d3" gracePeriod=15
Apr 21 07:55:15.705182 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.705162 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66774759c9-86fzv_5f9ce75b-eb86-4861-bb34-14be36dc43b1/console/0.log"
Apr 21 07:55:15.705289 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.705221 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66774759c9-86fzv"
Apr 21 07:55:15.781052 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.781023 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxth7\" (UniqueName: \"kubernetes.io/projected/5f9ce75b-eb86-4861-bb34-14be36dc43b1-kube-api-access-xxth7\") pod \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") "
Apr 21 07:55:15.781212 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.781068 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f9ce75b-eb86-4861-bb34-14be36dc43b1-service-ca\") pod \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") "
Apr 21 07:55:15.781212 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.781095 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f9ce75b-eb86-4861-bb34-14be36dc43b1-oauth-serving-cert\") pod \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") "
Apr 21 07:55:15.781212 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.781142 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f9ce75b-eb86-4861-bb34-14be36dc43b1-console-oauth-config\") pod \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") "
Apr 21 07:55:15.781212 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.781181 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f9ce75b-eb86-4861-bb34-14be36dc43b1-console-config\") pod \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") "
Apr 21 07:55:15.781426 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.781223 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f9ce75b-eb86-4861-bb34-14be36dc43b1-console-serving-cert\") pod \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\" (UID: \"5f9ce75b-eb86-4861-bb34-14be36dc43b1\") "
Apr 21 07:55:15.781481 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.781460 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f9ce75b-eb86-4861-bb34-14be36dc43b1-service-ca" (OuterVolumeSpecName: "service-ca") pod "5f9ce75b-eb86-4861-bb34-14be36dc43b1" (UID: "5f9ce75b-eb86-4861-bb34-14be36dc43b1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:55:15.781564 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.781536 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f9ce75b-eb86-4861-bb34-14be36dc43b1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5f9ce75b-eb86-4861-bb34-14be36dc43b1" (UID: "5f9ce75b-eb86-4861-bb34-14be36dc43b1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:55:15.781634 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.781607 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f9ce75b-eb86-4861-bb34-14be36dc43b1-console-config" (OuterVolumeSpecName: "console-config") pod "5f9ce75b-eb86-4861-bb34-14be36dc43b1" (UID: "5f9ce75b-eb86-4861-bb34-14be36dc43b1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 07:55:15.783381 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.783347 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9ce75b-eb86-4861-bb34-14be36dc43b1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5f9ce75b-eb86-4861-bb34-14be36dc43b1" (UID: "5f9ce75b-eb86-4861-bb34-14be36dc43b1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:55:15.783481 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.783385 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f9ce75b-eb86-4861-bb34-14be36dc43b1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5f9ce75b-eb86-4861-bb34-14be36dc43b1" (UID: "5f9ce75b-eb86-4861-bb34-14be36dc43b1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 07:55:15.783481 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.783405 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f9ce75b-eb86-4861-bb34-14be36dc43b1-kube-api-access-xxth7" (OuterVolumeSpecName: "kube-api-access-xxth7") pod "5f9ce75b-eb86-4861-bb34-14be36dc43b1" (UID: "5f9ce75b-eb86-4861-bb34-14be36dc43b1"). InnerVolumeSpecName "kube-api-access-xxth7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 07:55:15.887235 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.883247 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5f9ce75b-eb86-4861-bb34-14be36dc43b1-console-serving-cert\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\""
Apr 21 07:55:15.887235 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.883287 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xxth7\" (UniqueName: \"kubernetes.io/projected/5f9ce75b-eb86-4861-bb34-14be36dc43b1-kube-api-access-xxth7\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\""
Apr 21 07:55:15.887235 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.883312 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5f9ce75b-eb86-4861-bb34-14be36dc43b1-service-ca\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\""
Apr 21 07:55:15.887235 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.883327 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5f9ce75b-eb86-4861-bb34-14be36dc43b1-oauth-serving-cert\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\""
Apr 21 07:55:15.887235 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.883343 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5f9ce75b-eb86-4861-bb34-14be36dc43b1-console-oauth-config\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\""
Apr 21 07:55:15.887235 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:15.883358 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5f9ce75b-eb86-4861-bb34-14be36dc43b1-console-config\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\""
Apr 21 07:55:16.441049 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:16.441022 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66774759c9-86fzv_5f9ce75b-eb86-4861-bb34-14be36dc43b1/console/0.log"
Apr 21 07:55:16.441428 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:16.441059 2570 generic.go:358] "Generic (PLEG): container finished" podID="5f9ce75b-eb86-4861-bb34-14be36dc43b1" containerID="8efca234824dea57cafda23d6853a0d512b4ae5d940e851666e58267f4c385d3" exitCode=2
Apr 21 07:55:16.441428 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:16.441154 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66774759c9-86fzv"
Apr 21 07:55:16.441428 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:16.441151 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66774759c9-86fzv" event={"ID":"5f9ce75b-eb86-4861-bb34-14be36dc43b1","Type":"ContainerDied","Data":"8efca234824dea57cafda23d6853a0d512b4ae5d940e851666e58267f4c385d3"}
Apr 21 07:55:16.441428 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:16.441196 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66774759c9-86fzv" event={"ID":"5f9ce75b-eb86-4861-bb34-14be36dc43b1","Type":"ContainerDied","Data":"9fbec44ec346591c8cce0c5f843ef43e7a7196c7ea151ab57fdcc1c85e3c7991"}
Apr 21 07:55:16.441428 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:16.441217 2570 scope.go:117] "RemoveContainer" containerID="8efca234824dea57cafda23d6853a0d512b4ae5d940e851666e58267f4c385d3"
Apr 21 07:55:16.450226 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:16.450204 2570 scope.go:117] "RemoveContainer" containerID="8efca234824dea57cafda23d6853a0d512b4ae5d940e851666e58267f4c385d3"
Apr 21 07:55:16.450472 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:55:16.450452 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8efca234824dea57cafda23d6853a0d512b4ae5d940e851666e58267f4c385d3\": container with ID starting with 8efca234824dea57cafda23d6853a0d512b4ae5d940e851666e58267f4c385d3 not found: ID does not exist" containerID="8efca234824dea57cafda23d6853a0d512b4ae5d940e851666e58267f4c385d3"
Apr 21 07:55:16.450533 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:16.450480 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8efca234824dea57cafda23d6853a0d512b4ae5d940e851666e58267f4c385d3"} err="failed to get container status \"8efca234824dea57cafda23d6853a0d512b4ae5d940e851666e58267f4c385d3\": rpc error: code = NotFound desc = could not find container \"8efca234824dea57cafda23d6853a0d512b4ae5d940e851666e58267f4c385d3\": container with ID starting with 8efca234824dea57cafda23d6853a0d512b4ae5d940e851666e58267f4c385d3 not found: ID does not exist"
Apr 21 07:55:16.461437 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:16.461413 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66774759c9-86fzv"]
Apr 21 07:55:16.464387 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:16.464361 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66774759c9-86fzv"]
Apr 21 07:55:16.773312 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:16.773234 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f9ce75b-eb86-4861-bb34-14be36dc43b1" path="/var/lib/kubelet/pods/5f9ce75b-eb86-4861-bb34-14be36dc43b1/volumes"
Apr 21 07:55:25.142036 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.141995 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6998c9ccdc-h68np"]
Apr 21 07:55:25.142540 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.142320 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f9ce75b-eb86-4861-bb34-14be36dc43b1" containerName="console"
Apr 21 07:55:25.142540 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.142332 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f9ce75b-eb86-4861-bb34-14be36dc43b1" containerName="console"
Apr 21 07:55:25.142540 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.142404 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f9ce75b-eb86-4861-bb34-14be36dc43b1" containerName="console"
Apr 21 07:55:25.167347 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.167320 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6998c9ccdc-h68np"]
Apr 21 07:55:25.167494 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.167436 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6998c9ccdc-h68np"
Apr 21 07:55:25.267569 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.267533 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3bb403e7-fa87-4be8-a2c4-12540f764fb7-oauth-serving-cert\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np"
Apr 21 07:55:25.267760 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.267590 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bb403e7-fa87-4be8-a2c4-12540f764fb7-trusted-ca-bundle\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np"
Apr 21 07:55:25.267760 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.267622 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3bb403e7-fa87-4be8-a2c4-12540f764fb7-console-serving-cert\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np"
Apr 21 07:55:25.267760 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.267677 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3bb403e7-fa87-4be8-a2c4-12540f764fb7-console-oauth-config\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np"
Apr 21 07:55:25.267760 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.267695 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4v4z\" (UniqueName: \"kubernetes.io/projected/3bb403e7-fa87-4be8-a2c4-12540f764fb7-kube-api-access-k4v4z\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np"
Apr 21 07:55:25.267760 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.267711 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3bb403e7-fa87-4be8-a2c4-12540f764fb7-service-ca\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np"
Apr 21 07:55:25.267760 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.267748 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3bb403e7-fa87-4be8-a2c4-12540f764fb7-console-config\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np"
Apr 21 07:55:25.368503 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.368461 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3bb403e7-fa87-4be8-a2c4-12540f764fb7-console-serving-cert\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np"
Apr 21 07:55:25.368691 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.368543 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3bb403e7-fa87-4be8-a2c4-12540f764fb7-console-oauth-config\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np"
Apr 21 07:55:25.368691 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.368568 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4v4z\" (UniqueName: \"kubernetes.io/projected/3bb403e7-fa87-4be8-a2c4-12540f764fb7-kube-api-access-k4v4z\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np"
Apr 21 07:55:25.368691 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.368591 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3bb403e7-fa87-4be8-a2c4-12540f764fb7-service-ca\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np"
Apr 21 07:55:25.368691 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.368630 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3bb403e7-fa87-4be8-a2c4-12540f764fb7-console-config\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np"
Apr 21 07:55:25.368691 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.368680 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3bb403e7-fa87-4be8-a2c4-12540f764fb7-oauth-serving-cert\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np"
Apr 21 07:55:25.369001 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.368720 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bb403e7-fa87-4be8-a2c4-12540f764fb7-trusted-ca-bundle\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np"
Apr 21 07:55:25.369416 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.369387 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3bb403e7-fa87-4be8-a2c4-12540f764fb7-service-ca\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np"
Apr 21 07:55:25.369593 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.369560 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bb403e7-fa87-4be8-a2c4-12540f764fb7-trusted-ca-bundle\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np"
Apr 21 07:55:25.369679 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.369411 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3bb403e7-fa87-4be8-a2c4-12540f764fb7-console-config\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np"
Apr 21 07:55:25.369836 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.369817 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3bb403e7-fa87-4be8-a2c4-12540f764fb7-oauth-serving-cert\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np"
Apr 21 07:55:25.371122 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.371102 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName:
\"kubernetes.io/secret/3bb403e7-fa87-4be8-a2c4-12540f764fb7-console-oauth-config\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np" Apr 21 07:55:25.371262 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.371246 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3bb403e7-fa87-4be8-a2c4-12540f764fb7-console-serving-cert\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np" Apr 21 07:55:25.378183 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.378162 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4v4z\" (UniqueName: \"kubernetes.io/projected/3bb403e7-fa87-4be8-a2c4-12540f764fb7-kube-api-access-k4v4z\") pod \"console-6998c9ccdc-h68np\" (UID: \"3bb403e7-fa87-4be8-a2c4-12540f764fb7\") " pod="openshift-console/console-6998c9ccdc-h68np" Apr 21 07:55:25.477510 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.477432 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6998c9ccdc-h68np" Apr 21 07:55:25.597038 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.597015 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6998c9ccdc-h68np"] Apr 21 07:55:25.599742 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:55:25.599714 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bb403e7_fa87_4be8_a2c4_12540f764fb7.slice/crio-951eaae97a28a08df5e21c288b5986986aeb6c64d58557f7f1d2835158eb6a33 WatchSource:0}: Error finding container 951eaae97a28a08df5e21c288b5986986aeb6c64d58557f7f1d2835158eb6a33: Status 404 returned error can't find the container with id 951eaae97a28a08df5e21c288b5986986aeb6c64d58557f7f1d2835158eb6a33 Apr 21 07:55:25.995505 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.995472 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 07:55:25.995931 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.995897 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="alertmanager" containerID="cri-o://ca9339aa392673ae12afc0b05cd0687a90f349d56e8fbcdca20f4bbbbce4b4f5" gracePeriod=120 Apr 21 07:55:25.996084 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.995942 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="kube-rbac-proxy-metric" containerID="cri-o://8aaefcdda76482b91d34372ff4387e09cd87e3ee2690ba819e3266321cec22ec" gracePeriod=120 Apr 21 07:55:25.996084 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.995965 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" 
podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="kube-rbac-proxy-web" containerID="cri-o://aa7ae5fe780bd89b1c84f207589162352028f380c1aec60dc5235c461b3cf5b1" gracePeriod=120 Apr 21 07:55:25.996084 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.995961 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="kube-rbac-proxy" containerID="cri-o://b224a39c766b43852b524d197d32e5e5b4583396ace359c4299a82850b811ae2" gracePeriod=120 Apr 21 07:55:25.996084 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.995985 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="config-reloader" containerID="cri-o://c6e7b32e2a7f6615a0a1a8c39d89c1ef7ae14422c2be8e193a9681f7338834e3" gracePeriod=120 Apr 21 07:55:25.996084 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:25.996049 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="prom-label-proxy" containerID="cri-o://7d3d76783df71f32032152e651ebfb879469a0797fdfb5601c4e0630f84eb2a2" gracePeriod=120 Apr 21 07:55:26.473686 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:26.473648 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6998c9ccdc-h68np" event={"ID":"3bb403e7-fa87-4be8-a2c4-12540f764fb7","Type":"ContainerStarted","Data":"083ee1c7450087353022ed03438784c7e53e92b2aabc3c2912f763cea51f4614"} Apr 21 07:55:26.473686 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:26.473684 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6998c9ccdc-h68np" 
event={"ID":"3bb403e7-fa87-4be8-a2c4-12540f764fb7","Type":"ContainerStarted","Data":"951eaae97a28a08df5e21c288b5986986aeb6c64d58557f7f1d2835158eb6a33"} Apr 21 07:55:26.476485 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:26.476460 2570 generic.go:358] "Generic (PLEG): container finished" podID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerID="7d3d76783df71f32032152e651ebfb879469a0797fdfb5601c4e0630f84eb2a2" exitCode=0 Apr 21 07:55:26.476485 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:26.476484 2570 generic.go:358] "Generic (PLEG): container finished" podID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerID="b224a39c766b43852b524d197d32e5e5b4583396ace359c4299a82850b811ae2" exitCode=0 Apr 21 07:55:26.476624 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:26.476492 2570 generic.go:358] "Generic (PLEG): container finished" podID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerID="c6e7b32e2a7f6615a0a1a8c39d89c1ef7ae14422c2be8e193a9681f7338834e3" exitCode=0 Apr 21 07:55:26.476624 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:26.476498 2570 generic.go:358] "Generic (PLEG): container finished" podID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerID="ca9339aa392673ae12afc0b05cd0687a90f349d56e8fbcdca20f4bbbbce4b4f5" exitCode=0 Apr 21 07:55:26.476624 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:26.476527 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8afbf877-96ee-4486-8378-8972eb8c7ebd","Type":"ContainerDied","Data":"7d3d76783df71f32032152e651ebfb879469a0797fdfb5601c4e0630f84eb2a2"} Apr 21 07:55:26.476624 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:26.476549 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8afbf877-96ee-4486-8378-8972eb8c7ebd","Type":"ContainerDied","Data":"b224a39c766b43852b524d197d32e5e5b4583396ace359c4299a82850b811ae2"} Apr 21 07:55:26.476624 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:26.476559 
2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8afbf877-96ee-4486-8378-8972eb8c7ebd","Type":"ContainerDied","Data":"c6e7b32e2a7f6615a0a1a8c39d89c1ef7ae14422c2be8e193a9681f7338834e3"} Apr 21 07:55:26.476624 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:26.476569 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8afbf877-96ee-4486-8378-8972eb8c7ebd","Type":"ContainerDied","Data":"ca9339aa392673ae12afc0b05cd0687a90f349d56e8fbcdca20f4bbbbce4b4f5"} Apr 21 07:55:26.490708 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:26.490664 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6998c9ccdc-h68np" podStartSLOduration=1.490650724 podStartE2EDuration="1.490650724s" podCreationTimestamp="2026-04-21 07:55:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:55:26.48953516 +0000 UTC m=+226.327922535" watchObservedRunningTime="2026-04-21 07:55:26.490650724 +0000 UTC m=+226.329038073" Apr 21 07:55:27.482079 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.482039 2570 generic.go:358] "Generic (PLEG): container finished" podID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerID="8aaefcdda76482b91d34372ff4387e09cd87e3ee2690ba819e3266321cec22ec" exitCode=0 Apr 21 07:55:27.482079 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.482071 2570 generic.go:358] "Generic (PLEG): container finished" podID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerID="aa7ae5fe780bd89b1c84f207589162352028f380c1aec60dc5235c461b3cf5b1" exitCode=0 Apr 21 07:55:27.482470 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.482106 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"8afbf877-96ee-4486-8378-8972eb8c7ebd","Type":"ContainerDied","Data":"8aaefcdda76482b91d34372ff4387e09cd87e3ee2690ba819e3266321cec22ec"} Apr 21 07:55:27.482470 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.482140 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8afbf877-96ee-4486-8378-8972eb8c7ebd","Type":"ContainerDied","Data":"aa7ae5fe780bd89b1c84f207589162352028f380c1aec60dc5235c461b3cf5b1"} Apr 21 07:55:27.738938 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.738888 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:27.789737 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.789706 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8afbf877-96ee-4486-8378-8972eb8c7ebd-config-out\") pod \"8afbf877-96ee-4486-8378-8972eb8c7ebd\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " Apr 21 07:55:27.789737 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.789746 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-kube-rbac-proxy\") pod \"8afbf877-96ee-4486-8378-8972eb8c7ebd\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " Apr 21 07:55:27.790001 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.789779 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-kube-rbac-proxy-web\") pod \"8afbf877-96ee-4486-8378-8972eb8c7ebd\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " Apr 21 07:55:27.790001 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.789809 2570 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-cluster-tls-config\") pod \"8afbf877-96ee-4486-8378-8972eb8c7ebd\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " Apr 21 07:55:27.790001 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.789832 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fppxp\" (UniqueName: \"kubernetes.io/projected/8afbf877-96ee-4486-8378-8972eb8c7ebd-kube-api-access-fppxp\") pod \"8afbf877-96ee-4486-8378-8972eb8c7ebd\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " Apr 21 07:55:27.790001 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.789855 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8afbf877-96ee-4486-8378-8972eb8c7ebd-metrics-client-ca\") pod \"8afbf877-96ee-4486-8378-8972eb8c7ebd\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " Apr 21 07:55:27.790001 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.789901 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8afbf877-96ee-4486-8378-8972eb8c7ebd-alertmanager-trusted-ca-bundle\") pod \"8afbf877-96ee-4486-8378-8972eb8c7ebd\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " Apr 21 07:55:27.790001 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.789929 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-web-config\") pod \"8afbf877-96ee-4486-8378-8972eb8c7ebd\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " Apr 21 07:55:27.790001 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.789960 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8afbf877-96ee-4486-8378-8972eb8c7ebd-alertmanager-main-db\") pod \"8afbf877-96ee-4486-8378-8972eb8c7ebd\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " Apr 21 07:55:27.790001 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.789994 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-kube-rbac-proxy-metric\") pod \"8afbf877-96ee-4486-8378-8972eb8c7ebd\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " Apr 21 07:55:27.790403 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.790035 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-main-tls\") pod \"8afbf877-96ee-4486-8378-8972eb8c7ebd\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " Apr 21 07:55:27.790403 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.790080 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8afbf877-96ee-4486-8378-8972eb8c7ebd-tls-assets\") pod \"8afbf877-96ee-4486-8378-8972eb8c7ebd\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " Apr 21 07:55:27.790955 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.790649 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8afbf877-96ee-4486-8378-8972eb8c7ebd-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "8afbf877-96ee-4486-8378-8972eb8c7ebd" (UID: "8afbf877-96ee-4486-8378-8972eb8c7ebd"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:55:27.791320 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.791292 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8afbf877-96ee-4486-8378-8972eb8c7ebd-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "8afbf877-96ee-4486-8378-8972eb8c7ebd" (UID: "8afbf877-96ee-4486-8378-8972eb8c7ebd"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 07:55:27.791462 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.791440 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8afbf877-96ee-4486-8378-8972eb8c7ebd-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "8afbf877-96ee-4486-8378-8972eb8c7ebd" (UID: "8afbf877-96ee-4486-8378-8972eb8c7ebd"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:55:27.793454 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.793415 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8afbf877-96ee-4486-8378-8972eb8c7ebd-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8afbf877-96ee-4486-8378-8972eb8c7ebd" (UID: "8afbf877-96ee-4486-8378-8972eb8c7ebd"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:55:27.793543 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.793444 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "8afbf877-96ee-4486-8378-8972eb8c7ebd" (UID: "8afbf877-96ee-4486-8378-8972eb8c7ebd"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:55:27.793590 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.793546 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8afbf877-96ee-4486-8378-8972eb8c7ebd-config-out" (OuterVolumeSpecName: "config-out") pod "8afbf877-96ee-4486-8378-8972eb8c7ebd" (UID: "8afbf877-96ee-4486-8378-8972eb8c7ebd"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 07:55:27.794561 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.794524 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "8afbf877-96ee-4486-8378-8972eb8c7ebd" (UID: "8afbf877-96ee-4486-8378-8972eb8c7ebd"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:55:27.794776 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.794751 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "8afbf877-96ee-4486-8378-8972eb8c7ebd" (UID: "8afbf877-96ee-4486-8378-8972eb8c7ebd"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:55:27.795235 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.795212 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8afbf877-96ee-4486-8378-8972eb8c7ebd-kube-api-access-fppxp" (OuterVolumeSpecName: "kube-api-access-fppxp") pod "8afbf877-96ee-4486-8378-8972eb8c7ebd" (UID: "8afbf877-96ee-4486-8378-8972eb8c7ebd"). InnerVolumeSpecName "kube-api-access-fppxp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:55:27.795725 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.795705 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "8afbf877-96ee-4486-8378-8972eb8c7ebd" (UID: "8afbf877-96ee-4486-8378-8972eb8c7ebd"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:55:27.799054 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.799032 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "8afbf877-96ee-4486-8378-8972eb8c7ebd" (UID: "8afbf877-96ee-4486-8378-8972eb8c7ebd"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:55:27.805817 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.805794 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-web-config" (OuterVolumeSpecName: "web-config") pod "8afbf877-96ee-4486-8378-8972eb8c7ebd" (UID: "8afbf877-96ee-4486-8378-8972eb8c7ebd"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:55:27.890688 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.890651 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-config-volume\") pod \"8afbf877-96ee-4486-8378-8972eb8c7ebd\" (UID: \"8afbf877-96ee-4486-8378-8972eb8c7ebd\") " Apr 21 07:55:27.890891 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.890815 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:55:27.890891 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.890827 2570 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-cluster-tls-config\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:55:27.890891 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.890837 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fppxp\" (UniqueName: \"kubernetes.io/projected/8afbf877-96ee-4486-8378-8972eb8c7ebd-kube-api-access-fppxp\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:55:27.890891 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.890847 2570 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8afbf877-96ee-4486-8378-8972eb8c7ebd-metrics-client-ca\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:55:27.890891 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.890857 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8afbf877-96ee-4486-8378-8972eb8c7ebd-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:55:27.890891 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.890891 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-web-config\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:55:27.891148 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.890905 2570 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8afbf877-96ee-4486-8378-8972eb8c7ebd-alertmanager-main-db\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:55:27.891148 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.890916 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:55:27.891148 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.890925 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-main-tls\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:55:27.891148 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.890934 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8afbf877-96ee-4486-8378-8972eb8c7ebd-tls-assets\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:55:27.891148 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.890942 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/8afbf877-96ee-4486-8378-8972eb8c7ebd-config-out\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:55:27.891148 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.890950 2570 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:55:27.892654 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.892628 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-config-volume" (OuterVolumeSpecName: "config-volume") pod "8afbf877-96ee-4486-8378-8972eb8c7ebd" (UID: "8afbf877-96ee-4486-8378-8972eb8c7ebd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:55:27.991789 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:27.991703 2570 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8afbf877-96ee-4486-8378-8972eb8c7ebd-config-volume\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:55:28.488209 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.488172 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8afbf877-96ee-4486-8378-8972eb8c7ebd","Type":"ContainerDied","Data":"a656690e65b49520a8ac3a95885f14255e9c4e5e69b10f2fe3b2593e1ed91c55"} Apr 21 07:55:28.488625 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.488219 2570 scope.go:117] "RemoveContainer" containerID="7d3d76783df71f32032152e651ebfb879469a0797fdfb5601c4e0630f84eb2a2" Apr 21 07:55:28.488625 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.488247 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.495754 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.495614 2570 scope.go:117] "RemoveContainer" containerID="8aaefcdda76482b91d34372ff4387e09cd87e3ee2690ba819e3266321cec22ec" Apr 21 07:55:28.502405 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.502387 2570 scope.go:117] "RemoveContainer" containerID="b224a39c766b43852b524d197d32e5e5b4583396ace359c4299a82850b811ae2" Apr 21 07:55:28.510555 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.510083 2570 scope.go:117] "RemoveContainer" containerID="aa7ae5fe780bd89b1c84f207589162352028f380c1aec60dc5235c461b3cf5b1" Apr 21 07:55:28.511762 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.511741 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 07:55:28.514550 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.514532 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 07:55:28.517207 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.517190 2570 scope.go:117] "RemoveContainer" containerID="c6e7b32e2a7f6615a0a1a8c39d89c1ef7ae14422c2be8e193a9681f7338834e3" Apr 21 07:55:28.523404 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.523387 2570 scope.go:117] "RemoveContainer" containerID="ca9339aa392673ae12afc0b05cd0687a90f349d56e8fbcdca20f4bbbbce4b4f5" Apr 21 07:55:28.529349 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.529333 2570 scope.go:117] "RemoveContainer" containerID="074840e062f372612950ec8255a4d7c1a35426d545e0f33a37c0dce5d711bf2f" Apr 21 07:55:28.538331 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538311 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 07:55:28.538586 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538575 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="kube-rbac-proxy-metric" Apr 21 07:55:28.538647 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538587 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="kube-rbac-proxy-metric" Apr 21 07:55:28.538647 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538596 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="prom-label-proxy" Apr 21 07:55:28.538647 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538603 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="prom-label-proxy" Apr 21 07:55:28.538647 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538610 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="init-config-reloader" Apr 21 07:55:28.538647 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538616 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="init-config-reloader" Apr 21 07:55:28.538647 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538623 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="kube-rbac-proxy-web" Apr 21 07:55:28.538647 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538628 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="kube-rbac-proxy-web" Apr 21 07:55:28.538647 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538635 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="config-reloader" Apr 21 07:55:28.538647 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538640 2570 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="config-reloader" Apr 21 07:55:28.538647 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538650 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="kube-rbac-proxy" Apr 21 07:55:28.539090 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538655 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="kube-rbac-proxy" Apr 21 07:55:28.539090 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538664 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="alertmanager" Apr 21 07:55:28.539090 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538669 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="alertmanager" Apr 21 07:55:28.539090 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538738 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="kube-rbac-proxy-web" Apr 21 07:55:28.539090 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538746 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="kube-rbac-proxy" Apr 21 07:55:28.539090 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538754 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="config-reloader" Apr 21 07:55:28.539090 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538761 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="alertmanager" Apr 21 07:55:28.539090 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538767 2570 
memory_manager.go:356] "RemoveStaleState removing state" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="kube-rbac-proxy-metric" Apr 21 07:55:28.539090 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.538772 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" containerName="prom-label-proxy" Apr 21 07:55:28.575714 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.575693 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 07:55:28.575854 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.575842 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.578744 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.578722 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 21 07:55:28.578744 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.578735 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-xbxr6\"" Apr 21 07:55:28.579055 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.579027 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 21 07:55:28.579158 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.579065 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 21 07:55:28.579158 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.579118 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 21 07:55:28.579158 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.579146 2570 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 21 07:55:28.579503 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.579481 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 21 07:55:28.579590 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.579542 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 21 07:55:28.579651 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.579542 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 21 07:55:28.583582 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.583539 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 21 07:55:28.596183 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.596162 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7b7561b5-9b61-4960-9e16-e8702e51254a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.596260 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.596186 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b7561b5-9b61-4960-9e16-e8702e51254a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.596260 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.596205 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp4rr\" (UniqueName: \"kubernetes.io/projected/7b7561b5-9b61-4960-9e16-e8702e51254a-kube-api-access-vp4rr\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.596260 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.596227 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7b7561b5-9b61-4960-9e16-e8702e51254a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.596378 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.596275 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7b7561b5-9b61-4960-9e16-e8702e51254a-config-out\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.596378 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.596318 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.596378 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.596361 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-config-volume\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.596464 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.596382 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.596464 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.596404 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-web-config\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.596464 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.596423 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.596556 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.596467 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.596556 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.596510 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.596556 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.596526 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7b7561b5-9b61-4960-9e16-e8702e51254a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.697166 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.697137 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.697166 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.697170 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7b7561b5-9b61-4960-9e16-e8702e51254a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.697322 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.697198 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7b7561b5-9b61-4960-9e16-e8702e51254a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.697322 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.697294 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b7561b5-9b61-4960-9e16-e8702e51254a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.697418 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.697331 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vp4rr\" (UniqueName: \"kubernetes.io/projected/7b7561b5-9b61-4960-9e16-e8702e51254a-kube-api-access-vp4rr\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.697418 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.697374 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7b7561b5-9b61-4960-9e16-e8702e51254a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.697418 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.697402 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7b7561b5-9b61-4960-9e16-e8702e51254a-config-out\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.697565 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.697446 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.697565 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.697494 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-config-volume\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.697565 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.697524 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.697565 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.697531 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7b7561b5-9b61-4960-9e16-e8702e51254a-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.697565 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.697550 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-web-config\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.697882 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.697582 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-secret-alertmanager-kube-rbac-proxy-web\") pod 
\"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.697882 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.697628 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.697990 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.697947 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7b7561b5-9b61-4960-9e16-e8702e51254a-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.698617 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.698307 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7b7561b5-9b61-4960-9e16-e8702e51254a-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.700264 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.700244 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-web-config\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.700371 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.700352 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/7b7561b5-9b61-4960-9e16-e8702e51254a-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.700434 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.700401 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.700624 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.700604 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.700670 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.700651 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.700891 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.700851 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7b7561b5-9b61-4960-9e16-e8702e51254a-config-out\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.701092 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.701076 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.701177 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.701157 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.702299 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.702283 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7b7561b5-9b61-4960-9e16-e8702e51254a-config-volume\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.705202 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.705186 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp4rr\" (UniqueName: \"kubernetes.io/projected/7b7561b5-9b61-4960-9e16-e8702e51254a-kube-api-access-vp4rr\") pod \"alertmanager-main-0\" (UID: \"7b7561b5-9b61-4960-9e16-e8702e51254a\") " pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:28.774006 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.773926 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8afbf877-96ee-4486-8378-8972eb8c7ebd" path="/var/lib/kubelet/pods/8afbf877-96ee-4486-8378-8972eb8c7ebd/volumes" Apr 21 07:55:28.885768 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:28.885736 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 21 07:55:29.014995 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:29.014972 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 21 07:55:29.017298 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:55:29.017269 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b7561b5_9b61_4960_9e16_e8702e51254a.slice/crio-f4ef6ee07ad2adb0cac14c6984cf1924e412f5d9ef3627180c84f307d18250d2 WatchSource:0}: Error finding container f4ef6ee07ad2adb0cac14c6984cf1924e412f5d9ef3627180c84f307d18250d2: Status 404 returned error can't find the container with id f4ef6ee07ad2adb0cac14c6984cf1924e412f5d9ef3627180c84f307d18250d2 Apr 21 07:55:29.492242 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:29.492201 2570 generic.go:358] "Generic (PLEG): container finished" podID="7b7561b5-9b61-4960-9e16-e8702e51254a" containerID="c94c4a0e625816c4e3886d924f539327399e4df54b1520197159919883791270" exitCode=0 Apr 21 07:55:29.492676 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:29.492289 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b7561b5-9b61-4960-9e16-e8702e51254a","Type":"ContainerDied","Data":"c94c4a0e625816c4e3886d924f539327399e4df54b1520197159919883791270"} Apr 21 07:55:29.492676 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:29.492322 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b7561b5-9b61-4960-9e16-e8702e51254a","Type":"ContainerStarted","Data":"f4ef6ee07ad2adb0cac14c6984cf1924e412f5d9ef3627180c84f307d18250d2"} Apr 21 07:55:30.498761 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:30.498726 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"7b7561b5-9b61-4960-9e16-e8702e51254a","Type":"ContainerStarted","Data":"4818c5dc7e7b3f82a4ecd14b52693cc67e1ecf9f79fdc9514f9b4e6bf4a50f31"} Apr 21 07:55:30.498761 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:30.498762 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b7561b5-9b61-4960-9e16-e8702e51254a","Type":"ContainerStarted","Data":"29e8062955ad6bb72daee1a6f95770aaf04dd84c2abb28e30e15e8eab11908f1"} Apr 21 07:55:30.499198 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:30.498771 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b7561b5-9b61-4960-9e16-e8702e51254a","Type":"ContainerStarted","Data":"0a2e4ef78d7012bc0176caa65d4aa5d4fac31bd4046b51fa81abe703383721ed"} Apr 21 07:55:30.499198 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:30.498780 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b7561b5-9b61-4960-9e16-e8702e51254a","Type":"ContainerStarted","Data":"78612b5f5d543bd0c9a9930b2b73fd2bc45c9214533c4b2babde7d452327e15c"} Apr 21 07:55:30.499198 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:30.498788 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b7561b5-9b61-4960-9e16-e8702e51254a","Type":"ContainerStarted","Data":"eb7ceaec72b35c40e19affe88415e8e86905a41fe841c2f545f18a4eb0163adf"} Apr 21 07:55:30.499198 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:30.498796 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7b7561b5-9b61-4960-9e16-e8702e51254a","Type":"ContainerStarted","Data":"e99962c454a60456d4ad9f05fe156bb19a7b9496c29b2d4d9f0164185ff5663b"} Apr 21 07:55:30.531395 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:30.531344 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.5313300290000003 podStartE2EDuration="2.531330029s" podCreationTimestamp="2026-04-21 07:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:55:30.530447167 +0000 UTC m=+230.368834517" watchObservedRunningTime="2026-04-21 07:55:30.531330029 +0000 UTC m=+230.369717378" Apr 21 07:55:35.477902 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:35.477849 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6998c9ccdc-h68np" Apr 21 07:55:35.478387 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:35.477925 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6998c9ccdc-h68np" Apr 21 07:55:35.482731 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:35.482711 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6998c9ccdc-h68np" Apr 21 07:55:35.517183 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:35.517160 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6998c9ccdc-h68np" Apr 21 07:55:35.561186 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:35.561153 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-578766656d-jdz8k"] Apr 21 07:55:52.614713 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:52.614627 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs\") pod \"network-metrics-daemon-bbxng\" (UID: \"62e778a8-8270-4560-9d0d-41a95a3c9c5f\") " pod="openshift-multus/network-metrics-daemon-bbxng" Apr 21 07:55:52.616902 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:52.616880 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/62e778a8-8270-4560-9d0d-41a95a3c9c5f-metrics-certs\") pod \"network-metrics-daemon-bbxng\" (UID: \"62e778a8-8270-4560-9d0d-41a95a3c9c5f\") " pod="openshift-multus/network-metrics-daemon-bbxng" Apr 21 07:55:52.773492 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:52.773459 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9jd8h\"" Apr 21 07:55:52.781399 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:52.781381 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-bbxng" Apr 21 07:55:52.913112 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:52.913082 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-bbxng"] Apr 21 07:55:52.917224 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:55:52.917194 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62e778a8_8270_4560_9d0d_41a95a3c9c5f.slice/crio-81786420e7829e284fc3927cc1c217e987bd9725c1fd5b68b99f184b14106127 WatchSource:0}: Error finding container 81786420e7829e284fc3927cc1c217e987bd9725c1fd5b68b99f184b14106127: Status 404 returned error can't find the container with id 81786420e7829e284fc3927cc1c217e987bd9725c1fd5b68b99f184b14106127 Apr 21 07:55:53.566556 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:53.566518 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bbxng" event={"ID":"62e778a8-8270-4560-9d0d-41a95a3c9c5f","Type":"ContainerStarted","Data":"81786420e7829e284fc3927cc1c217e987bd9725c1fd5b68b99f184b14106127"} Apr 21 07:55:54.570886 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:54.570833 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bbxng" 
event={"ID":"62e778a8-8270-4560-9d0d-41a95a3c9c5f","Type":"ContainerStarted","Data":"f4551bc2c5ef1167594ad0e265b23f1968a3eef0ecca938b150817d3bd240cdb"} Apr 21 07:55:54.570886 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:54.570891 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-bbxng" event={"ID":"62e778a8-8270-4560-9d0d-41a95a3c9c5f","Type":"ContainerStarted","Data":"7f95e8984d5051a9d2334ce6b126d7e8927aed41a1e6df0df7719e34cf14f46d"} Apr 21 07:55:54.588123 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:55:54.588077 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-bbxng" podStartSLOduration=253.618167721 podStartE2EDuration="4m14.588065253s" podCreationTimestamp="2026-04-21 07:51:40 +0000 UTC" firstStartedPulling="2026-04-21 07:55:52.919410193 +0000 UTC m=+252.757797522" lastFinishedPulling="2026-04-21 07:55:53.889307716 +0000 UTC m=+253.727695054" observedRunningTime="2026-04-21 07:55:54.587290385 +0000 UTC m=+254.425677736" watchObservedRunningTime="2026-04-21 07:55:54.588065253 +0000 UTC m=+254.426452602" Apr 21 07:56:00.580030 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:00.579982 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-578766656d-jdz8k" podUID="cdf2acca-2cd9-4648-aa3f-eefd7bf76627" containerName="console" containerID="cri-o://83bd1cabf01585cd329340189e794a1ea223e7abeed040b342022ca7bce08e1e" gracePeriod=15 Apr 21 07:56:00.818181 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:00.818159 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-578766656d-jdz8k_cdf2acca-2cd9-4648-aa3f-eefd7bf76627/console/0.log" Apr 21 07:56:00.818286 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:00.818215 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-578766656d-jdz8k" Apr 21 07:56:00.988031 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:00.988001 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nmrw\" (UniqueName: \"kubernetes.io/projected/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-kube-api-access-5nmrw\") pod \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " Apr 21 07:56:00.988199 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:00.988045 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-console-oauth-config\") pod \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " Apr 21 07:56:00.988199 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:00.988119 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-console-serving-cert\") pod \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " Apr 21 07:56:00.988199 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:00.988154 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-oauth-serving-cert\") pod \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " Apr 21 07:56:00.988199 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:00.988178 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-trusted-ca-bundle\") pod \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " Apr 21 
07:56:00.988392 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:00.988203 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-service-ca\") pod \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " Apr 21 07:56:00.988392 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:00.988241 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-console-config\") pod \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\" (UID: \"cdf2acca-2cd9-4648-aa3f-eefd7bf76627\") " Apr 21 07:56:00.988649 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:00.988619 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-service-ca" (OuterVolumeSpecName: "service-ca") pod "cdf2acca-2cd9-4648-aa3f-eefd7bf76627" (UID: "cdf2acca-2cd9-4648-aa3f-eefd7bf76627"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:56:00.988758 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:00.988621 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "cdf2acca-2cd9-4648-aa3f-eefd7bf76627" (UID: "cdf2acca-2cd9-4648-aa3f-eefd7bf76627"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:56:00.988758 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:00.988743 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "cdf2acca-2cd9-4648-aa3f-eefd7bf76627" (UID: "cdf2acca-2cd9-4648-aa3f-eefd7bf76627"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:56:00.988893 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:00.988789 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-console-config" (OuterVolumeSpecName: "console-config") pod "cdf2acca-2cd9-4648-aa3f-eefd7bf76627" (UID: "cdf2acca-2cd9-4648-aa3f-eefd7bf76627"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 07:56:00.990391 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:00.990356 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "cdf2acca-2cd9-4648-aa3f-eefd7bf76627" (UID: "cdf2acca-2cd9-4648-aa3f-eefd7bf76627"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:56:00.990491 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:00.990401 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-kube-api-access-5nmrw" (OuterVolumeSpecName: "kube-api-access-5nmrw") pod "cdf2acca-2cd9-4648-aa3f-eefd7bf76627" (UID: "cdf2acca-2cd9-4648-aa3f-eefd7bf76627"). InnerVolumeSpecName "kube-api-access-5nmrw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:56:00.990491 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:00.990425 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "cdf2acca-2cd9-4648-aa3f-eefd7bf76627" (UID: "cdf2acca-2cd9-4648-aa3f-eefd7bf76627"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 07:56:01.088884 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:01.088837 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5nmrw\" (UniqueName: \"kubernetes.io/projected/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-kube-api-access-5nmrw\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:56:01.088884 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:01.088884 2570 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-console-oauth-config\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:56:01.089068 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:01.088896 2570 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-console-serving-cert\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:56:01.089068 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:01.088905 2570 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-oauth-serving-cert\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:56:01.089068 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:01.088914 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-trusted-ca-bundle\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:56:01.089068 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:01.088925 2570 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-service-ca\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:56:01.089068 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:01.088933 2570 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/cdf2acca-2cd9-4648-aa3f-eefd7bf76627-console-config\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:56:01.594574 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:01.594545 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-578766656d-jdz8k_cdf2acca-2cd9-4648-aa3f-eefd7bf76627/console/0.log" Apr 21 07:56:01.594967 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:01.594585 2570 generic.go:358] "Generic (PLEG): container finished" podID="cdf2acca-2cd9-4648-aa3f-eefd7bf76627" containerID="83bd1cabf01585cd329340189e794a1ea223e7abeed040b342022ca7bce08e1e" exitCode=2 Apr 21 07:56:01.594967 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:01.594632 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-578766656d-jdz8k" event={"ID":"cdf2acca-2cd9-4648-aa3f-eefd7bf76627","Type":"ContainerDied","Data":"83bd1cabf01585cd329340189e794a1ea223e7abeed040b342022ca7bce08e1e"} Apr 21 07:56:01.594967 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:01.594655 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-578766656d-jdz8k" event={"ID":"cdf2acca-2cd9-4648-aa3f-eefd7bf76627","Type":"ContainerDied","Data":"2ce63f8e679bbe3e1f6b8d4eba4931504a68f951fec6fbe96d9ea6dd733ad979"} Apr 21 07:56:01.594967 ip-10-0-137-194 
kubenswrapper[2570]: I0421 07:56:01.594669 2570 scope.go:117] "RemoveContainer" containerID="83bd1cabf01585cd329340189e794a1ea223e7abeed040b342022ca7bce08e1e" Apr 21 07:56:01.594967 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:01.594676 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-578766656d-jdz8k" Apr 21 07:56:01.602562 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:01.602544 2570 scope.go:117] "RemoveContainer" containerID="83bd1cabf01585cd329340189e794a1ea223e7abeed040b342022ca7bce08e1e" Apr 21 07:56:01.602811 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:56:01.602792 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83bd1cabf01585cd329340189e794a1ea223e7abeed040b342022ca7bce08e1e\": container with ID starting with 83bd1cabf01585cd329340189e794a1ea223e7abeed040b342022ca7bce08e1e not found: ID does not exist" containerID="83bd1cabf01585cd329340189e794a1ea223e7abeed040b342022ca7bce08e1e" Apr 21 07:56:01.602858 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:01.602818 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83bd1cabf01585cd329340189e794a1ea223e7abeed040b342022ca7bce08e1e"} err="failed to get container status \"83bd1cabf01585cd329340189e794a1ea223e7abeed040b342022ca7bce08e1e\": rpc error: code = NotFound desc = could not find container \"83bd1cabf01585cd329340189e794a1ea223e7abeed040b342022ca7bce08e1e\": container with ID starting with 83bd1cabf01585cd329340189e794a1ea223e7abeed040b342022ca7bce08e1e not found: ID does not exist" Apr 21 07:56:01.617325 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:01.617302 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-578766656d-jdz8k"] Apr 21 07:56:01.621261 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:01.621241 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-578766656d-jdz8k"] Apr 21 07:56:02.774007 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:02.773972 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdf2acca-2cd9-4648-aa3f-eefd7bf76627" path="/var/lib/kubelet/pods/cdf2acca-2cd9-4648-aa3f-eefd7bf76627/volumes" Apr 21 07:56:17.570313 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:17.570279 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fz8g7/must-gather-l7k2v"] Apr 21 07:56:17.570737 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:17.570597 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cdf2acca-2cd9-4648-aa3f-eefd7bf76627" containerName="console" Apr 21 07:56:17.570737 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:17.570609 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdf2acca-2cd9-4648-aa3f-eefd7bf76627" containerName="console" Apr 21 07:56:17.570737 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:17.570675 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="cdf2acca-2cd9-4648-aa3f-eefd7bf76627" containerName="console" Apr 21 07:56:17.576315 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:17.576295 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fz8g7/must-gather-l7k2v" Apr 21 07:56:17.579302 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:17.579276 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fz8g7\"/\"openshift-service-ca.crt\"" Apr 21 07:56:17.579399 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:17.579314 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-fz8g7\"/\"kube-root-ca.crt\"" Apr 21 07:56:17.592292 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:17.592269 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fz8g7/must-gather-l7k2v"] Apr 21 07:56:17.621173 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:17.621149 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d2d1caea-bf3b-4465-9de3-ed2ff4accca3-must-gather-output\") pod \"must-gather-l7k2v\" (UID: \"d2d1caea-bf3b-4465-9de3-ed2ff4accca3\") " pod="openshift-must-gather-fz8g7/must-gather-l7k2v" Apr 21 07:56:17.621294 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:17.621178 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-642dv\" (UniqueName: \"kubernetes.io/projected/d2d1caea-bf3b-4465-9de3-ed2ff4accca3-kube-api-access-642dv\") pod \"must-gather-l7k2v\" (UID: \"d2d1caea-bf3b-4465-9de3-ed2ff4accca3\") " pod="openshift-must-gather-fz8g7/must-gather-l7k2v" Apr 21 07:56:17.722131 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:17.722094 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d2d1caea-bf3b-4465-9de3-ed2ff4accca3-must-gather-output\") pod \"must-gather-l7k2v\" (UID: \"d2d1caea-bf3b-4465-9de3-ed2ff4accca3\") " pod="openshift-must-gather-fz8g7/must-gather-l7k2v" Apr 21 
07:56:17.722131 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:17.722129 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-642dv\" (UniqueName: \"kubernetes.io/projected/d2d1caea-bf3b-4465-9de3-ed2ff4accca3-kube-api-access-642dv\") pod \"must-gather-l7k2v\" (UID: \"d2d1caea-bf3b-4465-9de3-ed2ff4accca3\") " pod="openshift-must-gather-fz8g7/must-gather-l7k2v" Apr 21 07:56:17.722495 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:17.722471 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d2d1caea-bf3b-4465-9de3-ed2ff4accca3-must-gather-output\") pod \"must-gather-l7k2v\" (UID: \"d2d1caea-bf3b-4465-9de3-ed2ff4accca3\") " pod="openshift-must-gather-fz8g7/must-gather-l7k2v" Apr 21 07:56:17.730569 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:17.730541 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-642dv\" (UniqueName: \"kubernetes.io/projected/d2d1caea-bf3b-4465-9de3-ed2ff4accca3-kube-api-access-642dv\") pod \"must-gather-l7k2v\" (UID: \"d2d1caea-bf3b-4465-9de3-ed2ff4accca3\") " pod="openshift-must-gather-fz8g7/must-gather-l7k2v" Apr 21 07:56:17.897840 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:17.897807 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fz8g7/must-gather-l7k2v" Apr 21 07:56:18.016655 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:18.016557 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fz8g7/must-gather-l7k2v"] Apr 21 07:56:18.019282 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:56:18.019249 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2d1caea_bf3b_4465_9de3_ed2ff4accca3.slice/crio-58a1ba755a2e8d9168a245fdbff2bb774047b7cd2f4b667814b8b9dc32c7d16e WatchSource:0}: Error finding container 58a1ba755a2e8d9168a245fdbff2bb774047b7cd2f4b667814b8b9dc32c7d16e: Status 404 returned error can't find the container with id 58a1ba755a2e8d9168a245fdbff2bb774047b7cd2f4b667814b8b9dc32c7d16e Apr 21 07:56:18.648451 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:18.648399 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fz8g7/must-gather-l7k2v" event={"ID":"d2d1caea-bf3b-4465-9de3-ed2ff4accca3","Type":"ContainerStarted","Data":"58a1ba755a2e8d9168a245fdbff2bb774047b7cd2f4b667814b8b9dc32c7d16e"} Apr 21 07:56:23.665619 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:23.665581 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fz8g7/must-gather-l7k2v" event={"ID":"d2d1caea-bf3b-4465-9de3-ed2ff4accca3","Type":"ContainerStarted","Data":"739af6b2cea96333cca0863e7098b1b2b9cdd4ce5d8df2b2c98506918631618a"} Apr 21 07:56:23.665619 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:23.665622 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fz8g7/must-gather-l7k2v" event={"ID":"d2d1caea-bf3b-4465-9de3-ed2ff4accca3","Type":"ContainerStarted","Data":"427e6de76c25befa8549d510e3238844a513d574ef3a2fe6791c2e1e483e97cf"} Apr 21 07:56:23.681032 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:23.680986 2570 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-fz8g7/must-gather-l7k2v" podStartSLOduration=1.921819583 podStartE2EDuration="6.680971164s" podCreationTimestamp="2026-04-21 07:56:17 +0000 UTC" firstStartedPulling="2026-04-21 07:56:18.020913014 +0000 UTC m=+277.859300355" lastFinishedPulling="2026-04-21 07:56:22.780064605 +0000 UTC m=+282.618451936" observedRunningTime="2026-04-21 07:56:23.679310465 +0000 UTC m=+283.517697817" watchObservedRunningTime="2026-04-21 07:56:23.680971164 +0000 UTC m=+283.519358512" Apr 21 07:56:30.688043 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:30.688005 2570 generic.go:358] "Generic (PLEG): container finished" podID="d2d1caea-bf3b-4465-9de3-ed2ff4accca3" containerID="427e6de76c25befa8549d510e3238844a513d574ef3a2fe6791c2e1e483e97cf" exitCode=0 Apr 21 07:56:30.688466 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:30.688070 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fz8g7/must-gather-l7k2v" event={"ID":"d2d1caea-bf3b-4465-9de3-ed2ff4accca3","Type":"ContainerDied","Data":"427e6de76c25befa8549d510e3238844a513d574ef3a2fe6791c2e1e483e97cf"} Apr 21 07:56:30.688466 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:30.688379 2570 scope.go:117] "RemoveContainer" containerID="427e6de76c25befa8549d510e3238844a513d574ef3a2fe6791c2e1e483e97cf" Apr 21 07:56:31.503591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:31.503560 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fz8g7_must-gather-l7k2v_d2d1caea-bf3b-4465-9de3-ed2ff4accca3/gather/0.log" Apr 21 07:56:34.562970 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:34.562940 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-pnsks_a242e9c5-b6e9-4008-8745-35d3c8424baf/global-pull-secret-syncer/0.log" Apr 21 07:56:34.631218 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:34.631187 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_konnectivity-agent-gc4bz_dd018993-e8b6-4e3f-b574-2a1a71e75ce7/konnectivity-agent/0.log" Apr 21 07:56:34.699698 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:34.699668 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-194.ec2.internal_dc75abb31154ae5273a07d0d5f2959e8/haproxy/0.log" Apr 21 07:56:36.817289 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:36.817252 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fz8g7/must-gather-l7k2v"] Apr 21 07:56:36.817764 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:36.817551 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-fz8g7/must-gather-l7k2v" podUID="d2d1caea-bf3b-4465-9de3-ed2ff4accca3" containerName="copy" containerID="cri-o://739af6b2cea96333cca0863e7098b1b2b9cdd4ce5d8df2b2c98506918631618a" gracePeriod=2 Apr 21 07:56:36.819089 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:36.819064 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fz8g7/must-gather-l7k2v"] Apr 21 07:56:36.819661 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:36.819633 2570 status_manager.go:895] "Failed to get status for pod" podUID="d2d1caea-bf3b-4465-9de3-ed2ff4accca3" pod="openshift-must-gather-fz8g7/must-gather-l7k2v" err="pods \"must-gather-l7k2v\" is forbidden: User \"system:node:ip-10-0-137-194.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-fz8g7\": no relationship found between node 'ip-10-0-137-194.ec2.internal' and this object" Apr 21 07:56:37.041299 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.041276 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fz8g7_must-gather-l7k2v_d2d1caea-bf3b-4465-9de3-ed2ff4accca3/copy/0.log" Apr 21 07:56:37.041621 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.041605 2570 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fz8g7/must-gather-l7k2v" Apr 21 07:56:37.043568 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.043546 2570 status_manager.go:895] "Failed to get status for pod" podUID="d2d1caea-bf3b-4465-9de3-ed2ff4accca3" pod="openshift-must-gather-fz8g7/must-gather-l7k2v" err="pods \"must-gather-l7k2v\" is forbidden: User \"system:node:ip-10-0-137-194.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-fz8g7\": no relationship found between node 'ip-10-0-137-194.ec2.internal' and this object" Apr 21 07:56:37.092069 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.092012 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d2d1caea-bf3b-4465-9de3-ed2ff4accca3-must-gather-output\") pod \"d2d1caea-bf3b-4465-9de3-ed2ff4accca3\" (UID: \"d2d1caea-bf3b-4465-9de3-ed2ff4accca3\") " Apr 21 07:56:37.092069 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.092061 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-642dv\" (UniqueName: \"kubernetes.io/projected/d2d1caea-bf3b-4465-9de3-ed2ff4accca3-kube-api-access-642dv\") pod \"d2d1caea-bf3b-4465-9de3-ed2ff4accca3\" (UID: \"d2d1caea-bf3b-4465-9de3-ed2ff4accca3\") " Apr 21 07:56:37.092363 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.092340 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2d1caea-bf3b-4465-9de3-ed2ff4accca3-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d2d1caea-bf3b-4465-9de3-ed2ff4accca3" (UID: "d2d1caea-bf3b-4465-9de3-ed2ff4accca3"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 07:56:37.094174 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.094153 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2d1caea-bf3b-4465-9de3-ed2ff4accca3-kube-api-access-642dv" (OuterVolumeSpecName: "kube-api-access-642dv") pod "d2d1caea-bf3b-4465-9de3-ed2ff4accca3" (UID: "d2d1caea-bf3b-4465-9de3-ed2ff4accca3"). InnerVolumeSpecName "kube-api-access-642dv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 07:56:37.193130 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.193085 2570 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d2d1caea-bf3b-4465-9de3-ed2ff4accca3-must-gather-output\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:56:37.193130 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.193128 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-642dv\" (UniqueName: \"kubernetes.io/projected/d2d1caea-bf3b-4465-9de3-ed2ff4accca3-kube-api-access-642dv\") on node \"ip-10-0-137-194.ec2.internal\" DevicePath \"\"" Apr 21 07:56:37.537828 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.537803 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7b7561b5-9b61-4960-9e16-e8702e51254a/alertmanager/0.log" Apr 21 07:56:37.568190 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.568161 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7b7561b5-9b61-4960-9e16-e8702e51254a/config-reloader/0.log" Apr 21 07:56:37.592174 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.592138 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7b7561b5-9b61-4960-9e16-e8702e51254a/kube-rbac-proxy-web/0.log" Apr 21 07:56:37.624486 ip-10-0-137-194 kubenswrapper[2570]: I0421 
07:56:37.624464 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7b7561b5-9b61-4960-9e16-e8702e51254a/kube-rbac-proxy/0.log" Apr 21 07:56:37.646537 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.646513 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7b7561b5-9b61-4960-9e16-e8702e51254a/kube-rbac-proxy-metric/0.log" Apr 21 07:56:37.671519 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.671491 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7b7561b5-9b61-4960-9e16-e8702e51254a/prom-label-proxy/0.log" Apr 21 07:56:37.699715 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.699692 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_7b7561b5-9b61-4960-9e16-e8702e51254a/init-config-reloader/0.log" Apr 21 07:56:37.707919 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.707899 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fz8g7_must-gather-l7k2v_d2d1caea-bf3b-4465-9de3-ed2ff4accca3/copy/0.log" Apr 21 07:56:37.708267 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.708250 2570 generic.go:358] "Generic (PLEG): container finished" podID="d2d1caea-bf3b-4465-9de3-ed2ff4accca3" containerID="739af6b2cea96333cca0863e7098b1b2b9cdd4ce5d8df2b2c98506918631618a" exitCode=143 Apr 21 07:56:37.708322 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.708304 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fz8g7/must-gather-l7k2v" Apr 21 07:56:37.708370 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.708356 2570 scope.go:117] "RemoveContainer" containerID="739af6b2cea96333cca0863e7098b1b2b9cdd4ce5d8df2b2c98506918631618a" Apr 21 07:56:37.710512 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.710486 2570 status_manager.go:895] "Failed to get status for pod" podUID="d2d1caea-bf3b-4465-9de3-ed2ff4accca3" pod="openshift-must-gather-fz8g7/must-gather-l7k2v" err="pods \"must-gather-l7k2v\" is forbidden: User \"system:node:ip-10-0-137-194.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-fz8g7\": no relationship found between node 'ip-10-0-137-194.ec2.internal' and this object" Apr 21 07:56:37.715741 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.715723 2570 scope.go:117] "RemoveContainer" containerID="427e6de76c25befa8549d510e3238844a513d574ef3a2fe6791c2e1e483e97cf" Apr 21 07:56:37.718081 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.718052 2570 status_manager.go:895] "Failed to get status for pod" podUID="d2d1caea-bf3b-4465-9de3-ed2ff4accca3" pod="openshift-must-gather-fz8g7/must-gather-l7k2v" err="pods \"must-gather-l7k2v\" is forbidden: User \"system:node:ip-10-0-137-194.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-fz8g7\": no relationship found between node 'ip-10-0-137-194.ec2.internal' and this object" Apr 21 07:56:37.726896 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.726881 2570 scope.go:117] "RemoveContainer" containerID="739af6b2cea96333cca0863e7098b1b2b9cdd4ce5d8df2b2c98506918631618a" Apr 21 07:56:37.727195 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:56:37.727176 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"739af6b2cea96333cca0863e7098b1b2b9cdd4ce5d8df2b2c98506918631618a\": container with ID 
starting with 739af6b2cea96333cca0863e7098b1b2b9cdd4ce5d8df2b2c98506918631618a not found: ID does not exist" containerID="739af6b2cea96333cca0863e7098b1b2b9cdd4ce5d8df2b2c98506918631618a" Apr 21 07:56:37.727287 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.727204 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"739af6b2cea96333cca0863e7098b1b2b9cdd4ce5d8df2b2c98506918631618a"} err="failed to get container status \"739af6b2cea96333cca0863e7098b1b2b9cdd4ce5d8df2b2c98506918631618a\": rpc error: code = NotFound desc = could not find container \"739af6b2cea96333cca0863e7098b1b2b9cdd4ce5d8df2b2c98506918631618a\": container with ID starting with 739af6b2cea96333cca0863e7098b1b2b9cdd4ce5d8df2b2c98506918631618a not found: ID does not exist" Apr 21 07:56:37.727287 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.727228 2570 scope.go:117] "RemoveContainer" containerID="427e6de76c25befa8549d510e3238844a513d574ef3a2fe6791c2e1e483e97cf" Apr 21 07:56:37.727476 ip-10-0-137-194 kubenswrapper[2570]: E0421 07:56:37.727461 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"427e6de76c25befa8549d510e3238844a513d574ef3a2fe6791c2e1e483e97cf\": container with ID starting with 427e6de76c25befa8549d510e3238844a513d574ef3a2fe6791c2e1e483e97cf not found: ID does not exist" containerID="427e6de76c25befa8549d510e3238844a513d574ef3a2fe6791c2e1e483e97cf" Apr 21 07:56:37.727520 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.727482 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"427e6de76c25befa8549d510e3238844a513d574ef3a2fe6791c2e1e483e97cf"} err="failed to get container status \"427e6de76c25befa8549d510e3238844a513d574ef3a2fe6791c2e1e483e97cf\": rpc error: code = NotFound desc = could not find container \"427e6de76c25befa8549d510e3238844a513d574ef3a2fe6791c2e1e483e97cf\": container with ID starting with 
427e6de76c25befa8549d510e3238844a513d574ef3a2fe6791c2e1e483e97cf not found: ID does not exist"
Apr 21 07:56:37.846329 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.846222 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-967d7fc7f-6g42n_947c3943-080e-4c53-8507-bd9465492df2/metrics-server/0.log"
Apr 21 07:56:37.874579 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.874552 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-zlxkn_d7157023-0d3e-48a7-b859-da313a4a8bb8/monitoring-plugin/0.log"
Apr 21 07:56:37.984901 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:37.984855 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hjljn_812db1cf-487b-4a90-a0c8-a72c028f5c45/node-exporter/0.log"
Apr 21 07:56:38.009881 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:38.009827 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hjljn_812db1cf-487b-4a90-a0c8-a72c028f5c45/kube-rbac-proxy/0.log"
Apr 21 07:56:38.038739 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:38.038713 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hjljn_812db1cf-487b-4a90-a0c8-a72c028f5c45/init-textfile/0.log"
Apr 21 07:56:38.131675 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:38.131647 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-xhclh_c7ea73b2-778a-49e2-ba0d-7d918ce7b31d/kube-rbac-proxy-main/0.log"
Apr 21 07:56:38.151994 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:38.151967 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-xhclh_c7ea73b2-778a-49e2-ba0d-7d918ce7b31d/kube-rbac-proxy-self/0.log"
Apr 21 07:56:38.173190 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:38.173158 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-xhclh_c7ea73b2-778a-49e2-ba0d-7d918ce7b31d/openshift-state-metrics/0.log"
Apr 21 07:56:38.773539 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:38.773505 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2d1caea-bf3b-4465-9de3-ed2ff4accca3" path="/var/lib/kubelet/pods/d2d1caea-bf3b-4465-9de3-ed2ff4accca3/volumes"
Apr 21 07:56:39.714591 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:39.714557 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-xj2ss_f8912392-4510-4052-97fe-c1f6926ae955/networking-console-plugin/0.log"
Apr 21 07:56:40.111507 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:40.111487 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kgsrp_be0cbccd-e9f7-4b37-b3e3-e6ef1514b734/console-operator/2.log"
Apr 21 07:56:40.114539 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:40.114522 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kgsrp_be0cbccd-e9f7-4b37-b3e3-e6ef1514b734/console-operator/3.log"
Apr 21 07:56:40.457893 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:40.457794 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6998c9ccdc-h68np_3bb403e7-fa87-4be8-a2c4-12540f764fb7/console/0.log"
Apr 21 07:56:40.479247 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:40.479218 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-8tdj4_d42b0066-31da-406b-a83f-01a4f0ced05a/download-server/0.log"
Apr 21 07:56:40.682081 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:40.682045 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kgsrp_be0cbccd-e9f7-4b37-b3e3-e6ef1514b734/console-operator/2.log"
Apr 21 07:56:40.682265 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:40.682106 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kgsrp_be0cbccd-e9f7-4b37-b3e3-e6ef1514b734/console-operator/2.log"
Apr 21 07:56:40.685226 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:40.685207 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kshzg_cc312109-c0ed-49ee-b44b-aebf49f43c92/ovn-acl-logging/0.log"
Apr 21 07:56:40.685360 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:40.685241 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kshzg_cc312109-c0ed-49ee-b44b-aebf49f43c92/ovn-acl-logging/0.log"
Apr 21 07:56:40.691184 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:40.691166 2570 kubelet.go:1628] "Image garbage collection succeeded"
Apr 21 07:56:41.192434 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.192404 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"]
Apr 21 07:56:41.194819 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.192691 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2d1caea-bf3b-4465-9de3-ed2ff4accca3" containerName="copy"
Apr 21 07:56:41.194819 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.192701 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d1caea-bf3b-4465-9de3-ed2ff4accca3" containerName="copy"
Apr 21 07:56:41.194819 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.192723 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d2d1caea-bf3b-4465-9de3-ed2ff4accca3" containerName="gather"
Apr 21 07:56:41.194819 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.192729 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2d1caea-bf3b-4465-9de3-ed2ff4accca3" containerName="gather"
Apr 21 07:56:41.194819 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.192774 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2d1caea-bf3b-4465-9de3-ed2ff4accca3" containerName="gather"
Apr 21 07:56:41.194819 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.192784 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="d2d1caea-bf3b-4465-9de3-ed2ff4accca3" containerName="copy"
Apr 21 07:56:41.195768 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.195752 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"
Apr 21 07:56:41.198114 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.198089 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9fzd8\"/\"openshift-service-ca.crt\""
Apr 21 07:56:41.198227 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.198195 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9fzd8\"/\"kube-root-ca.crt\""
Apr 21 07:56:41.199076 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.199060 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-9fzd8\"/\"default-dockercfg-v97gr\""
Apr 21 07:56:41.206503 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.206482 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"]
Apr 21 07:56:41.333943 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.333905 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9d8263e5-a9ad-480f-9f14-d87cbb80b4a1-sys\") pod \"perf-node-gather-daemonset-8rx7p\" (UID: \"9d8263e5-a9ad-480f-9f14-d87cbb80b4a1\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"
Apr 21 07:56:41.334124 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.334007 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9d8263e5-a9ad-480f-9f14-d87cbb80b4a1-podres\") pod \"perf-node-gather-daemonset-8rx7p\" (UID: \"9d8263e5-a9ad-480f-9f14-d87cbb80b4a1\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"
Apr 21 07:56:41.334124 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.334036 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb2j7\" (UniqueName: \"kubernetes.io/projected/9d8263e5-a9ad-480f-9f14-d87cbb80b4a1-kube-api-access-kb2j7\") pod \"perf-node-gather-daemonset-8rx7p\" (UID: \"9d8263e5-a9ad-480f-9f14-d87cbb80b4a1\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"
Apr 21 07:56:41.334124 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.334094 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9d8263e5-a9ad-480f-9f14-d87cbb80b4a1-lib-modules\") pod \"perf-node-gather-daemonset-8rx7p\" (UID: \"9d8263e5-a9ad-480f-9f14-d87cbb80b4a1\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"
Apr 21 07:56:41.334247 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.334127 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9d8263e5-a9ad-480f-9f14-d87cbb80b4a1-proc\") pod \"perf-node-gather-daemonset-8rx7p\" (UID: \"9d8263e5-a9ad-480f-9f14-d87cbb80b4a1\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"
Apr 21 07:56:41.390125 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.390103 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7rk6g_6329b105-bf72-40c1-ab25-2ba6f2aea17c/dns/0.log"
Apr 21 07:56:41.410032 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.410010 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7rk6g_6329b105-bf72-40c1-ab25-2ba6f2aea17c/kube-rbac-proxy/0.log"
Apr 21 07:56:41.435309 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.435288 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kb2j7\" (UniqueName: \"kubernetes.io/projected/9d8263e5-a9ad-480f-9f14-d87cbb80b4a1-kube-api-access-kb2j7\") pod \"perf-node-gather-daemonset-8rx7p\" (UID: \"9d8263e5-a9ad-480f-9f14-d87cbb80b4a1\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"
Apr 21 07:56:41.435410 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.435321 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9d8263e5-a9ad-480f-9f14-d87cbb80b4a1-lib-modules\") pod \"perf-node-gather-daemonset-8rx7p\" (UID: \"9d8263e5-a9ad-480f-9f14-d87cbb80b4a1\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"
Apr 21 07:56:41.435410 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.435342 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9d8263e5-a9ad-480f-9f14-d87cbb80b4a1-proc\") pod \"perf-node-gather-daemonset-8rx7p\" (UID: \"9d8263e5-a9ad-480f-9f14-d87cbb80b4a1\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"
Apr 21 07:56:41.435410 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.435364 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9d8263e5-a9ad-480f-9f14-d87cbb80b4a1-sys\") pod \"perf-node-gather-daemonset-8rx7p\" (UID: \"9d8263e5-a9ad-480f-9f14-d87cbb80b4a1\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"
Apr 21 07:56:41.435558 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.435412 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9d8263e5-a9ad-480f-9f14-d87cbb80b4a1-podres\") pod \"perf-node-gather-daemonset-8rx7p\" (UID: \"9d8263e5-a9ad-480f-9f14-d87cbb80b4a1\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"
Apr 21 07:56:41.435558 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.435426 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9d8263e5-a9ad-480f-9f14-d87cbb80b4a1-proc\") pod \"perf-node-gather-daemonset-8rx7p\" (UID: \"9d8263e5-a9ad-480f-9f14-d87cbb80b4a1\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"
Apr 21 07:56:41.435558 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.435476 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9d8263e5-a9ad-480f-9f14-d87cbb80b4a1-lib-modules\") pod \"perf-node-gather-daemonset-8rx7p\" (UID: \"9d8263e5-a9ad-480f-9f14-d87cbb80b4a1\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"
Apr 21 07:56:41.435558 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.435498 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9d8263e5-a9ad-480f-9f14-d87cbb80b4a1-podres\") pod \"perf-node-gather-daemonset-8rx7p\" (UID: \"9d8263e5-a9ad-480f-9f14-d87cbb80b4a1\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"
Apr 21 07:56:41.435558 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.435507 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9d8263e5-a9ad-480f-9f14-d87cbb80b4a1-sys\") pod \"perf-node-gather-daemonset-8rx7p\" (UID: \"9d8263e5-a9ad-480f-9f14-d87cbb80b4a1\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"
Apr 21 07:56:41.442538 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.442484 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb2j7\" (UniqueName: \"kubernetes.io/projected/9d8263e5-a9ad-480f-9f14-d87cbb80b4a1-kube-api-access-kb2j7\") pod \"perf-node-gather-daemonset-8rx7p\" (UID: \"9d8263e5-a9ad-480f-9f14-d87cbb80b4a1\") " pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"
Apr 21 07:56:41.505580 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.505549 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"
Apr 21 07:56:41.517382 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.517359 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-hc6tl_28381821-cb18-40b6-a25f-b3a80e24f27a/dns-node-resolver/0.log"
Apr 21 07:56:41.618797 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.618757 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"]
Apr 21 07:56:41.620937 ip-10-0-137-194 kubenswrapper[2570]: W0421 07:56:41.620908 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9d8263e5_a9ad_480f_9f14_d87cbb80b4a1.slice/crio-1f3d17948483a9ea1516503ca50e4a471dd57e532d71e94ab6d78e5ceaf6f024 WatchSource:0}: Error finding container 1f3d17948483a9ea1516503ca50e4a471dd57e532d71e94ab6d78e5ceaf6f024: Status 404 returned error can't find the container with id 1f3d17948483a9ea1516503ca50e4a471dd57e532d71e94ab6d78e5ceaf6f024
Apr 21 07:56:41.721581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.721499 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p" event={"ID":"9d8263e5-a9ad-480f-9f14-d87cbb80b4a1","Type":"ContainerStarted","Data":"82904f0bc87405f5da43a37528b62769e229659f049c4ced5652a6a4ef701ee4"}
Apr 21 07:56:41.721581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.721537 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p" event={"ID":"9d8263e5-a9ad-480f-9f14-d87cbb80b4a1","Type":"ContainerStarted","Data":"1f3d17948483a9ea1516503ca50e4a471dd57e532d71e94ab6d78e5ceaf6f024"}
Apr 21 07:56:41.721581 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.721570 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"
Apr 21 07:56:41.738654 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.738613 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p" podStartSLOduration=0.738598301 podStartE2EDuration="738.598301ms" podCreationTimestamp="2026-04-21 07:56:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 07:56:41.737407096 +0000 UTC m=+301.575794437" watchObservedRunningTime="2026-04-21 07:56:41.738598301 +0000 UTC m=+301.576985649"
Apr 21 07:56:41.900519 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.900492 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6459cccc96-5nt4w_3d259e3f-23a3-4b2f-8d21-5ddf47a2d960/registry/0.log"
Apr 21 07:56:41.957609 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:41.957582 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-v596t_a7dd7ba4-8933-440b-9c8a-6710f4843c4d/node-ca/0.log"
Apr 21 07:56:42.570122 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:42.570082 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-d5f96b85b-ghcsf_9f41707d-3f57-4540-b483-7dd01a7d4ef0/router/0.log"
Apr 21 07:56:42.871705 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:42.871686 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-pfqbt_1bcd6517-d770-467f-8536-8c7f4cdc772e/serve-healthcheck-canary/0.log"
Apr 21 07:56:43.321373 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:43.321290 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l68lw_6e65eff4-f7d9-4bb7-9121-580138a669e3/kube-rbac-proxy/0.log"
Apr 21 07:56:43.339029 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:43.339002 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l68lw_6e65eff4-f7d9-4bb7-9121-580138a669e3/exporter/0.log"
Apr 21 07:56:43.355674 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:43.355648 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-l68lw_6e65eff4-f7d9-4bb7-9121-580138a669e3/extractor/0.log"
Apr 21 07:56:47.046357 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:47.046315 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-mdt7b_0e1f90ef-6932-43c1-bb6a-d25451d794e8/kube-storage-version-migrator-operator/1.log"
Apr 21 07:56:47.047327 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:47.047307 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-mdt7b_0e1f90ef-6932-43c1-bb6a-d25451d794e8/kube-storage-version-migrator-operator/0.log"
Apr 21 07:56:47.733996 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:47.733970 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-9fzd8/perf-node-gather-daemonset-8rx7p"
Apr 21 07:56:47.898318 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:47.898290 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6p77p_f1b4cd3f-6d6c-4602-aff3-3e7007b411bc/kube-multus-additional-cni-plugins/0.log"
Apr 21 07:56:47.918926 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:47.918899 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6p77p_f1b4cd3f-6d6c-4602-aff3-3e7007b411bc/egress-router-binary-copy/0.log"
Apr 21 07:56:47.939618 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:47.939590 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6p77p_f1b4cd3f-6d6c-4602-aff3-3e7007b411bc/cni-plugins/0.log"
Apr 21 07:56:47.960280 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:47.960254 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6p77p_f1b4cd3f-6d6c-4602-aff3-3e7007b411bc/bond-cni-plugin/0.log"
Apr 21 07:56:47.980441 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:47.980415 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6p77p_f1b4cd3f-6d6c-4602-aff3-3e7007b411bc/routeoverride-cni/0.log"
Apr 21 07:56:48.000141 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:48.000077 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6p77p_f1b4cd3f-6d6c-4602-aff3-3e7007b411bc/whereabouts-cni-bincopy/0.log"
Apr 21 07:56:48.018763 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:48.018733 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6p77p_f1b4cd3f-6d6c-4602-aff3-3e7007b411bc/whereabouts-cni/0.log"
Apr 21 07:56:48.362734 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:48.362706 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-p59jc_c7b371b9-c08d-40a3-b1c7-71f402fdf061/kube-multus/0.log"
Apr 21 07:56:48.379661 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:48.379637 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bbxng_62e778a8-8270-4560-9d0d-41a95a3c9c5f/network-metrics-daemon/0.log"
Apr 21 07:56:48.397721 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:48.397696 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-bbxng_62e778a8-8270-4560-9d0d-41a95a3c9c5f/kube-rbac-proxy/0.log"
Apr 21 07:56:49.635630 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:49.635605 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kshzg_cc312109-c0ed-49ee-b44b-aebf49f43c92/ovn-controller/0.log"
Apr 21 07:56:49.651677 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:49.651652 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kshzg_cc312109-c0ed-49ee-b44b-aebf49f43c92/ovn-acl-logging/0.log"
Apr 21 07:56:49.653232 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:49.653218 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kshzg_cc312109-c0ed-49ee-b44b-aebf49f43c92/ovn-acl-logging/1.log"
Apr 21 07:56:49.669549 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:49.669520 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kshzg_cc312109-c0ed-49ee-b44b-aebf49f43c92/kube-rbac-proxy-node/0.log"
Apr 21 07:56:49.690232 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:49.690209 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kshzg_cc312109-c0ed-49ee-b44b-aebf49f43c92/kube-rbac-proxy-ovn-metrics/0.log"
Apr 21 07:56:49.709106 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:49.709039 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kshzg_cc312109-c0ed-49ee-b44b-aebf49f43c92/northd/0.log"
Apr 21 07:56:49.729205 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:49.729185 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kshzg_cc312109-c0ed-49ee-b44b-aebf49f43c92/nbdb/0.log"
Apr 21 07:56:49.750530 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:49.750508 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kshzg_cc312109-c0ed-49ee-b44b-aebf49f43c92/sbdb/0.log"
Apr 21 07:56:49.834134 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:49.834101 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kshzg_cc312109-c0ed-49ee-b44b-aebf49f43c92/ovnkube-controller/0.log"
Apr 21 07:56:50.863973 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:50.863931 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-dkmsp_29603244-ac93-448d-b27a-816d739bf681/check-endpoints/0.log"
Apr 21 07:56:50.902857 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:50.902824 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-htrsr_9cfb4a7f-bf8c-41ae-9276-00c43802c31a/network-check-target-container/0.log"
Apr 21 07:56:51.781613 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:51.781541 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-x9g68_b142b7ae-a1de-435f-a326-eb9c1b40ba2e/iptables-alerter/0.log"
Apr 21 07:56:52.401080 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:52.401011 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-ghbj6_4ec4ea1e-b376-4d45-b221-3c152bf5f6ec/tuned/0.log"
Apr 21 07:56:53.764706 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:53.764662 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-xnjzr_e3ee1e09-3f66-4942-b704-81077b9efa31/cluster-samples-operator/0.log"
Apr 21 07:56:53.778506 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:53.778485 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-xnjzr_e3ee1e09-3f66-4942-b704-81077b9efa31/cluster-samples-operator-watch/0.log"
Apr 21 07:56:55.210353 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:55.210319 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-2j5gd_eb9ab596-4aad-415a-8f6b-3a6340b43812/csi-driver/0.log"
Apr 21 07:56:55.226075 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:55.226047 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-2j5gd_eb9ab596-4aad-415a-8f6b-3a6340b43812/csi-node-driver-registrar/0.log"
Apr 21 07:56:55.242909 ip-10-0-137-194 kubenswrapper[2570]: I0421 07:56:55.242881 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-2j5gd_eb9ab596-4aad-415a-8f6b-3a6340b43812/csi-liveness-probe/0.log"