Apr 20 22:23:46.913118 ip-10-0-130-91 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 22:23:47.406762 ip-10-0-130-91 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 22:23:47.406762 ip-10-0-130-91 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 22:23:47.406762 ip-10-0-130-91 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 22:23:47.406762 ip-10-0-130-91 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 22:23:47.406762 ip-10-0-130-91 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 22:23:47.408398 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.408309 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
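The five deprecation warnings above all point at the same remediation: move those flags into the KubeletConfiguration file named by --config (on this node /etc/kubernetes/kubelet.conf, per the FLAG dump further down). A minimal sketch of the mapping, assuming the stock kubelet.config.k8s.io/v1beta1 schema and nothing OpenShift-specific; verify the field names against your kubelet version:

```python
# Sketch: deprecated kubelet flags from the warnings above, mapped to their
# KubeletConfiguration (kubelet.config.k8s.io/v1beta1) counterparts.
FLAG_TO_CONFIG_FIELD = {
    "--container-runtime-endpoint": "containerRuntimeEndpoint",
    "--volume-plugin-dir": "volumePluginDir",
    "--system-reserved": "systemReserved",  # a map, e.g. {"cpu": "500m", "memory": "1Gi"}
    # Replaced by the eviction settings rather than a dedicated field:
    "--minimum-container-ttl-duration": "evictionHard / evictionSoft",
    # No config-file field; per the server.go message above, the sandbox image
    # belongs in the CRI runtime config instead (e.g. pause_image for CRI-O):
    "--pod-infra-container-image": None,
}

for flag, field in FLAG_TO_CONFIG_FIELD.items():
    print(f"{flag:36} -> {field or 'configure in the container runtime'}")
```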
Apr 20 22:23:47.413227 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413212 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 22:23:47.413227 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413227 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413232 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413236 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413239 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413243 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413246 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413249 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413252 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413259 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413261 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413264 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413267 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413269 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413272 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413280 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413282 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413285 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413287 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413290 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413292 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 22:23:47.413295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413295 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 22:23:47.413766 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413298 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 22:23:47.413766 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413301 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 22:23:47.413766 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413304 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 22:23:47.413766 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413307 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 22:23:47.413766 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413310 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 22:23:47.413766 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413312 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 22:23:47.413766 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413315 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 22:23:47.413766 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413317 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 22:23:47.413766 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413320 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 22:23:47.413766 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413322 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 22:23:47.413766 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413325 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 22:23:47.413766 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413329 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 22:23:47.413766 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413333 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 22:23:47.413766 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413336 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 22:23:47.413766 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413338 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 22:23:47.413766 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413341 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 22:23:47.413766 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413344 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 22:23:47.413766 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413346 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 22:23:47.413766 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413349 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413351 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413354 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413356 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413359 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413361 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413364 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413366 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413374 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413377 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413379 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413382 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413384 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413387 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413389 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413392 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413395 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413398 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413400 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413403 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 22:23:47.414236 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413406 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413409 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413411 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413414 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413416 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413420 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413423 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413426 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413428 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413431 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413433 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413436 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413438 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413442 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413447 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413450 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413453 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413455 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413458 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413462 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 22:23:47.414732 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413464 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413467 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413470 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413472 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413475 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413478 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413884 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413890 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413893 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413897 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413900 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413903 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413905 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413908 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413911 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413913 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413916 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413918 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413922 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413924 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 22:23:47.415269 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413927 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413929 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413931 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413934 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413936 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413939 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413941 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413944 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413947 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413949 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413952 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413955 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413958 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413960 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413962 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413965 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413968 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413970 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413973 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413976 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 22:23:47.415752 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413979 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 22:23:47.416275 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413983 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 22:23:47.416275 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413986 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 22:23:47.416275 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413989 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 22:23:47.416275 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413992 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 22:23:47.416275 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413994 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 22:23:47.416275 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413997 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 22:23:47.416275 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.413999 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 22:23:47.416275 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414002 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 22:23:47.416275 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414005 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 22:23:47.416275 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414007 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 22:23:47.416275 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414010 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 22:23:47.416275 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414013 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 22:23:47.416275 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414015 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 22:23:47.416275 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414018 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 22:23:47.416275 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414020 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 22:23:47.416275 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414023 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 22:23:47.416275 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414025 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 22:23:47.416275 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414028 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 22:23:47.416275 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414030 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 22:23:47.416743 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414034 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 22:23:47.416743 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414036 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 22:23:47.416743 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414039 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 22:23:47.416743 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414041 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 22:23:47.416743 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414044 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 22:23:47.416743 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414047 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 22:23:47.416743 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414049 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 22:23:47.416743 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414051 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 22:23:47.416743 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414054 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 22:23:47.416743 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414057 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 22:23:47.416743 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414060 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 22:23:47.416743 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414062 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 22:23:47.416743 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414065 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 22:23:47.416743 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414067 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 22:23:47.416743 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414071 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 22:23:47.416743 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414074 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 22:23:47.416743 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414077 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 22:23:47.416743 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414080 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 22:23:47.416743 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414083 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414086 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414089 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414092 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414095 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414097 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414100 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414102 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414105 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414107 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414110 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414112 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414115 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.414117 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
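The gates named in this wall of warnings appear to be OpenShift-level feature gates (ManagedBootImages, MachineConfigNodes, NewOLM, and so on) that the kubelet's upstream Kubernetes gate registry does not know, so feature_gate.go reports each of them as unrecognized; the warnings are benign, and the list is re-logged on every pass over the gate set, which is why the same names recur below. A small sketch for collapsing the noise to one line per gate, assuming journal output on stdin (the script and regex are illustrative, not kubelet tooling):

```python
import re
import sys
from collections import Counter

# Matches the feature_gate.go:328 warning lines shown above.
GATE_RE = re.compile(r"unrecognized feature gate: (\S+)")

# Count how often each unknown gate was reported (requires Python 3.8+).
counts = Counter(m.group(1) for line in sys.stdin if (m := GATE_RE.search(line)))

for gate, n in counts.most_common():
    print(f"{n:3d}x {gate}")
```

Run it as, for example, `journalctl -u kubelet | python3 gate_summary.py`; the unit name and script name are assumptions, not taken from this log.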
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415218 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415227 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415235 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415252 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415258 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415261 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415265 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 22:23:47.417232 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415270 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415273 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415276 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415280 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415284 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415287 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415291 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415294 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415297 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415300 2573 flags.go:64] FLAG: --cloud-config=""
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415303 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415306 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415310 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415315 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415318 2573 flags.go:64] FLAG: --config-dir=""
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415321 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415325 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415329 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415333 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415336 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415340 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415343 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415346 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415349 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415352 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 22:23:47.417748 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415356 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415360 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415363 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415366 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415369 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415372 2573 flags.go:64] FLAG: --enable-server="true"
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415375 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415380 2573 flags.go:64] FLAG: --event-burst="100"
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415384 2573 flags.go:64] FLAG: --event-qps="50"
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415386 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415390 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415393 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415397 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415400 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415403 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415405 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415408 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415411 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415414 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415417 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415421 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415424 2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415427 2573 flags.go:64] FLAG: --feature-gates=""
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415431 2573 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415434 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 22:23:47.418366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415437 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415441 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415444 2573 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415447 2573 flags.go:64] FLAG: --help="false"
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415450 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-130-91.ec2.internal"
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415453 2573 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415456 2573 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415459 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415463 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415466 2573 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415469 2573 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415472 2573 flags.go:64] FLAG: --image-service-endpoint=""
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415475 2573 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415478 2573 flags.go:64] FLAG: --kube-api-burst="100"
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415480 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415483 2573 flags.go:64] FLAG: --kube-api-qps="50"
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415486 2573 flags.go:64] FLAG: --kube-reserved=""
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415489 2573 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415492 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415495 2573 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415498 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415501 2573 flags.go:64] FLAG: --lock-file=""
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415504 2573 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415507 2573 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 20 22:23:47.418988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415510 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415516 2573 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415519 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415523 2573 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415526 2573 flags.go:64] FLAG: --logging-format="text"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415529 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415532 2573 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415535 2573 flags.go:64] FLAG: --manifest-url=""
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415538 2573 flags.go:64] FLAG: --manifest-url-header=""
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415543 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415546 2573 flags.go:64] FLAG: --max-open-files="1000000"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415550 2573 flags.go:64] FLAG: --max-pods="110"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415553 2573 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415556 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415560 2573 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415563 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415566 2573 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415569 2573 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415572 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415579 2573 flags.go:64] FLAG: --node-status-max-images="50"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415582 2573 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415585 2573 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415588 2573 flags.go:64] FLAG: --pod-cidr=""
Apr 20 22:23:47.419639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415591 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415598 2573 flags.go:64] FLAG: --pod-manifest-path=""
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415601 2573 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415604 2573 flags.go:64] FLAG: --pods-per-core="0"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415607 2573 flags.go:64] FLAG: --port="10250"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415610 2573 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415613 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-019ecc4184aa6b9bf"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415616 2573 flags.go:64] FLAG: --qos-reserved=""
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415619 2573 flags.go:64] FLAG: --read-only-port="10255"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415622 2573 flags.go:64] FLAG: --register-node="true"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415625 2573 flags.go:64] FLAG: --register-schedulable="true"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415628 2573 flags.go:64] FLAG: --register-with-taints=""
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415633 2573 flags.go:64] FLAG: --registry-burst="10"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415636 2573 flags.go:64] FLAG: --registry-qps="5"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415639 2573 flags.go:64] FLAG: --reserved-cpus=""
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415642 2573 flags.go:64] FLAG: --reserved-memory=""
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415645 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415649 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415652 2573 flags.go:64] FLAG: --rotate-certificates="false"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415655 2573 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415658 2573 flags.go:64] FLAG: --runonce="false"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415661 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415664 2573 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415667 2573 flags.go:64] FLAG: --seccomp-default="false"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415673 2573 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415676 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 20 22:23:47.420253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415679 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415682 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415685 2573 flags.go:64] FLAG: --storage-driver-password="root"
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415688 2573 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415691 2573 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415694 2573 flags.go:64] FLAG: --storage-driver-user="root"
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415697 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415700 2573 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415703 2573 flags.go:64] FLAG: --system-cgroups=""
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415706 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415711 2573 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415714 2573 flags.go:64] FLAG: --tls-cert-file=""
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415718 2573 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415722 2573 flags.go:64] FLAG: --tls-min-version=""
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415724 2573 flags.go:64] FLAG: --tls-private-key-file=""
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415727 2573 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415730 2573 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415733 2573 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415738 2573 flags.go:64] FLAG: --v="2"
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415742 2573 flags.go:64] FLAG: --version="false"
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415749 2573 flags.go:64] FLAG: --vmodule=""
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415753 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.415756 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
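At this verbosity (--v=2) the kubelet dumps every command-line flag through flags.go:64 before the --config file is merged in, which makes the dump a convenient baseline for spotting drift between what the systemd unit passes and what /etc/kubernetes/kubelet.conf will ultimately enforce. A sketch for loading the dump into a dict, again assuming journal output on stdin and an illustrative regex:

```python
import re
import sys

# Matches the flags.go:64 dump lines, e.g.
#   ... flags.go:64] FLAG: --max-pods="110"
FLAG_RE = re.compile(r'flags\.go:\d+\] FLAG: (--[\w.-]+)="(.*)"')

flags = {m.group(1): m.group(2) for line in sys.stdin if (m := FLAG_RE.search(line))}

# Example: surface the values behind the deprecation warnings at the top.
for name in ("--container-runtime-endpoint", "--system-reserved",
             "--volume-plugin-dir", "--minimum-container-ttl-duration"):
    print(name, "=", flags.get(name, "<not set on the command line>"))
```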
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415882 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 22:23:47.420887 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415887 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415890 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415893 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415896 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415899 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415902 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415907 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415910 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415913 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415915 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415918 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415921 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415923 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415926 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415929 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415931 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415934 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415936 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415939 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415942 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 22:23:47.421474 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415944 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 22:23:47.421989 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415947 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 22:23:47.421989 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415949 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 22:23:47.421989 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415953 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 22:23:47.421989 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415957 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 22:23:47.421989 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415960 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 22:23:47.421989 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415964 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 22:23:47.421989 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415967 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 22:23:47.421989 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415970 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 22:23:47.421989 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415973 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 22:23:47.421989 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415975 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 22:23:47.421989 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415978 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 22:23:47.421989 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415981 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 22:23:47.421989 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415983 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 22:23:47.421989 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415986 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 22:23:47.421989 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415988 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 22:23:47.421989 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415992 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 22:23:47.421989 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415994 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 22:23:47.421989 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.415999 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 22:23:47.421989 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416002 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416005 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416007 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416010 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416012 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416015 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416018 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416020 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416023 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416026 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416029 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416031 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416034 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416036 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416039 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416041 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416044 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416047 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416050 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416053 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 22:23:47.422463 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416056 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 22:23:47.422988 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416058 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 22:23:47.422988 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416061 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 22:23:47.422988 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416063 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 22:23:47.422988 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416066 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 22:23:47.422988 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416068 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 22:23:47.422988 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416071 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 22:23:47.422988 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416073 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 22:23:47.422988 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416076 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 22:23:47.422988 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416078 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 22:23:47.422988
ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416082 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 22:23:47.422988 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416085 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 20 22:23:47.422988 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416088 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 22:23:47.422988 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416090 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 22:23:47.422988 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416094 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 22:23:47.422988 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416098 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 22:23:47.422988 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416101 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 22:23:47.422988 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416104 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 22:23:47.422988 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416107 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 22:23:47.422988 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416109 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 22:23:47.422988 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416112 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 22:23:47.423479 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416114 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 22:23:47.423479 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416117 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 22:23:47.423479 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416120 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 22:23:47.423479 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416122 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 22:23:47.423479 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.416125 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 22:23:47.423479 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.416837 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 22:23:47.425803 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.425784 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 22:23:47.425837 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.425803 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 22:23:47.425882 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425878 2573 feature_gate.go:328] unrecognized feature gate: 
ClusterMonitoringConfig Apr 20 22:23:47.425912 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425884 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 22:23:47.425912 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425887 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 22:23:47.425912 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425891 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 22:23:47.425912 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425894 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 22:23:47.425912 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425897 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 22:23:47.425912 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425900 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 22:23:47.425912 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425903 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 22:23:47.425912 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425906 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 22:23:47.425912 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425908 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 22:23:47.425912 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425911 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 22:23:47.425912 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425915 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 22:23:47.425912 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425918 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 22:23:47.426203 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425921 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 22:23:47.426203 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425923 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 22:23:47.426203 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425926 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 22:23:47.426203 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425928 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 22:23:47.426203 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425931 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 22:23:47.426203 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425934 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 22:23:47.426203 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425937 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 22:23:47.426203 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425939 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 22:23:47.426203 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425942 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 22:23:47.426203 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425944 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 22:23:47.426203 ip-10-0-130-91 
kubenswrapper[2573]: W0420 22:23:47.425947 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 22:23:47.426203 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425949 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 22:23:47.426203 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425952 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 22:23:47.426203 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425954 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 20 22:23:47.426203 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425957 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 22:23:47.426203 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425960 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 22:23:47.426203 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425962 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 22:23:47.426203 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425965 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 22:23:47.426203 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425968 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425971 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425974 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425976 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425979 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425981 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425984 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425987 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425989 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425992 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425995 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.425997 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426000 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426002 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426005 2573 feature_gate.go:328] unrecognized feature gate: 
InsightsConfigAPI Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426008 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426010 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426013 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426016 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426019 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 22:23:47.426668 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426021 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 22:23:47.427181 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426024 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 22:23:47.427181 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426027 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 22:23:47.427181 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426029 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 22:23:47.427181 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426032 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 22:23:47.427181 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426034 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 22:23:47.427181 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426037 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 22:23:47.427181 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426039 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 22:23:47.427181 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426042 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 22:23:47.427181 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426044 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 22:23:47.427181 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426047 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 22:23:47.427181 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426050 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 22:23:47.427181 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426052 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 22:23:47.427181 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426055 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 22:23:47.427181 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426058 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 22:23:47.427181 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426061 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 22:23:47.427181 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426063 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 22:23:47.427181 ip-10-0-130-91 
kubenswrapper[2573]: W0420 22:23:47.426066 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 22:23:47.427181 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426068 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 22:23:47.427181 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426071 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 22:23:47.427181 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426074 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 22:23:47.427667 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426076 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 22:23:47.427667 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426079 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 22:23:47.427667 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426082 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 22:23:47.427667 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426084 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 22:23:47.427667 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426087 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 22:23:47.427667 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426089 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 22:23:47.427667 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426093 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 22:23:47.427667 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426095 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 22:23:47.427667 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426099 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 22:23:47.427667 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426103 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 22:23:47.427667 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426107 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 22:23:47.427667 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426109 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 22:23:47.427667 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426112 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 22:23:47.427667 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426117 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
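
The second configuration pass ends here; the kubelet applies its gate map more than once during startup, so the identical warning set repeats in full (a third pass follows below, and each pass ends in the same effective "feature gates: {map[...]}" summary). To see the distinct names just once, a small self-contained filter over the journal text is enough. This helper is an assumption of this write-up, not kubelet tooling; it keys off the literal marker string in these records:

```go
// Deduplicate "unrecognized feature gate: <Name>" warnings from a log
// stream on stdin and print each distinct gate name once, sorted.
package main

import (
	"bufio"
	"fmt"
	"os"
	"sort"
	"strings"
)

func main() {
	const marker = "unrecognized feature gate: "
	seen := map[string]struct{}{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		line := sc.Text()
		i := strings.Index(line, marker)
		if i < 0 {
			continue
		}
		fields := strings.Fields(line[i+len(marker):])
		if len(fields) == 0 {
			continue // marker fell at the end of a wrapped line
		}
		seen[fields[0]] = struct{}{}
	}
	names := make([]string, 0, len(seen))
	for n := range seen {
		names = append(names, n)
	}
	sort.Strings(names)
	for _, n := range names {
		fmt.Println(n)
	}
}
```

Piped from something like journalctl -u kubelet (assuming that is the unit name on this host), it reduces the repeated passes to one line per gate.
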
Apr 20 22:23:47.427667 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.426123 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426221 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426226 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426229 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426231 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426234 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426237 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426240 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426242 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426245 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426248 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426250 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426253 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426255 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426258 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426261 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426263 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426266 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426269 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426273 2573 feature_gate.go:351] Setting GA feature gate 
ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 22:23:47.428069 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426277 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426280 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426283 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426286 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426289 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426292 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426295 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426297 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426300 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426302 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426305 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426308 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426310 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426313 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426315 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426318 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426320 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426323 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426325 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426328 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 22:23:47.428544 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426330 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 22:23:47.429209 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426333 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 22:23:47.429209 
ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426335 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 22:23:47.429209 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426338 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 22:23:47.429209 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426341 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 22:23:47.429209 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426343 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 22:23:47.429209 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426346 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 22:23:47.429209 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426348 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 22:23:47.429209 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426351 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 22:23:47.429209 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426353 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 22:23:47.429209 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426356 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 22:23:47.429209 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426358 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 22:23:47.429209 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426361 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 22:23:47.429209 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426363 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 22:23:47.429209 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426366 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 22:23:47.429209 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426368 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 22:23:47.429209 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426372 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 22:23:47.429209 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426374 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 22:23:47.429209 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426377 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 22:23:47.429209 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426379 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 22:23:47.429209 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426381 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 22:23:47.429701 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426384 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 22:23:47.429701 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426386 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 22:23:47.429701 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426389 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 22:23:47.429701 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426391 2573 feature_gate.go:328] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Apr 20 22:23:47.429701 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426394 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 22:23:47.429701 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426397 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 22:23:47.429701 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426400 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 22:23:47.429701 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426404 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 22:23:47.429701 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426407 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 22:23:47.429701 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426409 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 22:23:47.429701 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426412 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 22:23:47.429701 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426414 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 22:23:47.429701 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426417 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 22:23:47.429701 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426419 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 22:23:47.429701 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426421 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 22:23:47.429701 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426424 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 22:23:47.429701 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426427 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 22:23:47.429701 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426430 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 22:23:47.429701 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426432 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 22:23:47.430183 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426435 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 22:23:47.430183 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426438 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 22:23:47.430183 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426440 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 22:23:47.430183 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426443 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 22:23:47.430183 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426445 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 22:23:47.430183 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426448 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 22:23:47.430183 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:47.426451 2573 feature_gate.go:328] unrecognized feature 
gate: InsightsOnDemandDataGather Apr 20 22:23:47.430183 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.426456 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 22:23:47.430183 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.427613 2573 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 22:23:47.430183 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.429462 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 22:23:47.430654 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.430642 2573 server.go:1019] "Starting client certificate rotation" Apr 20 22:23:47.430753 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.430738 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 22:23:47.430789 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.430780 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 22:23:47.461403 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.461385 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 22:23:47.468943 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.468920 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 22:23:47.480559 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.480543 2573 log.go:25] "Validated CRI v1 runtime API" Apr 20 22:23:47.487948 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.487925 2573 log.go:25] "Validated CRI v1 image API" Apr 20 22:23:47.489105 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.489080 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 22:23:47.491640 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.491623 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 22:23:47.493791 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.493764 2573 fs.go:135] Filesystem UUIDs: map[21977a62-715c-45bf-8ba5-b538e0a3b423:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 e1ad7e23-0967-42ca-b06e-9a1fa1bdc31d:/dev/nvme0n1p3] Apr 20 22:23:47.493867 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.493790 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 22:23:47.499811 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.499704 2573 manager.go:217] Machine: 
{Timestamp:2026-04-20 22:23:47.497474891 +0000 UTC m=+0.450429501 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100628 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec21437ceaa3181180a52768ca45f97c SystemUUID:ec21437c-eaa3-1811-80a5-2768ca45f97c BootID:a174d27d-8daa-4af2-8055-8a6c8e881013 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f2:6f:6e:7d:db Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f2:6f:6e:7d:db Speed:0 Mtu:9001} {Name:ovs-system MacAddress:22:5c:a7:c4:df:16 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 20 22:23:47.499811 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.499803 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
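
For scale, the byte counts in the Machine record above are easier to read in GiB (2^30 bytes); a quick conversion sketch with the values copied straight from the log:

```go
// Convert the capacities reported in the Machine record to GiB.
package main

import "fmt"

func gib(b uint64) float64 { return float64(b) / (1 << 30) }

func main() {
	fmt.Printf("MemoryCapacity: %.1f GiB\n", gib(33164488704))  // ~30.9 GiB
	fmt.Printf("nvme0n1 disk:   %.1f GiB\n", gib(128849018880)) // 120.0 GiB
	fmt.Printf("/var (p4):      %.1f GiB\n", gib(128243970048)) // ~119.4 GiB
}
```

So this is a 120 GiB instance volume with nearly all of it mounted at /var, and roughly 31 GiB of RAM with no swap.
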
Apr 20 22:23:47.499946 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.499927 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 20 22:23:47.500995 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.500970 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 20 22:23:47.501129 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.500996 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-91.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 22:23:47.501182 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.501138 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 22:23:47.501182 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.501147 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 22:23:47.501182 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.501160 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 22:23:47.502429 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.502419 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 22:23:47.503791 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.503781 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 20 22:23:47.503925 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.503916 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 22:23:47.506308 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.506299 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 20 22:23:47.506361 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.506336 2573 kubelet.go:386] "Adding static pod path" 
path="/etc/kubernetes/manifests" Apr 20 22:23:47.506361 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.506351 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 22:23:47.506361 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.506360 2573 kubelet.go:397] "Adding apiserver pod source" Apr 20 22:23:47.506487 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.506369 2573 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 22:23:47.507466 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.507452 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 22:23:47.507527 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.507478 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 22:23:47.510780 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.510767 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 22:23:47.514401 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.514380 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 22:23:47.516735 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.516718 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 22:23:47.516735 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.516738 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 22:23:47.516843 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.516745 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 22:23:47.516843 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.516752 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 22:23:47.516843 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.516761 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 22:23:47.516843 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.516766 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 22:23:47.516843 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.516772 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 22:23:47.516843 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.516777 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 22:23:47.516843 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.516785 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 22:23:47.516843 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.516791 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 22:23:47.516843 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.516799 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 22:23:47.516843 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.516808 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 22:23:47.517727 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.517716 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 22:23:47.517727 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.517728 2573 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 20 22:23:47.517953 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:47.517930 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-91.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 22:23:47.518085 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:47.518064 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 22:23:47.521226 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.521212 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 22:23:47.521293 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.521246 2573 server.go:1295] "Started kubelet" Apr 20 22:23:47.521354 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.521329 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 22:23:47.521406 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.521333 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 22:23:47.521406 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.521394 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 22:23:47.522159 ip-10-0-130-91 systemd[1]: Started Kubernetes Kubelet. Apr 20 22:23:47.522702 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.522523 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 22:23:47.523735 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.523723 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 20 22:23:47.527129 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.527106 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 22:23:47.527586 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.527566 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 22:23:47.527990 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.527872 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 22:23:47.527990 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.527899 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 22:23:47.527990 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.527884 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 22:23:47.527990 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.527974 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 20 22:23:47.527990 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.527982 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 20 22:23:47.527990 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.527990 2573 factory.go:55] Registering systemd factory Apr 20 22:23:47.528292 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.528010 2573 factory.go:223] Registration of the systemd container factory successfully Apr 20 22:23:47.528292 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:47.528226 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-91.ec2.internal\" not found" Apr 
20 22:23:47.528698 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.528673 2573 factory.go:153] Registering CRI-O factory
Apr 20 22:23:47.528786 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.528694 2573 factory.go:223] Registration of the crio container factory successfully
Apr 20 22:23:47.528865 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.528797 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 22:23:47.528865 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.528823 2573 factory.go:103] Registering Raw factory
Apr 20 22:23:47.528974 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.528881 2573 manager.go:1196] Started watching for new ooms in manager
Apr 20 22:23:47.529391 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.529372 2573 manager.go:319] Starting recovery of all containers
Apr 20 22:23:47.529825 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:47.529802 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 22:23:47.530154 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.530135 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-91.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 22:23:47.530274 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:47.529344 2573 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-130-91.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 20 22:23:47.531206 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:47.529661 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-91.ec2.internal.18a830e85b5ffe6a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-91.ec2.internal,UID:ip-10-0-130-91.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-91.ec2.internal,},FirstTimestamp:2026-04-20 22:23:47.521224298 +0000 UTC m=+0.474178909,LastTimestamp:2026-04-20 22:23:47.521224298 +0000 UTC m=+0.474178909,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-91.ec2.internal,}"
Apr 20 22:23:47.533282 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:47.533250 2573 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 22:23:47.535770 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.535746 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kknvt"
Apr 20 22:23:47.539460 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.539438 2573 manager.go:324] Recovery completed
Apr 20 22:23:47.541941 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.541917 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-kknvt"
Apr 20 22:23:47.544923 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.544880 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 22:23:47.547408 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.547390 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-91.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 22:23:47.547477 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.547423 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-91.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 22:23:47.547477 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.547433 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-91.ec2.internal" event="NodeHasSufficientPID"
Apr 20 22:23:47.547932 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.547921 2573 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 22:23:47.547932 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.547932 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 22:23:47.548013 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.547947 2573 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 22:23:47.548904 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:47.548824 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-91.ec2.internal.18a830e85cef8c32 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-91.ec2.internal,UID:ip-10-0-130-91.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-130-91.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-130-91.ec2.internal,},FirstTimestamp:2026-04-20 22:23:47.547409458 +0000 UTC m=+0.500364074,LastTimestamp:2026-04-20 22:23:47.547409458 +0000 UTC m=+0.500364074,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-91.ec2.internal,}"
Apr 20 22:23:47.551166 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.551154 2573 policy_none.go:49] "None policy: Start"
Apr 20 22:23:47.551219 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.551170 2573 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 22:23:47.551219 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.551179 2573 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 22:23:47.586423 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.586407 2573 manager.go:341] "Starting Device Plugin manager"
Apr 20 22:23:47.608454 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:47.586474 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 22:23:47.608454 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.586488 2573 server.go:85] "Starting device plugin registration server"
Apr 20 22:23:47.608454 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.586702 2573 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 22:23:47.608454 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.586712 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 22:23:47.608454 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.586807 2573 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 22:23:47.608454 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.586907 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 22:23:47.608454 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.586916 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 22:23:47.608454 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:47.587349 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 22:23:47.608454 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:47.587387 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-91.ec2.internal\" not found"
Apr 20 22:23:47.608454 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.594604 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 22:23:47.608454 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.595664 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 22:23:47.608454 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.595693 2573 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 22:23:47.608454 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.595710 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 22:23:47.608454 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.595716 2573 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 22:23:47.608454 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:47.595744 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 22:23:47.608454 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.599071 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 22:23:47.687468 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.687403 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 22:23:47.688385 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.688366 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-91.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 22:23:47.688493 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.688395 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-91.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 22:23:47.688493 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.688408 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-91.ec2.internal" event="NodeHasSufficientPID"
Apr 20 22:23:47.688493 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.688436 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-91.ec2.internal"
Apr 20 22:23:47.696096 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.696075 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-130-91.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal"]
Apr 20 22:23:47.696181 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.696141 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 22:23:47.696838 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.696820 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-91.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 22:23:47.696932 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.696847 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-91.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 22:23:47.696932 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.696883 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-91.ec2.internal" event="NodeHasSufficientPID"
Apr 20 22:23:47.697154 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.697135 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-91.ec2.internal"
Apr 20 22:23:47.697244 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:47.697156 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-91.ec2.internal\": node \"ip-10-0-130-91.ec2.internal\" not found"
Apr 20 22:23:47.699071 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.699054 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 22:23:47.699209 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.699195 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-91.ec2.internal"
Apr 20 22:23:47.699263 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.699226 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 22:23:47.699760 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.699745 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-91.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 22:23:47.699760 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.699745 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-91.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 22:23:47.699760 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.699772 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-91.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 22:23:47.699922 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.699782 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-91.ec2.internal" event="NodeHasSufficientPID"
Apr 20 22:23:47.699922 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.699791 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-91.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 22:23:47.699922 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.699803 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-91.ec2.internal" event="NodeHasSufficientPID"
Apr 20 22:23:47.701987 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.701970 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal"
Apr 20 22:23:47.702075 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.701999 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 22:23:47.702696 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.702675 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-91.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 22:23:47.702696 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.702697 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-91.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 22:23:47.702822 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.702710 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-91.ec2.internal" event="NodeHasSufficientPID"
Apr 20 22:23:47.711041 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:47.711023 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-91.ec2.internal\" not found"
Apr 20 22:23:47.720452 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:47.720431 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-91.ec2.internal\" not found" node="ip-10-0-130-91.ec2.internal"
Apr 20 22:23:47.723728 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:47.723713 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-91.ec2.internal\" not found" node="ip-10-0-130-91.ec2.internal"
Apr 20 22:23:47.811818 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:47.811794 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-91.ec2.internal\" not found"
Apr 20 22:23:47.828959 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.828930 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0d7985ee616f47b485ff61a5cec01dca-config\") pod \"kube-apiserver-proxy-ip-10-0-130-91.ec2.internal\" (UID: \"0d7985ee616f47b485ff61a5cec01dca\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-91.ec2.internal"
Apr 20 22:23:47.828959 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.828963 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/17685fe779018045b34417c0aceeb6de-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal\" (UID: \"17685fe779018045b34417c0aceeb6de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal"
Apr 20 22:23:47.829082 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.828982 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17685fe779018045b34417c0aceeb6de-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal\" (UID: \"17685fe779018045b34417c0aceeb6de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal"
Apr 20 22:23:47.912104 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:47.912074 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-91.ec2.internal\" not found"
Apr 20 22:23:47.929399 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.929377 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0d7985ee616f47b485ff61a5cec01dca-config\") pod \"kube-apiserver-proxy-ip-10-0-130-91.ec2.internal\" (UID: \"0d7985ee616f47b485ff61a5cec01dca\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-91.ec2.internal"
Apr 20 22:23:47.929507 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.929409 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/17685fe779018045b34417c0aceeb6de-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal\" (UID: \"17685fe779018045b34417c0aceeb6de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal"
Apr 20 22:23:47.929507 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.929434 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17685fe779018045b34417c0aceeb6de-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal\" (UID: \"17685fe779018045b34417c0aceeb6de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal"
Apr 20 22:23:47.929507 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.929481 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/17685fe779018045b34417c0aceeb6de-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal\" (UID: \"17685fe779018045b34417c0aceeb6de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal"
Apr 20 22:23:47.929507 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.929480 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17685fe779018045b34417c0aceeb6de-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal\" (UID: \"17685fe779018045b34417c0aceeb6de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal"
Apr 20 22:23:47.929651 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:47.929481 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/0d7985ee616f47b485ff61a5cec01dca-config\") pod \"kube-apiserver-proxy-ip-10-0-130-91.ec2.internal\" (UID: \"0d7985ee616f47b485ff61a5cec01dca\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-91.ec2.internal"
Apr 20 22:23:48.012785 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:48.012712 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-91.ec2.internal\" not found"
Apr 20 22:23:48.022926 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.022909 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-91.ec2.internal"
Apr 20 22:23:48.026383 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.026367 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal"
Apr 20 22:23:48.113017 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:48.112980 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-91.ec2.internal\" not found"
Apr 20 22:23:48.213488 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:48.213459 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-91.ec2.internal\" not found"
Apr 20 22:23:48.313955 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:48.313883 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-91.ec2.internal\" not found"
Apr 20 22:23:48.411350 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.411323 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 22:23:48.428676 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.428651 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-91.ec2.internal"
Apr 20 22:23:48.430810 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.430788 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 22:23:48.430944 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.430918 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 22:23:48.430944 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.430925 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 22:23:48.430944 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:48.430933 2573 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://a5c3af94c92304f43803c023123a4c99-0c6b9fdf5277f82e.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/kube-system/pods\": read tcp 10.0.130.91:34900->13.222.43.110:6443: use of closed network connection" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-91.ec2.internal"
Apr 20 22:23:48.431054 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.430953 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal"
Apr 20 22:23:48.452179 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.452155 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 22:23:48.506652 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.506630 2573 apiserver.go:52] "Watching apiserver"
Apr 20 22:23:48.517003 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.516985 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 22:23:48.519229 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.519206 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-2wb2c","openshift-cluster-node-tuning-operator/tuned-hcxd5","openshift-image-registry/node-ca-m4npw","openshift-network-operator/iptables-alerter-r76sg","openshift-ovn-kubernetes/ovnkube-node-2trzq","kube-system/konnectivity-agent-n2bhm","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565","openshift-dns/node-resolver-56xnh","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal","openshift-multus/multus-additional-cni-plugins-qvjgp","openshift-multus/multus-q9f2s","openshift-multus/network-metrics-daemon-rl87j"]
Apr 20 22:23:48.521966 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.521947 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.524078 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.524061 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-m4npw"
Apr 20 22:23:48.524341 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.524321 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 22:23:48.524499 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.524484 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-nnvlk\""
Apr 20 22:23:48.524545 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.524498 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 20 22:23:48.526214 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.526201 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 22:23:48.526321 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.526302 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-n79gs\""
Apr 20 22:23:48.526567 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.526549 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 22:23:48.526646 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.526575 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 22:23:48.526779 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.526766 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-r76sg"
Apr 20 22:23:48.527639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.527626 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 22:23:48.528899 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.528882 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565"
Apr 20 22:23:48.528962 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.528926 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 22:23:48.529176 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.529145 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 22:23:48.529272 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.529257 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 22:23:48.529402 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.529387 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-qdkbs\""
Apr 20 22:23:48.530846 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.530829 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-56xnh"
Apr 20 22:23:48.531034 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531018 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 22:23:48.531321 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531306 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 22:23:48.531408 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531326 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 22:23:48.531408 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531329 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-hzkrk\""
Apr 20 22:23:48.531509 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531416 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d341b24-8cdf-4d59-a97c-54cecc195860-host\") pod \"node-ca-m4npw\" (UID: \"1d341b24-8cdf-4d59-a97c-54cecc195860\") " pod="openshift-image-registry/node-ca-m4npw"
Apr 20 22:23:48.531509 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531449 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1d341b24-8cdf-4d59-a97c-54cecc195860-serviceca\") pod \"node-ca-m4npw\" (UID: \"1d341b24-8cdf-4d59-a97c-54cecc195860\") " pod="openshift-image-registry/node-ca-m4npw"
Apr 20 22:23:48.531509 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531474 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/764d33cb-2655-4fbe-a337-a91cf3ac5633-registration-dir\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565"
Apr 20 22:23:48.531509 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531499 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/764d33cb-2655-4fbe-a337-a91cf3ac5633-device-dir\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565"
Apr 20 22:23:48.531654 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531519 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-etc-sysconfig\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.531654 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531534 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-etc-sysctl-d\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.531654 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531547 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-etc-systemd\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.531654 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531588 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ghcl\" (UniqueName: \"kubernetes.io/projected/1d341b24-8cdf-4d59-a97c-54cecc195860-kube-api-access-5ghcl\") pod \"node-ca-m4npw\" (UID: \"1d341b24-8cdf-4d59-a97c-54cecc195860\") " pod="openshift-image-registry/node-ca-m4npw"
Apr 20 22:23:48.531654 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531604 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-etc-modprobe-d\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.531654 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531621 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg6l5\" (UniqueName: \"kubernetes.io/projected/764d33cb-2655-4fbe-a337-a91cf3ac5633-kube-api-access-wg6l5\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565"
Apr 20 22:23:48.531654 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531634 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-run\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.531654 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531648 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-var-lib-kubelet\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.532050 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531681 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0fb278ab-931f-4221-bed3-819801bca936-etc-tuned\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.532050 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531714 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhdps\" (UniqueName: \"kubernetes.io/projected/0fb278ab-931f-4221-bed3-819801bca936-kube-api-access-mhdps\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.532050 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531743 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/764d33cb-2655-4fbe-a337-a91cf3ac5633-etc-selinux\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565"
Apr 20 22:23:48.532050 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531772 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e9f5856-134e-449f-afaf-3ab93b2577c2-host-slash\") pod \"iptables-alerter-r76sg\" (UID: \"0e9f5856-134e-449f-afaf-3ab93b2577c2\") " pod="openshift-network-operator/iptables-alerter-r76sg"
Apr 20 22:23:48.532050 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531799 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/764d33cb-2655-4fbe-a337-a91cf3ac5633-socket-dir\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565"
Apr 20 22:23:48.532050 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531822 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-host\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.532050 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531845 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/764d33cb-2655-4fbe-a337-a91cf3ac5633-sys-fs\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565"
Apr 20 22:23:48.532050 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531883 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-etc-sysctl-conf\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.532050 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531908 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x79qp\" (UniqueName: \"kubernetes.io/projected/0e9f5856-134e-449f-afaf-3ab93b2577c2-kube-api-access-x79qp\") pod \"iptables-alerter-r76sg\" (UID: \"0e9f5856-134e-449f-afaf-3ab93b2577c2\") " pod="openshift-network-operator/iptables-alerter-r76sg"
Apr 20 22:23:48.532050 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531932 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-etc-kubernetes\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.532050 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531955 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-sys\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.532050 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531975 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fb278ab-931f-4221-bed3-819801bca936-tmp\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.532050 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.531997 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-lib-modules\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.532050 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.532022 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0e9f5856-134e-449f-afaf-3ab93b2577c2-iptables-alerter-script\") pod \"iptables-alerter-r76sg\" (UID: \"0e9f5856-134e-449f-afaf-3ab93b2577c2\") " pod="openshift-network-operator/iptables-alerter-r76sg"
Apr 20 22:23:48.532050 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.532047 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/764d33cb-2655-4fbe-a337-a91cf3ac5633-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565"
Apr 20 22:23:48.533082 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.533068 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qg7l5\""
Apr 20 22:23:48.533393 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.533376 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 22:23:48.533450 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.533412 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 22:23:48.533742 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.533730 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qvjgp"
Apr 20 22:23:48.535945 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.535929 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rl87j"
Apr 20 22:23:48.536053 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:48.535987 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rl87j" podUID="75df7794-7926-4023-a9fe-c8bb08e18219"
Apr 20 22:23:48.536208 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.536192 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 22:23:48.536283 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.536234 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-7bhr2\""
Apr 20 22:23:48.536283 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.536245 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 22:23:48.536283 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.536238 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 22:23:48.536434 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.536303 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 22:23:48.536434 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.536428 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 22:23:48.538111 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.538096 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wb2c"
Apr 20 22:23:48.538205 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:48.538176 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wb2c" podUID="601775e4-554d-4221-907f-4a5d646c32e4"
Apr 20 22:23:48.539018 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.538996 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 22:23:48.540359 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.540342 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2trzq"
Apr 20 22:23:48.542605 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.542588 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-n2bhm"
Apr 20 22:23:48.542994 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.542977 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 22:23:48.543238 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.543221 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 22:23:48.543298 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.543269 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pf45g\""
Apr 20 22:23:48.543363 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.543346 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 22:23:48.543572 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.543557 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 22:23:48.543754 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.543741 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 22:23:48.543879 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.543847 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 22:23:48.543974 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.543949 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 22:18:47 +0000 UTC" deadline="2028-01-16 17:33:13.082727634 +0000 UTC"
Apr 20 22:23:48.544023 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.543977 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15259h9m24.538755939s"
Apr 20 22:23:48.544870 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.544834 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q9f2s"
Apr 20 22:23:48.545037 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.545018 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-5j57c\""
Apr 20 22:23:48.545037 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.545032 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 22:23:48.546503 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.546119 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 22:23:48.548265 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.547514 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-n5pr5\""
Apr 20 22:23:48.548265 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.548014 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 22:23:48.561061 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.561039 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8xq8m"
Apr 20 22:23:48.570551 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.570503 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8xq8m"
Apr 20 22:23:48.591011 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:48.590987 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17685fe779018045b34417c0aceeb6de.slice/crio-c6ea7e705844bf669edf2a48455ff6c1d3242b7023fcbcc62206fed88d10b058 WatchSource:0}: Error finding container c6ea7e705844bf669edf2a48455ff6c1d3242b7023fcbcc62206fed88d10b058: Status 404 returned error can't find the container with id c6ea7e705844bf669edf2a48455ff6c1d3242b7023fcbcc62206fed88d10b058
Apr 20 22:23:48.591321 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:48.591305 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d7985ee616f47b485ff61a5cec01dca.slice/crio-860d4ae79438715fc3fac959c5843650a88efc66e255ca092a709143e8b6ee8c WatchSource:0}: Error finding container 860d4ae79438715fc3fac959c5843650a88efc66e255ca092a709143e8b6ee8c: Status 404 returned error can't find the container with id 860d4ae79438715fc3fac959c5843650a88efc66e255ca092a709143e8b6ee8c
Apr 20 22:23:48.595994 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.595978 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 22:23:48.598569 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.598524 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-91.ec2.internal" event={"ID":"0d7985ee616f47b485ff61a5cec01dca","Type":"ContainerStarted","Data":"860d4ae79438715fc3fac959c5843650a88efc66e255ca092a709143e8b6ee8c"}
Apr 20 22:23:48.599669 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.599644 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal" event={"ID":"17685fe779018045b34417c0aceeb6de","Type":"ContainerStarted","Data":"c6ea7e705844bf669edf2a48455ff6c1d3242b7023fcbcc62206fed88d10b058"}
Apr 20 22:23:48.629310 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.629294 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 22:23:48.632812 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.632795 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/42a32769-748f-43a7-95c5-8aea7b36621e-konnectivity-ca\") pod \"konnectivity-agent-n2bhm\" (UID: \"42a32769-748f-43a7-95c5-8aea7b36621e\") " pod="kube-system/konnectivity-agent-n2bhm"
Apr 20 22:23:48.632886 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.632821 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-system-cni-dir\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s"
Apr 20 22:23:48.632886 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.632839 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-host-run-netns\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s"
Apr 20 22:23:48.632886 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.632874 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5fe8d086-787c-4e3e-ac72-53b9ac48d390-multus-daemon-config\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s"
Apr 20 22:23:48.633037 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.632896 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-host-run-multus-certs\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s"
Apr 20 22:23:48.633037 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.632944 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-node-log\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq"
Apr 20 22:23:48.633037 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.632979 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-cni-bin\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq"
Apr 20 22:23:48.633037 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633011 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-host\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.633196 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633041 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/764d33cb-2655-4fbe-a337-a91cf3ac5633-sys-fs\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565"
Apr 20 22:23:48.633196 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633080 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-etc-sysctl-conf\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.633196 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633090 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-host\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.633196 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633134 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-os-release\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp"
Apr 20 22:23:48.633357 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633188 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-systemd-units\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq"
Apr 20 22:23:48.633357 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633227 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/764d33cb-2655-4fbe-a337-a91cf3ac5633-sys-fs\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565"
Apr 20 22:23:48.633357 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633233 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-etc-sysctl-conf\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.633357 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633273 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x79qp\" (UniqueName: \"kubernetes.io/projected/0e9f5856-134e-449f-afaf-3ab93b2577c2-kube-api-access-x79qp\") pod \"iptables-alerter-r76sg\" (UID: \"0e9f5856-134e-449f-afaf-3ab93b2577c2\") " pod="openshift-network-operator/iptables-alerter-r76sg"
Apr 20 22:23:48.633560 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633379 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-etc-kubernetes\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.633560 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633406 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fb278ab-931f-4221-bed3-819801bca936-tmp\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.633560 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633417 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-etc-kubernetes\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.633560 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633432 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-cnibin\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp"
Apr 20 22:23:48.633560 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633467 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp"
Apr 20 22:23:48.633560 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633505 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-kubelet\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq"
Apr 20 22:23:48.633560 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633528 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-log-socket\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq"
Apr 20 22:23:48.633560 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633553 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq"
Apr 20 22:23:48.633946 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633579 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-system-cni-dir\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp"
Apr 20 22:23:48.633946 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633604 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-os-release\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s"
Apr 20 22:23:48.633946 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633631 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 20 22:23:48.633946 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633638 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-var-lib-openvswitch\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq"
Apr 20 22:23:48.633946 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633684 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0e9f5856-134e-449f-afaf-3ab93b2577c2-iptables-alerter-script\") pod \"iptables-alerter-r76sg\" (UID: \"0e9f5856-134e-449f-afaf-3ab93b2577c2\") " pod="openshift-network-operator/iptables-alerter-r76sg"
Apr 20 22:23:48.633946 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633749 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/764d33cb-2655-4fbe-a337-a91cf3ac5633-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565"
Apr 20 22:23:48.633946 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633820 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/764d33cb-2655-4fbe-a337-a91cf3ac5633-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565"
Apr 20 22:23:48.633946 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633879 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-multus-socket-dir-parent\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s"
Apr 20 22:23:48.633946 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633907 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d341b24-8cdf-4d59-a97c-54cecc195860-host\") pod \"node-ca-m4npw\" (UID: \"1d341b24-8cdf-4d59-a97c-54cecc195860\") " pod="openshift-image-registry/node-ca-m4npw"
Apr 20 22:23:48.633946 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633932 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1d341b24-8cdf-4d59-a97c-54cecc195860-serviceca\") pod \"node-ca-m4npw\" (UID: \"1d341b24-8cdf-4d59-a97c-54cecc195860\") " pod="openshift-image-registry/node-ca-m4npw"
Apr 20 22:23:48.634396 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633968 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1d341b24-8cdf-4d59-a97c-54cecc195860-host\") pod \"node-ca-m4npw\" (UID: \"1d341b24-8cdf-4d59-a97c-54cecc195860\") " pod="openshift-image-registry/node-ca-m4npw"
Apr 20 22:23:48.634396 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.633981 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/764d33cb-2655-4fbe-a337-a91cf3ac5633-registration-dir\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565"
Apr 20 22:23:48.634396 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634007 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/764d33cb-2655-4fbe-a337-a91cf3ac5633-device-dir\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565"
Apr 20 22:23:48.634396 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634031 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-etc-sysctl-d\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.634396 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634047 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/764d33cb-2655-4fbe-a337-a91cf3ac5633-registration-dir\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565"
Apr 20 22:23:48.634396 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634058 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp"
Apr 20 22:23:48.634396 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634101 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/764d33cb-2655-4fbe-a337-a91cf3ac5633-device-dir\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565"
Apr 20 22:23:48.634396 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634121 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svn47\" (UniqueName: \"kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47\") pod \"network-check-target-2wb2c\" (UID: \"601775e4-554d-4221-907f-4a5d646c32e4\") " pod="openshift-network-diagnostics/network-check-target-2wb2c"
Apr 20 22:23:48.634396 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634163 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-cni-binary-copy\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp"
Apr 20 22:23:48.634396 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634184 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-etc-sysctl-d\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.634396 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634198 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/42a32769-748f-43a7-95c5-8aea7b36621e-agent-certs\") pod \"konnectivity-agent-n2bhm\" (UID: \"42a32769-748f-43a7-95c5-8aea7b36621e\") " pod="kube-system/konnectivity-agent-n2bhm"
Apr 20 22:23:48.634396 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634223 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs\") pod \"network-metrics-daemon-rl87j\" (UID: \"75df7794-7926-4023-a9fe-c8bb08e18219\") " pod="openshift-multus/network-metrics-daemon-rl87j"
Apr 20 22:23:48.634396 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634255 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0e9f5856-134e-449f-afaf-3ab93b2577c2-iptables-alerter-script\") pod \"iptables-alerter-r76sg\" (UID: \"0e9f5856-134e-449f-afaf-3ab93b2577c2\") " pod="openshift-network-operator/iptables-alerter-r76sg"
Apr 20 22:23:48.634396 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634247 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-run-netns\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq"
Apr 20 22:23:48.634396 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634302 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svw9p\" (UniqueName: \"kubernetes.io/projected/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-kube-api-access-svw9p\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq"
Apr 20 22:23:48.634396 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634329 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wg6l5\" (UniqueName: \"kubernetes.io/projected/764d33cb-2655-4fbe-a337-a91cf3ac5633-kube-api-access-wg6l5\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565"
Apr 20 22:23:48.635080 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634341 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/1d341b24-8cdf-4d59-a97c-54cecc195860-serviceca\") pod \"node-ca-m4npw\" (UID: \"1d341b24-8cdf-4d59-a97c-54cecc195860\") " pod="openshift-image-registry/node-ca-m4npw"
Apr 20 22:23:48.635080 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634366
2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-var-lib-kubelet\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" Apr 20 22:23:48.635080 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634400 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0fb278ab-931f-4221-bed3-819801bca936-etc-tuned\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" Apr 20 22:23:48.635080 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634435 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mhdps\" (UniqueName: \"kubernetes.io/projected/0fb278ab-931f-4221-bed3-819801bca936-kube-api-access-mhdps\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" Apr 20 22:23:48.635080 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634476 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-var-lib-kubelet\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" Apr 20 22:23:48.635080 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634483 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm8qv\" (UniqueName: \"kubernetes.io/projected/5fe8d086-787c-4e3e-ac72-53b9ac48d390-kube-api-access-pm8qv\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.635080 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634522 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3533f633-4984-403c-9826-8812fe861cca-tmp-dir\") pod \"node-resolver-56xnh\" (UID: \"3533f633-4984-403c-9826-8812fe861cca\") " pod="openshift-dns/node-resolver-56xnh" Apr 20 22:23:48.635080 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634576 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jwkr\" (UniqueName: \"kubernetes.io/projected/75df7794-7926-4023-a9fe-c8bb08e18219-kube-api-access-9jwkr\") pod \"network-metrics-daemon-rl87j\" (UID: \"75df7794-7926-4023-a9fe-c8bb08e18219\") " pod="openshift-multus/network-metrics-daemon-rl87j" Apr 20 22:23:48.635080 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634608 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-etc-openvswitch\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.635080 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634643 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/764d33cb-2655-4fbe-a337-a91cf3ac5633-etc-selinux\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: 
\"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565" Apr 20 22:23:48.635080 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634696 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e9f5856-134e-449f-afaf-3ab93b2577c2-host-slash\") pod \"iptables-alerter-r76sg\" (UID: \"0e9f5856-134e-449f-afaf-3ab93b2577c2\") " pod="openshift-network-operator/iptables-alerter-r76sg" Apr 20 22:23:48.635080 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634713 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp" Apr 20 22:23:48.635080 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634730 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/764d33cb-2655-4fbe-a337-a91cf3ac5633-etc-selinux\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565" Apr 20 22:23:48.635080 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634737 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e9f5856-134e-449f-afaf-3ab93b2577c2-host-slash\") pod \"iptables-alerter-r76sg\" (UID: \"0e9f5856-134e-449f-afaf-3ab93b2577c2\") " pod="openshift-network-operator/iptables-alerter-r76sg" Apr 20 22:23:48.635080 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634739 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-cnibin\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.635080 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634783 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-run-systemd\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.635080 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634817 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-ovnkube-config\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.635685 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634833 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-slash\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.635685 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634881 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-ovnkube-script-lib\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.635685 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634926 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-sys\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" Apr 20 22:23:48.635685 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634958 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-run-openvswitch\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.635685 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634976 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-run-ovn-kubernetes\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.635685 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.634991 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-cni-netd\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.635685 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635006 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-lib-modules\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" Apr 20 22:23:48.635685 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635026 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dg7r\" (UniqueName: \"kubernetes.io/projected/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-kube-api-access-6dg7r\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp" Apr 20 22:23:48.635685 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635066 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-host-var-lib-cni-bin\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.635685 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635052 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-sys\") pod \"tuned-hcxd5\" (UID: 
\"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" Apr 20 22:23:48.635685 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635098 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-host-var-lib-cni-multus\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.635685 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635143 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-lib-modules\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" Apr 20 22:23:48.635685 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635146 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-hostroot\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.635685 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635189 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3533f633-4984-403c-9826-8812fe861cca-hosts-file\") pod \"node-resolver-56xnh\" (UID: \"3533f633-4984-403c-9826-8812fe861cca\") " pod="openshift-dns/node-resolver-56xnh" Apr 20 22:23:48.635685 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635216 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-host-var-lib-kubelet\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.635685 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635239 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-multus-conf-dir\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.635685 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635261 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-run-ovn\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.636186 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635288 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-etc-sysconfig\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" Apr 20 22:23:48.636186 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635311 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-etc-systemd\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" Apr 20 22:23:48.636186 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635355 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-multus-cni-dir\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.636186 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635381 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-etc-systemd\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" Apr 20 22:23:48.636186 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635392 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-etc-kubernetes\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.636186 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635425 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-etc-sysconfig\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" Apr 20 22:23:48.636186 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635427 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ghcl\" (UniqueName: \"kubernetes.io/projected/1d341b24-8cdf-4d59-a97c-54cecc195860-kube-api-access-5ghcl\") pod \"node-ca-m4npw\" (UID: \"1d341b24-8cdf-4d59-a97c-54cecc195860\") " pod="openshift-image-registry/node-ca-m4npw" Apr 20 22:23:48.636186 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635454 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-etc-modprobe-d\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" Apr 20 22:23:48.636186 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635479 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5fe8d086-787c-4e3e-ac72-53b9ac48d390-cni-binary-copy\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.636186 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635502 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2mf6\" (UniqueName: \"kubernetes.io/projected/3533f633-4984-403c-9826-8812fe861cca-kube-api-access-z2mf6\") pod \"node-resolver-56xnh\" (UID: \"3533f633-4984-403c-9826-8812fe861cca\") " pod="openshift-dns/node-resolver-56xnh" Apr 20 22:23:48.636186 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635536 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-run\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" Apr 20 22:23:48.636186 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635570 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-host-run-k8s-cni-cncf-io\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.636186 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635614 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-run\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" Apr 20 22:23:48.636186 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635626 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-env-overrides\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.636186 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635616 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0fb278ab-931f-4221-bed3-819801bca936-etc-modprobe-d\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" Apr 20 22:23:48.636186 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635647 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-ovn-node-metrics-cert\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.636186 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635664 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/764d33cb-2655-4fbe-a337-a91cf3ac5633-socket-dir\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565" Apr 20 22:23:48.636670 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.635789 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/764d33cb-2655-4fbe-a337-a91cf3ac5633-socket-dir\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565" Apr 20 22:23:48.636670 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.636402 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0fb278ab-931f-4221-bed3-819801bca936-tmp\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" Apr 20 22:23:48.636670 ip-10-0-130-91 
kubenswrapper[2573]: I0420 22:23:48.636490 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0fb278ab-931f-4221-bed3-819801bca936-etc-tuned\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" Apr 20 22:23:48.644636 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.644482 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x79qp\" (UniqueName: \"kubernetes.io/projected/0e9f5856-134e-449f-afaf-3ab93b2577c2-kube-api-access-x79qp\") pod \"iptables-alerter-r76sg\" (UID: \"0e9f5856-134e-449f-afaf-3ab93b2577c2\") " pod="openshift-network-operator/iptables-alerter-r76sg" Apr 20 22:23:48.644726 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.644514 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhdps\" (UniqueName: \"kubernetes.io/projected/0fb278ab-931f-4221-bed3-819801bca936-kube-api-access-mhdps\") pod \"tuned-hcxd5\" (UID: \"0fb278ab-931f-4221-bed3-819801bca936\") " pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" Apr 20 22:23:48.644726 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.644520 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg6l5\" (UniqueName: \"kubernetes.io/projected/764d33cb-2655-4fbe-a337-a91cf3ac5633-kube-api-access-wg6l5\") pod \"aws-ebs-csi-driver-node-4f565\" (UID: \"764d33cb-2655-4fbe-a337-a91cf3ac5633\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565" Apr 20 22:23:48.644794 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.644737 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ghcl\" (UniqueName: \"kubernetes.io/projected/1d341b24-8cdf-4d59-a97c-54cecc195860-kube-api-access-5ghcl\") pod \"node-ca-m4npw\" (UID: \"1d341b24-8cdf-4d59-a97c-54cecc195860\") " pod="openshift-image-registry/node-ca-m4npw" Apr 20 22:23:48.736310 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.736287 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-multus-socket-dir-parent\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.736397 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.736316 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp" Apr 20 22:23:48.736397 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.736371 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-multus-socket-dir-parent\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.736467 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.736412 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svn47\" (UniqueName: \"kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47\") pod 
\"network-check-target-2wb2c\" (UID: \"601775e4-554d-4221-907f-4a5d646c32e4\") " pod="openshift-network-diagnostics/network-check-target-2wb2c" Apr 20 22:23:48.736467 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.736433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-cni-binary-copy\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp" Apr 20 22:23:48.736467 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.736448 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/42a32769-748f-43a7-95c5-8aea7b36621e-agent-certs\") pod \"konnectivity-agent-n2bhm\" (UID: \"42a32769-748f-43a7-95c5-8aea7b36621e\") " pod="kube-system/konnectivity-agent-n2bhm" Apr 20 22:23:48.736563 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.736464 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs\") pod \"network-metrics-daemon-rl87j\" (UID: \"75df7794-7926-4023-a9fe-c8bb08e18219\") " pod="openshift-multus/network-metrics-daemon-rl87j" Apr 20 22:23:48.736563 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.736488 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-run-netns\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.736654 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.736578 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svw9p\" (UniqueName: \"kubernetes.io/projected/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-kube-api-access-svw9p\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.736654 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:48.736578 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:23:48.736742 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:48.736689 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs podName:75df7794-7926-4023-a9fe-c8bb08e18219 nodeName:}" failed. No retries permitted until 2026-04-20 22:23:49.236665997 +0000 UTC m=+2.189620595 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs") pod "network-metrics-daemon-rl87j" (UID: "75df7794-7926-4023-a9fe-c8bb08e18219") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:23:48.736742 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.736718 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pm8qv\" (UniqueName: \"kubernetes.io/projected/5fe8d086-787c-4e3e-ac72-53b9ac48d390-kube-api-access-pm8qv\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.736876 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.736761 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3533f633-4984-403c-9826-8812fe861cca-tmp-dir\") pod \"node-resolver-56xnh\" (UID: \"3533f633-4984-403c-9826-8812fe861cca\") " pod="openshift-dns/node-resolver-56xnh" Apr 20 22:23:48.736876 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.736726 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-run-netns\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.736876 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.736791 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jwkr\" (UniqueName: \"kubernetes.io/projected/75df7794-7926-4023-a9fe-c8bb08e18219-kube-api-access-9jwkr\") pod \"network-metrics-daemon-rl87j\" (UID: \"75df7794-7926-4023-a9fe-c8bb08e18219\") " pod="openshift-multus/network-metrics-daemon-rl87j" Apr 20 22:23:48.736876 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.736779 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp" Apr 20 22:23:48.737119 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.736888 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-etc-openvswitch\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.737119 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.736927 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp" Apr 20 22:23:48.737119 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.736930 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-etc-openvswitch\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.737119 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.736965 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-cnibin\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.737119 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.736992 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-run-systemd\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.737119 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737017 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-ovnkube-config\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.737119 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737041 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-slash\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.737119 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737048 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-cni-binary-copy\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp" Apr 20 22:23:48.737119 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737066 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-ovnkube-script-lib\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.737119 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737069 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp" Apr 20 22:23:48.737119 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737077 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3533f633-4984-403c-9826-8812fe861cca-tmp-dir\") pod \"node-resolver-56xnh\" (UID: \"3533f633-4984-403c-9826-8812fe861cca\") " pod="openshift-dns/node-resolver-56xnh" Apr 20 22:23:48.737119 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737069 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-run-systemd\") pod \"ovnkube-node-2trzq\" (UID: 
\"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.737119 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737102 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-run-openvswitch\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.737119 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737067 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-cnibin\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.737765 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737143 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-run-ovn-kubernetes\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.737765 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737107 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-slash\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.737765 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737162 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-run-openvswitch\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.737765 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737174 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-cni-netd\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.737765 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737203 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-run-ovn-kubernetes\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.737765 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737216 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dg7r\" (UniqueName: \"kubernetes.io/projected/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-kube-api-access-6dg7r\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp" Apr 20 22:23:48.737765 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737205 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-cni-netd\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.737765 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737246 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-host-var-lib-cni-bin\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.737765 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737270 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-host-var-lib-cni-multus\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.737765 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737305 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-hostroot\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.737765 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737316 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-host-var-lib-cni-bin\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.737765 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737324 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3533f633-4984-403c-9826-8812fe861cca-hosts-file\") pod \"node-resolver-56xnh\" (UID: \"3533f633-4984-403c-9826-8812fe861cca\") " pod="openshift-dns/node-resolver-56xnh" Apr 20 22:23:48.737765 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737355 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-host-var-lib-cni-multus\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.737765 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737360 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-hostroot\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.737765 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737377 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3533f633-4984-403c-9826-8812fe861cca-hosts-file\") pod \"node-resolver-56xnh\" (UID: \"3533f633-4984-403c-9826-8812fe861cca\") " pod="openshift-dns/node-resolver-56xnh" Apr 20 22:23:48.737765 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737382 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-host-var-lib-kubelet\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.737765 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737404 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-multus-conf-dir\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.737765 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737414 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-host-var-lib-kubelet\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.738376 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737419 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-run-ovn\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.738376 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737436 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-multus-cni-dir\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.738376 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737440 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-multus-conf-dir\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.738376 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737478 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-multus-cni-dir\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.738376 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737477 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-run-ovn\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.738376 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737497 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-etc-kubernetes\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.738376 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737524 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5fe8d086-787c-4e3e-ac72-53b9ac48d390-cni-binary-copy\") pod \"multus-q9f2s\" (UID: 
\"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.738376 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737527 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-etc-kubernetes\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.738376 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737539 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2mf6\" (UniqueName: \"kubernetes.io/projected/3533f633-4984-403c-9826-8812fe861cca-kube-api-access-z2mf6\") pod \"node-resolver-56xnh\" (UID: \"3533f633-4984-403c-9826-8812fe861cca\") " pod="openshift-dns/node-resolver-56xnh" Apr 20 22:23:48.738376 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737555 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-host-run-k8s-cni-cncf-io\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.738376 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737570 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-env-overrides\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.738376 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737599 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-ovn-node-metrics-cert\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.738376 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737623 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-host-run-k8s-cni-cncf-io\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.738376 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737711 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/42a32769-748f-43a7-95c5-8aea7b36621e-konnectivity-ca\") pod \"konnectivity-agent-n2bhm\" (UID: \"42a32769-748f-43a7-95c5-8aea7b36621e\") " pod="kube-system/konnectivity-agent-n2bhm" Apr 20 22:23:48.738376 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737736 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-system-cni-dir\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.738376 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737750 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-host-run-netns\") pod \"multus-q9f2s\" 
(UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.738376 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737766 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5fe8d086-787c-4e3e-ac72-53b9ac48d390-multus-daemon-config\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.738376 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737808 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-system-cni-dir\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.739070 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737841 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-host-run-multus-certs\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.739070 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737892 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-node-log\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.739070 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737921 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-cni-bin\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.739070 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737955 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-os-release\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp" Apr 20 22:23:48.739070 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.737983 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-systemd-units\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.739070 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738017 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-cnibin\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp" Apr 20 22:23:48.739070 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738046 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp" Apr 20 22:23:48.739070 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738078 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-kubelet\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.739070 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738105 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-log-socket\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.739070 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738136 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.739070 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738166 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-system-cni-dir\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp" Apr 20 22:23:48.739070 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738196 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-os-release\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.739070 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738249 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-var-lib-openvswitch\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.739070 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738259 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5fe8d086-787c-4e3e-ac72-53b9ac48d390-multus-daemon-config\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.739070 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738272 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-ovnkube-script-lib\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.739070 ip-10-0-130-91 kubenswrapper[2573]: I0420 
22:23:48.738325 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.739070 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738338 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-log-socket\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.739618 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738341 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-host-run-netns\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.739618 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738369 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-ovnkube-config\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.739618 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738376 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-kubelet\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.739618 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738382 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-os-release\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.739618 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738398 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-var-lib-openvswitch\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.739618 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738410 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-system-cni-dir\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp" Apr 20 22:23:48.739618 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738416 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-systemd-units\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.739618 ip-10-0-130-91 
kubenswrapper[2573]: I0420 22:23:48.738488 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-host-cni-bin\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.739618 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738524 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5fe8d086-787c-4e3e-ac72-53b9ac48d390-host-run-multus-certs\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.739618 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738534 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-node-log\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.739618 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738558 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-cnibin\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp" Apr 20 22:23:48.739618 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738589 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-os-release\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp" Apr 20 22:23:48.739618 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738613 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5fe8d086-787c-4e3e-ac72-53b9ac48d390-cni-binary-copy\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s" Apr 20 22:23:48.739618 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738883 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-env-overrides\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.739618 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738909 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp" Apr 20 22:23:48.739618 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.738989 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/42a32769-748f-43a7-95c5-8aea7b36621e-konnectivity-ca\") pod \"konnectivity-agent-n2bhm\" (UID: \"42a32769-748f-43a7-95c5-8aea7b36621e\") " pod="kube-system/konnectivity-agent-n2bhm" Apr 20 
22:23:48.739618 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.739231 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/42a32769-748f-43a7-95c5-8aea7b36621e-agent-certs\") pod \"konnectivity-agent-n2bhm\" (UID: \"42a32769-748f-43a7-95c5-8aea7b36621e\") " pod="kube-system/konnectivity-agent-n2bhm" Apr 20 22:23:48.740237 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.740220 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-ovn-node-metrics-cert\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:23:48.743705 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:48.743690 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 22:23:48.743781 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:48.743708 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 22:23:48.743781 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:48.743720 2573 projected.go:194] Error preparing data for projected volume kube-api-access-svn47 for pod openshift-network-diagnostics/network-check-target-2wb2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:23:48.743926 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:48.743786 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47 podName:601775e4-554d-4221-907f-4a5d646c32e4 nodeName:}" failed. No retries permitted until 2026-04-20 22:23:49.243768749 +0000 UTC m=+2.196723349 (durationBeforeRetry 500ms). 
Apr 20 22:23:48.745593 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.745573 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dg7r\" (UniqueName: \"kubernetes.io/projected/d7ecb730-4be8-4cc2-86d1-47a71c9e25e7-kube-api-access-6dg7r\") pod \"multus-additional-cni-plugins-qvjgp\" (UID: \"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7\") " pod="openshift-multus/multus-additional-cni-plugins-qvjgp"
Apr 20 22:23:48.746013 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.745989 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jwkr\" (UniqueName: \"kubernetes.io/projected/75df7794-7926-4023-a9fe-c8bb08e18219-kube-api-access-9jwkr\") pod \"network-metrics-daemon-rl87j\" (UID: \"75df7794-7926-4023-a9fe-c8bb08e18219\") " pod="openshift-multus/network-metrics-daemon-rl87j"
Apr 20 22:23:48.746265 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.746250 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svw9p\" (UniqueName: \"kubernetes.io/projected/bffb7e6c-ecd8-45cd-a238-8bbc21a4553b-kube-api-access-svw9p\") pod \"ovnkube-node-2trzq\" (UID: \"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b\") " pod="openshift-ovn-kubernetes/ovnkube-node-2trzq"
Apr 20 22:23:48.746704 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.746689 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm8qv\" (UniqueName: \"kubernetes.io/projected/5fe8d086-787c-4e3e-ac72-53b9ac48d390-kube-api-access-pm8qv\") pod \"multus-q9f2s\" (UID: \"5fe8d086-787c-4e3e-ac72-53b9ac48d390\") " pod="openshift-multus/multus-q9f2s"
Apr 20 22:23:48.746744 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.746712 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2mf6\" (UniqueName: \"kubernetes.io/projected/3533f633-4984-403c-9826-8812fe861cca-kube-api-access-z2mf6\") pod \"node-resolver-56xnh\" (UID: \"3533f633-4984-403c-9826-8812fe861cca\") " pod="openshift-dns/node-resolver-56xnh"
Apr 20 22:23:48.850420 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.850372 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-hcxd5"
Apr 20 22:23:48.855451 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.855435 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-m4npw"
Apr 20 22:23:48.855790 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:48.855761 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fb278ab_931f_4221_bed3_819801bca936.slice/crio-a21e7180b0e694379358fe2ccb2cf6575b8dbda61e0a20627d61e5765f8cd103 WatchSource:0}: Error finding container a21e7180b0e694379358fe2ccb2cf6575b8dbda61e0a20627d61e5765f8cd103: Status 404 returned error can't find the container with id a21e7180b0e694379358fe2ccb2cf6575b8dbda61e0a20627d61e5765f8cd103
Apr 20 22:23:48.861765 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:48.861747 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d341b24_8cdf_4d59_a97c_54cecc195860.slice/crio-7483218e9eddf8e4c993eb1666d5232a053c68d4ebbf2a6ea7d9e6910f414a6b WatchSource:0}: Error finding container 7483218e9eddf8e4c993eb1666d5232a053c68d4ebbf2a6ea7d9e6910f414a6b: Status 404 returned error can't find the container with id 7483218e9eddf8e4c993eb1666d5232a053c68d4ebbf2a6ea7d9e6910f414a6b
Apr 20 22:23:48.868823 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.868807 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 22:23:48.879134 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.879118 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-r76sg"
Apr 20 22:23:48.884846 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:48.884826 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e9f5856_134e_449f_afaf_3ab93b2577c2.slice/crio-65e9134666ad1f7bc2e3ffee450a950eae246efb9c88da64c73ce32c0e9d9608 WatchSource:0}: Error finding container 65e9134666ad1f7bc2e3ffee450a950eae246efb9c88da64c73ce32c0e9d9608: Status 404 returned error can't find the container with id 65e9134666ad1f7bc2e3ffee450a950eae246efb9c88da64c73ce32c0e9d9608
Apr 20 22:23:48.886862 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.886832 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 22:23:48.898148 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.898133 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565"
Apr 20 22:23:48.903399 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.903381 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-56xnh"
Apr 20 22:23:48.903641 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:48.903619 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod764d33cb_2655_4fbe_a337_a91cf3ac5633.slice/crio-8d19307e46b120434d3b31463d6b3e40a8904bd693e8c7e61685ef0c10cbb245 WatchSource:0}: Error finding container 8d19307e46b120434d3b31463d6b3e40a8904bd693e8c7e61685ef0c10cbb245: Status 404 returned error can't find the container with id 8d19307e46b120434d3b31463d6b3e40a8904bd693e8c7e61685ef0c10cbb245
Apr 20 22:23:48.909693 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.909670 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qvjgp"
Apr 20 22:23:48.909998 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:48.909979 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3533f633_4984_403c_9826_8812fe861cca.slice/crio-e452bf0dc3bd9b3867189694f8e78fa2d7fbe707d8d79c88eec30defc544d38f WatchSource:0}: Error finding container e452bf0dc3bd9b3867189694f8e78fa2d7fbe707d8d79c88eec30defc544d38f: Status 404 returned error can't find the container with id e452bf0dc3bd9b3867189694f8e78fa2d7fbe707d8d79c88eec30defc544d38f
Apr 20 22:23:48.915686 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.915670 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2trzq"
Apr 20 22:23:48.916645 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:48.916543 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7ecb730_4be8_4cc2_86d1_47a71c9e25e7.slice/crio-a026de35cfb0a70add60e1af6dfa7d210064d79069f7e6fe531d67bc1d628284 WatchSource:0}: Error finding container a026de35cfb0a70add60e1af6dfa7d210064d79069f7e6fe531d67bc1d628284: Status 404 returned error can't find the container with id a026de35cfb0a70add60e1af6dfa7d210064d79069f7e6fe531d67bc1d628284
Apr 20 22:23:48.921434 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.921420 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-n2bhm"
Apr 20 22:23:48.922092 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:48.921921 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbffb7e6c_ecd8_45cd_a238_8bbc21a4553b.slice/crio-9fc0861a42c00b41b4edfaf7b8c30e872a4dc5a958e839821f1089193eedb161 WatchSource:0}: Error finding container 9fc0861a42c00b41b4edfaf7b8c30e872a4dc5a958e839821f1089193eedb161: Status 404 returned error can't find the container with id 9fc0861a42c00b41b4edfaf7b8c30e872a4dc5a958e839821f1089193eedb161
Apr 20 22:23:48.925693 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:48.925677 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-q9f2s"
Apr 20 22:23:48.926767 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:48.926750 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42a32769_748f_43a7_95c5_8aea7b36621e.slice/crio-3e8f6162ec0cacfa4ab9ff6281b75d8ec5f3b3d2989f87c5706a852c5e32513a WatchSource:0}: Error finding container 3e8f6162ec0cacfa4ab9ff6281b75d8ec5f3b3d2989f87c5706a852c5e32513a: Status 404 returned error can't find the container with id 3e8f6162ec0cacfa4ab9ff6281b75d8ec5f3b3d2989f87c5706a852c5e32513a
Apr 20 22:23:48.931129 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:23:48.931107 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe8d086_787c_4e3e_ac72_53b9ac48d390.slice/crio-7bf1c2064e7df05c8ff467b8cdde9b9de9e50f6b8a92dd3b632d0d3d71674726 WatchSource:0}: Error finding container 7bf1c2064e7df05c8ff467b8cdde9b9de9e50f6b8a92dd3b632d0d3d71674726: Status 404 returned error can't find the container with id 7bf1c2064e7df05c8ff467b8cdde9b9de9e50f6b8a92dd3b632d0d3d71674726
Apr 20 22:23:49.241885 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:49.241781 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs\") pod \"network-metrics-daemon-rl87j\" (UID: \"75df7794-7926-4023-a9fe-c8bb08e18219\") " pod="openshift-multus/network-metrics-daemon-rl87j"
Apr 20 22:23:49.242031 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:49.241946 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:23:49.242031 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:49.242006 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs podName:75df7794-7926-4023-a9fe-c8bb08e18219 nodeName:}" failed. No retries permitted until 2026-04-20 22:23:50.241987961 +0000 UTC m=+3.194942562 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs") pod "network-metrics-daemon-rl87j" (UID: "75df7794-7926-4023-a9fe-c8bb08e18219") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:23:49.342863 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:49.342807 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svn47\" (UniqueName: \"kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47\") pod \"network-check-target-2wb2c\" (UID: \"601775e4-554d-4221-907f-4a5d646c32e4\") " pod="openshift-network-diagnostics/network-check-target-2wb2c"
Apr 20 22:23:49.343054 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:49.343034 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 22:23:49.343127 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:49.343062 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 22:23:49.343127 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:49.343076 2573 projected.go:194] Error preparing data for projected volume kube-api-access-svn47 for pod openshift-network-diagnostics/network-check-target-2wb2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 22:23:49.343232 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:49.343131 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47 podName:601775e4-554d-4221-907f-4a5d646c32e4 nodeName:}" failed. No retries permitted until 2026-04-20 22:23:50.343114209 +0000 UTC m=+3.296068810 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-svn47" (UniqueName: "kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47") pod "network-check-target-2wb2c" (UID: "601775e4-554d-4221-907f-4a5d646c32e4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 22:23:49.572046 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:49.571945 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 22:18:48 +0000 UTC" deadline="2028-01-04 09:37:04.378338956 +0000 UTC"
Apr 20 22:23:49.572046 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:49.571997 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14963h13m14.806346526s"
Apr 20 22:23:49.637275 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:49.637233 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565" event={"ID":"764d33cb-2655-4fbe-a337-a91cf3ac5633","Type":"ContainerStarted","Data":"8d19307e46b120434d3b31463d6b3e40a8904bd693e8c7e61685ef0c10cbb245"}
Apr 20 22:23:49.648493 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:49.648455 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" event={"ID":"0fb278ab-931f-4221-bed3-819801bca936","Type":"ContainerStarted","Data":"a21e7180b0e694379358fe2ccb2cf6575b8dbda61e0a20627d61e5765f8cd103"}
Apr 20 22:23:49.657757 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:49.657698 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvjgp" event={"ID":"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7","Type":"ContainerStarted","Data":"a026de35cfb0a70add60e1af6dfa7d210064d79069f7e6fe531d67bc1d628284"}
Apr 20 22:23:49.660208 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:49.660176 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-r76sg" event={"ID":"0e9f5856-134e-449f-afaf-3ab93b2577c2","Type":"ContainerStarted","Data":"65e9134666ad1f7bc2e3ffee450a950eae246efb9c88da64c73ce32c0e9d9608"}
Apr 20 22:23:49.671529 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:49.671496 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-m4npw" event={"ID":"1d341b24-8cdf-4d59-a97c-54cecc195860","Type":"ContainerStarted","Data":"7483218e9eddf8e4c993eb1666d5232a053c68d4ebbf2a6ea7d9e6910f414a6b"}
Apr 20 22:23:49.683426 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:49.683343 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q9f2s" event={"ID":"5fe8d086-787c-4e3e-ac72-53b9ac48d390","Type":"ContainerStarted","Data":"7bf1c2064e7df05c8ff467b8cdde9b9de9e50f6b8a92dd3b632d0d3d71674726"}
Apr 20 22:23:49.697955 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:49.697887 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-n2bhm" event={"ID":"42a32769-748f-43a7-95c5-8aea7b36621e","Type":"ContainerStarted","Data":"3e8f6162ec0cacfa4ab9ff6281b75d8ec5f3b3d2989f87c5706a852c5e32513a"}
Apr 20 22:23:49.706557 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:49.706458 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" event={"ID":"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b","Type":"ContainerStarted","Data":"9fc0861a42c00b41b4edfaf7b8c30e872a4dc5a958e839821f1089193eedb161"}
Apr 20 22:23:49.726865 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:49.726776 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-56xnh" event={"ID":"3533f633-4984-403c-9826-8812fe861cca","Type":"ContainerStarted","Data":"e452bf0dc3bd9b3867189694f8e78fa2d7fbe707d8d79c88eec30defc544d38f"}
Apr 20 22:23:49.761458 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:49.760669 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 22:23:50.254847 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:50.254775 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs\") pod \"network-metrics-daemon-rl87j\" (UID: \"75df7794-7926-4023-a9fe-c8bb08e18219\") " pod="openshift-multus/network-metrics-daemon-rl87j"
Apr 20 22:23:50.255049 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:50.254956 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:23:50.255049 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:50.255026 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs podName:75df7794-7926-4023-a9fe-c8bb08e18219 nodeName:}" failed. No retries permitted until 2026-04-20 22:23:52.255006087 +0000 UTC m=+5.207960690 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs") pod "network-metrics-daemon-rl87j" (UID: "75df7794-7926-4023-a9fe-c8bb08e18219") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:23:50.355421 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:50.355383 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svn47\" (UniqueName: \"kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47\") pod \"network-check-target-2wb2c\" (UID: \"601775e4-554d-4221-907f-4a5d646c32e4\") " pod="openshift-network-diagnostics/network-check-target-2wb2c"
Apr 20 22:23:50.355586 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:50.355571 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 22:23:50.355663 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:50.355591 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 22:23:50.355663 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:50.355604 2573 projected.go:194] Error preparing data for projected volume kube-api-access-svn47 for pod openshift-network-diagnostics/network-check-target-2wb2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 22:23:50.355760 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:50.355663 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47 podName:601775e4-554d-4221-907f-4a5d646c32e4 nodeName:}" failed. No retries permitted until 2026-04-20 22:23:52.355643843 +0000 UTC m=+5.308598443 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-svn47" (UniqueName: "kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47") pod "network-check-target-2wb2c" (UID: "601775e4-554d-4221-907f-4a5d646c32e4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 22:23:50.572896 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:50.572747 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 22:18:48 +0000 UTC" deadline="2027-10-10 23:53:34.235352034 +0000 UTC"
Apr 20 22:23:50.572896 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:50.572790 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12913h29m43.662565608s"
Apr 20 22:23:50.596458 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:50.596394 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wb2c"
Apr 20 22:23:50.596621 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:50.596519 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wb2c" podUID="601775e4-554d-4221-907f-4a5d646c32e4"
Apr 20 22:23:50.596999 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:50.596977 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rl87j"
Apr 20 22:23:50.597111 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:50.597090 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rl87j" podUID="75df7794-7926-4023-a9fe-c8bb08e18219"
Apr 20 22:23:51.508298 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:51.508261 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 22:23:52.275812 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:52.275770 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs\") pod \"network-metrics-daemon-rl87j\" (UID: \"75df7794-7926-4023-a9fe-c8bb08e18219\") " pod="openshift-multus/network-metrics-daemon-rl87j"
Apr 20 22:23:52.276341 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:52.275953 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:23:52.276341 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:52.276032 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs podName:75df7794-7926-4023-a9fe-c8bb08e18219 nodeName:}" failed. No retries permitted until 2026-04-20 22:23:56.276010614 +0000 UTC m=+9.228965214 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs") pod "network-metrics-daemon-rl87j" (UID: "75df7794-7926-4023-a9fe-c8bb08e18219") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:23:52.377162 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:52.377121 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svn47\" (UniqueName: \"kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47\") pod \"network-check-target-2wb2c\" (UID: \"601775e4-554d-4221-907f-4a5d646c32e4\") " pod="openshift-network-diagnostics/network-check-target-2wb2c"
Apr 20 22:23:52.377341 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:52.377279 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 22:23:52.377341 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:52.377304 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 22:23:52.377341 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:52.377318 2573 projected.go:194] Error preparing data for projected volume kube-api-access-svn47 for pod openshift-network-diagnostics/network-check-target-2wb2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 22:23:52.377494 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:52.377386 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47 podName:601775e4-554d-4221-907f-4a5d646c32e4 nodeName:}" failed. No retries permitted until 2026-04-20 22:23:56.377365976 +0000 UTC m=+9.330320577 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-svn47" (UniqueName: "kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47") pod "network-check-target-2wb2c" (UID: "601775e4-554d-4221-907f-4a5d646c32e4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 22:23:52.597142 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:52.596595 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wb2c"
Apr 20 22:23:52.597142 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:52.596719 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wb2c" podUID="601775e4-554d-4221-907f-4a5d646c32e4"
Apr 20 22:23:52.597142 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:52.597079 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rl87j"
Apr 20 22:23:52.597428 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:52.597191 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rl87j" podUID="75df7794-7926-4023-a9fe-c8bb08e18219"
Apr 20 22:23:54.596202 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:54.596162 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wb2c"
Apr 20 22:23:54.596643 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:54.596304 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wb2c" podUID="601775e4-554d-4221-907f-4a5d646c32e4"
Apr 20 22:23:54.596643 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:54.596360 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rl87j"
Apr 20 22:23:54.596643 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:54.596489 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rl87j" podUID="75df7794-7926-4023-a9fe-c8bb08e18219"
Apr 20 22:23:56.310103 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:56.310033 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs\") pod \"network-metrics-daemon-rl87j\" (UID: \"75df7794-7926-4023-a9fe-c8bb08e18219\") " pod="openshift-multus/network-metrics-daemon-rl87j"
Apr 20 22:23:56.310568 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:56.310241 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:23:56.310568 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:56.310313 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs podName:75df7794-7926-4023-a9fe-c8bb08e18219 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:04.310291256 +0000 UTC m=+17.263245875 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs") pod "network-metrics-daemon-rl87j" (UID: "75df7794-7926-4023-a9fe-c8bb08e18219") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:23:56.411349 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:56.411309 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svn47\" (UniqueName: \"kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47\") pod \"network-check-target-2wb2c\" (UID: \"601775e4-554d-4221-907f-4a5d646c32e4\") " pod="openshift-network-diagnostics/network-check-target-2wb2c"
Apr 20 22:23:56.411893 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:56.411513 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 22:23:56.411893 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:56.411537 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 22:23:56.411893 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:56.411550 2573 projected.go:194] Error preparing data for projected volume kube-api-access-svn47 for pod openshift-network-diagnostics/network-check-target-2wb2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 22:23:56.411893 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:56.411612 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47 podName:601775e4-554d-4221-907f-4a5d646c32e4 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:04.411593442 +0000 UTC m=+17.364548055 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-svn47" (UniqueName: "kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47") pod "network-check-target-2wb2c" (UID: "601775e4-554d-4221-907f-4a5d646c32e4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 22:23:56.597104 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:56.596930 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rl87j"
Apr 20 22:23:56.597104 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:56.596953 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wb2c"
Apr 20 22:23:56.597104 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:56.597077 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rl87j" podUID="75df7794-7926-4023-a9fe-c8bb08e18219"
Apr 20 22:23:56.597355 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:56.597217 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wb2c" podUID="601775e4-554d-4221-907f-4a5d646c32e4"
Apr 20 22:23:58.595937 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:58.595902 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wb2c"
Apr 20 22:23:58.596381 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:58.596017 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wb2c" podUID="601775e4-554d-4221-907f-4a5d646c32e4"
Apr 20 22:23:58.596381 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:23:58.596069 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rl87j"
Apr 20 22:23:58.596381 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:23:58.596170 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rl87j" podUID="75df7794-7926-4023-a9fe-c8bb08e18219"
Apr 20 22:24:00.596373 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:00.596332 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rl87j"
Apr 20 22:24:00.596833 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:00.596481 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rl87j" podUID="75df7794-7926-4023-a9fe-c8bb08e18219"
Apr 20 22:24:00.596833 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:00.596524 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wb2c"
Apr 20 22:24:00.596833 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:00.596650 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wb2c" podUID="601775e4-554d-4221-907f-4a5d646c32e4"
Apr 20 22:24:02.596600 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:02.596567 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wb2c"
Apr 20 22:24:02.596600 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:02.596585 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rl87j"
Apr 20 22:24:02.597124 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:02.596717 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wb2c" podUID="601775e4-554d-4221-907f-4a5d646c32e4"
Apr 20 22:24:02.597124 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:02.596867 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rl87j" podUID="75df7794-7926-4023-a9fe-c8bb08e18219"
Apr 20 22:24:04.369155 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:04.369114 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs\") pod \"network-metrics-daemon-rl87j\" (UID: \"75df7794-7926-4023-a9fe-c8bb08e18219\") " pod="openshift-multus/network-metrics-daemon-rl87j"
Apr 20 22:24:04.369674 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:04.369285 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:24:04.369674 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:04.369366 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs podName:75df7794-7926-4023-a9fe-c8bb08e18219 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:20.369343091 +0000 UTC m=+33.322297691 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs") pod "network-metrics-daemon-rl87j" (UID: "75df7794-7926-4023-a9fe-c8bb08e18219") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 22:24:04.469674 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:04.469642 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svn47\" (UniqueName: \"kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47\") pod \"network-check-target-2wb2c\" (UID: \"601775e4-554d-4221-907f-4a5d646c32e4\") " pod="openshift-network-diagnostics/network-check-target-2wb2c"
Apr 20 22:24:04.469831 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:04.469817 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 22:24:04.469925 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:04.469840 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 22:24:04.469925 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:04.469868 2573 projected.go:194] Error preparing data for projected volume kube-api-access-svn47 for pod openshift-network-diagnostics/network-check-target-2wb2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 22:24:04.470015 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:04.469932 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47 podName:601775e4-554d-4221-907f-4a5d646c32e4 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:20.469912955 +0000 UTC m=+33.422867554 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-svn47" (UniqueName: "kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47") pod "network-check-target-2wb2c" (UID: "601775e4-554d-4221-907f-4a5d646c32e4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 22:24:04.596735 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:04.596700 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wb2c"
Apr 20 22:24:04.596919 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:04.596707 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rl87j"
Apr 20 22:24:04.596919 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:04.596814 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wb2c" podUID="601775e4-554d-4221-907f-4a5d646c32e4"
Apr 20 22:24:04.597024 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:04.596923 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rl87j" podUID="75df7794-7926-4023-a9fe-c8bb08e18219"
Apr 20 22:24:06.596487 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:06.596453 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rl87j"
Apr 20 22:24:06.596957 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:06.596491 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wb2c"
Apr 20 22:24:06.596957 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:06.596580 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rl87j" podUID="75df7794-7926-4023-a9fe-c8bb08e18219"
Apr 20 22:24:06.596957 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:06.596706 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wb2c" podUID="601775e4-554d-4221-907f-4a5d646c32e4"
Apr 20 22:24:07.769580 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:07.769275 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" event={"ID":"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b","Type":"ContainerStarted","Data":"15e2bfcdd43d9876018e20e6b6a5850119148487e70e8a87222b74a69ab255cd"}
Apr 20 22:24:07.770272 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:07.769599 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" event={"ID":"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b","Type":"ContainerStarted","Data":"b927d5afe18fe29e45186ea4047720ba223439e73e03d5b743499a99452f7352"}
Apr 20 22:24:07.770899 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:07.770845 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" event={"ID":"0fb278ab-931f-4221-bed3-819801bca936","Type":"ContainerStarted","Data":"4ee94cfc01c2f72df65059139254d1497304efc557cf6ba1da1fbcff6d64f1a0"}
Apr 20 22:24:07.772361 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:07.772332 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-91.ec2.internal" event={"ID":"0d7985ee616f47b485ff61a5cec01dca","Type":"ContainerStarted","Data":"54dca0c895ac63e01b502096cc2136fd9e9da0c94893fc14d402c10c60af0939"}
Apr 20 22:24:07.772484 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:07.772467 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-91.ec2.internal"
Apr 20 22:24:07.773791 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:07.773769 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-q9f2s" event={"ID":"5fe8d086-787c-4e3e-ac72-53b9ac48d390","Type":"ContainerStarted","Data":"4e353acae2881383fa355a51f714913336565b33307340e9b586cdb1ed0b0898"}
Apr 20 22:24:07.783978 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:07.783957 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 22:24:07.784766 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:07.784724 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-130-91.ec2.internal"]
Apr 20 22:24:07.788533 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:07.787310 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-hcxd5" podStartSLOduration=2.7510905770000003 podStartE2EDuration="20.787293037s" podCreationTimestamp="2026-04-20 22:23:47 +0000 UTC" firstStartedPulling="2026-04-20 22:23:48.858057189 +0000 UTC m=+1.811011791" lastFinishedPulling="2026-04-20 22:24:06.89425965 +0000 UTC m=+19.847214251" observedRunningTime="2026-04-20 22:24:07.786501645 +0000 UTC m=+20.739456267" watchObservedRunningTime="2026-04-20 22:24:07.787293037 +0000 UTC m=+20.740247661"
Apr 20 22:24:07.798566 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:07.798521 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-91.ec2.internal" podStartSLOduration=0.798506759 podStartE2EDuration="798.506759ms" podCreationTimestamp="2026-04-20 22:24:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:24:07.797799666 +0000 UTC m=+20.750754286" watchObservedRunningTime="2026-04-20 22:24:07.798506759 +0000 UTC m=+20.751461379"
Apr 20 22:24:07.817216 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:07.817153 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-q9f2s" podStartSLOduration=1.824142508 podStartE2EDuration="19.817135371s" podCreationTimestamp="2026-04-20 22:23:48 +0000 UTC" firstStartedPulling="2026-04-20 22:23:48.932673461 +0000 UTC m=+1.885628072" lastFinishedPulling="2026-04-20 22:24:06.925666324 +0000 UTC m=+19.878620935" observedRunningTime="2026-04-20 22:24:07.816733287 +0000 UTC m=+20.769687938" watchObservedRunningTime="2026-04-20 22:24:07.817135371 +0000 UTC m=+20.770089993"
Apr 20 22:24:08.596051 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.596020 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rl87j"
Apr 20 22:24:08.596217 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.596020 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wb2c"
Apr 20 22:24:08.596217 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:08.596152 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rl87j" podUID="75df7794-7926-4023-a9fe-c8bb08e18219"
Apr 20 22:24:08.596217 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:08.596186 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wb2c" podUID="601775e4-554d-4221-907f-4a5d646c32e4"
Apr 20 22:24:08.777022 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.776984 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-m4npw" event={"ID":"1d341b24-8cdf-4d59-a97c-54cecc195860","Type":"ContainerStarted","Data":"b8980fdbe76c4e7111895bda6d837dec2bf41895aaa27141b928374ff81e2caf"}
Apr 20 22:24:08.778576 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.778545 2573 generic.go:358] "Generic (PLEG): container finished" podID="17685fe779018045b34417c0aceeb6de" containerID="50a2df20dfdbc37924cc4ea53511a586fbcbc8b7bcf47e1c6e2f1f42da8514c0" exitCode=0
Apr 20 22:24:08.778693 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.778578 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal" event={"ID":"17685fe779018045b34417c0aceeb6de","Type":"ContainerDied","Data":"50a2df20dfdbc37924cc4ea53511a586fbcbc8b7bcf47e1c6e2f1f42da8514c0"}
Apr 20 22:24:08.780085 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.780061 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-n2bhm" event={"ID":"42a32769-748f-43a7-95c5-8aea7b36621e","Type":"ContainerStarted","Data":"1f2633eb0ee4b6187f175d2eb452a7f31fd5355ddd45742ed05fc6ab4158143e"}
Apr 20 22:24:08.782947 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.782923 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" event={"ID":"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b","Type":"ContainerStarted","Data":"f3c49e8cfba0def9dad8beded3f97ad77c0f822ed3268652f58e24f749ad4fce"}
Apr 20 22:24:08.783038 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.782950 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" event={"ID":"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b","Type":"ContainerStarted","Data":"e77a443aea4cdbb4fa28995c11ac3f64c5e96d5f479da7f839a947cf25244d8c"}
Apr 20 22:24:08.783038 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.782964 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" event={"ID":"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b","Type":"ContainerStarted","Data":"2e4c282a22fc9924337d0d9620311c00ad3565e743d8eca68834bf3cbe0b5c5d"}
Apr 20 22:24:08.783038 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.782975 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" event={"ID":"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b","Type":"ContainerStarted","Data":"dc74853841335b2b7a6453624f3b52daef49e0fb921be772c773da6b6bdf0107"}
Apr 20 22:24:08.784193 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.784168 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-56xnh" event={"ID":"3533f633-4984-403c-9826-8812fe861cca","Type":"ContainerStarted","Data":"f635a00539d0a2e19071ca0fe281a5a171c4f680482467169b1c7da379ab9cfe"}
Apr 20 22:24:08.785481 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.785460 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565" event={"ID":"764d33cb-2655-4fbe-a337-a91cf3ac5633","Type":"ContainerStarted","Data":"2df76ba34324de99e0320a4d716172f68d884100963f1848214d00dea2f1cc18"}
Apr 20 22:24:08.788743 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.788713 2573 generic.go:358]
"Generic (PLEG): container finished" podID="d7ecb730-4be8-4cc2-86d1-47a71c9e25e7" containerID="c4f201a04b8f0e770145c42382fbe1682ccc4a64a2ce63ddb132ee8c40ddee29" exitCode=0 Apr 20 22:24:08.788839 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.788780 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvjgp" event={"ID":"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7","Type":"ContainerDied","Data":"c4f201a04b8f0e770145c42382fbe1682ccc4a64a2ce63ddb132ee8c40ddee29"} Apr 20 22:24:08.790514 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.790471 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-r76sg" event={"ID":"0e9f5856-134e-449f-afaf-3ab93b2577c2","Type":"ContainerStarted","Data":"896391bdf41cf3390f35ab1418a43d81f07f00dc233f57267280f03cddd25dfe"} Apr 20 22:24:08.790692 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.790665 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-91.ec2.internal" Apr 20 22:24:08.801675 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.801653 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 22:24:08.801780 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:08.801704 2573 kubelet.go:3342] "Failed creating a mirror pod" err="pods \"kube-apiserver-proxy-ip-10-0-130-91.ec2.internal\" already exists" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-91.ec2.internal" Apr 20 22:24:08.801971 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.801931 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-m4npw" podStartSLOduration=3.788393655 podStartE2EDuration="21.801919606s" podCreationTimestamp="2026-04-20 22:23:47 +0000 UTC" firstStartedPulling="2026-04-20 22:23:48.863148745 +0000 UTC m=+1.816103342" lastFinishedPulling="2026-04-20 22:24:06.876674689 +0000 UTC m=+19.829629293" observedRunningTime="2026-04-20 22:24:08.788821668 +0000 UTC m=+21.741776289" watchObservedRunningTime="2026-04-20 22:24:08.801919606 +0000 UTC m=+21.754874224" Apr 20 22:24:08.813564 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.813526 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-n2bhm" podStartSLOduration=2.851849459 podStartE2EDuration="20.813513581s" podCreationTimestamp="2026-04-20 22:23:48 +0000 UTC" firstStartedPulling="2026-04-20 22:23:48.928356125 +0000 UTC m=+1.881310727" lastFinishedPulling="2026-04-20 22:24:06.890020248 +0000 UTC m=+19.842974849" observedRunningTime="2026-04-20 22:24:08.813502911 +0000 UTC m=+21.766457529" watchObservedRunningTime="2026-04-20 22:24:08.813513581 +0000 UTC m=+21.766468200" Apr 20 22:24:08.827241 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.827210 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-56xnh" podStartSLOduration=3.847851301 podStartE2EDuration="21.827200215s" podCreationTimestamp="2026-04-20 22:23:47 +0000 UTC" firstStartedPulling="2026-04-20 22:23:48.911413873 +0000 UTC m=+1.864368470" lastFinishedPulling="2026-04-20 22:24:06.890762786 +0000 UTC m=+19.843717384" observedRunningTime="2026-04-20 22:24:08.826834495 +0000 UTC m=+21.779789113" watchObservedRunningTime="2026-04-20 22:24:08.827200215 +0000 UTC m=+21.780154834" Apr 20 22:24:08.853619 ip-10-0-130-91 
kubenswrapper[2573]: I0420 22:24:08.853546 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-r76sg" podStartSLOduration=3.848536514 podStartE2EDuration="21.853532047s" podCreationTimestamp="2026-04-20 22:23:47 +0000 UTC" firstStartedPulling="2026-04-20 22:23:48.88624798 +0000 UTC m=+1.839202577" lastFinishedPulling="2026-04-20 22:24:06.891243478 +0000 UTC m=+19.844198110" observedRunningTime="2026-04-20 22:24:08.853085834 +0000 UTC m=+21.806040453" watchObservedRunningTime="2026-04-20 22:24:08.853532047 +0000 UTC m=+21.806486669" Apr 20 22:24:08.998805 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:08.998632 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 22:24:09.598057 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:09.597889 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T22:24:08.998801925Z","UUID":"4c3c0994-e4a5-40b6-9cce-a74495c1e80c","Handler":null,"Name":"","Endpoint":""} Apr 20 22:24:09.600812 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:09.600785 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 22:24:09.600812 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:09.600818 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 22:24:09.794468 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:09.794424 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565" event={"ID":"764d33cb-2655-4fbe-a337-a91cf3ac5633","Type":"ContainerStarted","Data":"191fa3fd276ad59e4948c4fc79f57f1f80c5a3af95f642bc881dced3318862f3"} Apr 20 22:24:09.796873 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:09.796806 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal" event={"ID":"17685fe779018045b34417c0aceeb6de","Type":"ContainerStarted","Data":"8bf64d44c4a28848d6abfe65b15917cfa87d4caec2c14113a4743711d832a892"} Apr 20 22:24:09.820591 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:09.820545 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-91.ec2.internal" podStartSLOduration=21.820531216 podStartE2EDuration="21.820531216s" podCreationTimestamp="2026-04-20 22:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:24:09.820034559 +0000 UTC m=+22.772989181" watchObservedRunningTime="2026-04-20 22:24:09.820531216 +0000 UTC m=+22.773485837" Apr 20 22:24:10.596225 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:10.596190 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wb2c" Apr 20 22:24:10.596416 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:10.596339 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wb2c" podUID="601775e4-554d-4221-907f-4a5d646c32e4" Apr 20 22:24:10.596490 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:10.596422 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rl87j" Apr 20 22:24:10.596585 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:10.596559 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rl87j" podUID="75df7794-7926-4023-a9fe-c8bb08e18219" Apr 20 22:24:10.800650 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:10.800609 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565" event={"ID":"764d33cb-2655-4fbe-a337-a91cf3ac5633","Type":"ContainerStarted","Data":"6daa3ca7afd9d7860901d65c7e1241331f13a2901841d804fc56eef55548875f"} Apr 20 22:24:10.804025 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:10.803998 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" event={"ID":"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b","Type":"ContainerStarted","Data":"1fb19d32a9e0f1df2857e084bbf6e3124ba73896e335709689d5cf27f41776f5"} Apr 20 22:24:10.817169 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:10.817126 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4f565" podStartSLOduration=2.96594699 podStartE2EDuration="23.817114157s" podCreationTimestamp="2026-04-20 22:23:47 +0000 UTC" firstStartedPulling="2026-04-20 22:23:48.907309473 +0000 UTC m=+1.860264070" lastFinishedPulling="2026-04-20 22:24:09.758476632 +0000 UTC m=+22.711431237" observedRunningTime="2026-04-20 22:24:10.817048387 +0000 UTC m=+23.770003009" watchObservedRunningTime="2026-04-20 22:24:10.817114157 +0000 UTC m=+23.770068775" Apr 20 22:24:12.596685 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:12.596655 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wb2c" Apr 20 22:24:12.597408 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:12.596655 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rl87j" Apr 20 22:24:12.597408 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:12.596755 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2wb2c" podUID="601775e4-554d-4221-907f-4a5d646c32e4" Apr 20 22:24:12.597408 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:12.596865 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rl87j" podUID="75df7794-7926-4023-a9fe-c8bb08e18219" Apr 20 22:24:12.812773 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:12.812472 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" event={"ID":"bffb7e6c-ecd8-45cd-a238-8bbc21a4553b","Type":"ContainerStarted","Data":"d466e81e6e212915159b3265e668c68b6e7da8cfa0ca51194b09774987072eaf"} Apr 20 22:24:12.813281 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:12.813175 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:24:12.813281 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:12.813248 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:24:12.813281 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:12.813276 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:24:12.830382 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:12.830299 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:24:12.831513 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:12.831379 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:24:12.840379 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:12.839785 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" podStartSLOduration=6.190588921 podStartE2EDuration="24.839768674s" podCreationTimestamp="2026-04-20 22:23:48 +0000 UTC" firstStartedPulling="2026-04-20 22:23:48.924235966 +0000 UTC m=+1.877190582" lastFinishedPulling="2026-04-20 22:24:07.573415735 +0000 UTC m=+20.526370335" observedRunningTime="2026-04-20 22:24:12.839001205 +0000 UTC m=+25.791955830" watchObservedRunningTime="2026-04-20 22:24:12.839768674 +0000 UTC m=+25.792723294" Apr 20 22:24:13.101926 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:13.101895 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-n2bhm" Apr 20 22:24:13.102650 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:13.102627 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-n2bhm" Apr 20 22:24:13.815701 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:13.815669 2573 generic.go:358] "Generic (PLEG): container finished" podID="d7ecb730-4be8-4cc2-86d1-47a71c9e25e7" containerID="22111d4f0529cb9265e6d729bb6642c8db64899749135e6208903655aa54925c" exitCode=0 Apr 20 22:24:13.816242 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:13.815752 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvjgp" 
event={"ID":"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7","Type":"ContainerDied","Data":"22111d4f0529cb9265e6d729bb6642c8db64899749135e6208903655aa54925c"} Apr 20 22:24:13.816317 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:13.816266 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-n2bhm" Apr 20 22:24:13.816368 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:13.816351 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-n2bhm" Apr 20 22:24:14.596026 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:14.595997 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rl87j" Apr 20 22:24:14.596166 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:14.596005 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wb2c" Apr 20 22:24:14.596166 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:14.596095 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rl87j" podUID="75df7794-7926-4023-a9fe-c8bb08e18219" Apr 20 22:24:14.596166 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:14.596154 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wb2c" podUID="601775e4-554d-4221-907f-4a5d646c32e4" Apr 20 22:24:14.678029 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:14.677953 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2wb2c"] Apr 20 22:24:14.680619 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:14.680594 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rl87j"] Apr 20 22:24:14.819405 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:14.819377 2573 generic.go:358] "Generic (PLEG): container finished" podID="d7ecb730-4be8-4cc2-86d1-47a71c9e25e7" containerID="139df720f51e6efbf45b11c9588ae1ec4595112d314b9fc66d9e48948e7cc4d7" exitCode=0 Apr 20 22:24:14.819835 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:14.819471 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvjgp" event={"ID":"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7","Type":"ContainerDied","Data":"139df720f51e6efbf45b11c9588ae1ec4595112d314b9fc66d9e48948e7cc4d7"} Apr 20 22:24:14.819835 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:14.819720 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rl87j" Apr 20 22:24:14.819835 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:14.819828 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-rl87j" podUID="75df7794-7926-4023-a9fe-c8bb08e18219" Apr 20 22:24:14.820040 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:14.819839 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wb2c" Apr 20 22:24:14.820040 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:14.819918 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wb2c" podUID="601775e4-554d-4221-907f-4a5d646c32e4" Apr 20 22:24:15.823304 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:15.823270 2573 generic.go:358] "Generic (PLEG): container finished" podID="d7ecb730-4be8-4cc2-86d1-47a71c9e25e7" containerID="89769fcaef48d266f15312e1b609b2f42a81cc7dbb0d87b0724df951516c80c5" exitCode=0 Apr 20 22:24:15.823667 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:15.823331 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvjgp" event={"ID":"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7","Type":"ContainerDied","Data":"89769fcaef48d266f15312e1b609b2f42a81cc7dbb0d87b0724df951516c80c5"} Apr 20 22:24:16.596700 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:16.596622 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wb2c" Apr 20 22:24:16.596867 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:16.596758 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wb2c" podUID="601775e4-554d-4221-907f-4a5d646c32e4" Apr 20 22:24:16.596984 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:16.596627 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rl87j" Apr 20 22:24:16.597118 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:16.597095 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rl87j" podUID="75df7794-7926-4023-a9fe-c8bb08e18219" Apr 20 22:24:18.596338 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:18.596303 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-rl87j" Apr 20 22:24:18.596944 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:18.596304 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wb2c" Apr 20 22:24:18.596944 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:18.596423 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-rl87j" podUID="75df7794-7926-4023-a9fe-c8bb08e18219" Apr 20 22:24:18.596944 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:18.596513 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2wb2c" podUID="601775e4-554d-4221-907f-4a5d646c32e4" Apr 20 22:24:20.365898 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.365807 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-91.ec2.internal" event="NodeReady" Apr 20 22:24:20.366465 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.365958 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 22:24:20.391798 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.391771 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs\") pod \"network-metrics-daemon-rl87j\" (UID: \"75df7794-7926-4023-a9fe-c8bb08e18219\") " pod="openshift-multus/network-metrics-daemon-rl87j" Apr 20 22:24:20.391971 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.391880 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:24:20.391971 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.391934 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs podName:75df7794-7926-4023-a9fe-c8bb08e18219 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:52.391919576 +0000 UTC m=+65.344874176 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs") pod "network-metrics-daemon-rl87j" (UID: "75df7794-7926-4023-a9fe-c8bb08e18219") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 22:24:20.404174 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.403892 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-xnj54"] Apr 20 22:24:20.436277 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.434785 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5c69c58687-c7dk5"] Apr 20 22:24:20.453182 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.453151 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg"] Apr 20 22:24:20.453332 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.453237 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" Apr 20 22:24:20.453401 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.453330 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.455869 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.455831 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 20 22:24:20.455995 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.455985 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 20 22:24:20.456231 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.456215 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 22:24:20.456669 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.456650 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 22:24:20.456770 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.456699 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 22:24:20.457129 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.457106 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-rsv8l\"" Apr 20 22:24:20.457874 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.457231 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6fksq\"" Apr 20 22:24:20.457874 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.457641 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 20 22:24:20.457874 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.457821 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 20 22:24:20.465305 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.465284 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 22:24:20.471266 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.471245 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 20 22:24:20.474511 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.474489 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dmsjq"] Apr 20 22:24:20.474629 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.474616 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg" Apr 20 22:24:20.476839 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.476821 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 20 22:24:20.477128 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.477110 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 20 22:24:20.477221 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.477139 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-bvhfk\"" Apr 20 22:24:20.477221 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.477146 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 20 22:24:20.489004 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.488986 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8"] Apr 20 22:24:20.489149 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.489129 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dmsjq" Apr 20 22:24:20.492146 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.492126 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svn47\" (UniqueName: \"kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47\") pod \"network-check-target-2wb2c\" (UID: \"601775e4-554d-4221-907f-4a5d646c32e4\") " pod="openshift-network-diagnostics/network-check-target-2wb2c" Apr 20 22:24:20.492292 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.492270 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 20 22:24:20.492398 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.492217 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 20 22:24:20.492398 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.492315 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 22:24:20.492398 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.492334 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 22:24:20.492398 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.492244 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 20 22:24:20.492398 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.492348 2573 projected.go:194] Error preparing data for projected volume kube-api-access-svn47 for pod openshift-network-diagnostics/network-check-target-2wb2c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:20.492635 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.492217 2573 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 20 22:24:20.492635 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.492432 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47 podName:601775e4-554d-4221-907f-4a5d646c32e4 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:52.492401111 +0000 UTC m=+65.445355709 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-svn47" (UniqueName: "kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47") pod "network-check-target-2wb2c" (UID: "601775e4-554d-4221-907f-4a5d646c32e4") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 22:24:20.492635 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.492272 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-ljlgt\"" Apr 20 22:24:20.508635 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.508616 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rfgkr"] Apr 20 22:24:20.508796 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.508778 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8" Apr 20 22:24:20.512017 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.511999 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-qxkk8\"" Apr 20 22:24:20.514700 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.512987 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 22:24:20.515448 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.515425 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 22:24:20.515975 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.515948 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 20 22:24:20.516385 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.516359 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 20 22:24:20.526429 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.526411 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fzjvb"] Apr 20 22:24:20.526575 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.526549 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rfgkr" Apr 20 22:24:20.529050 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.529032 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-zhcnc\"" Apr 20 22:24:20.529201 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.529058 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 20 22:24:20.529201 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.529066 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 20 22:24:20.541639 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.541620 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-7n7hh"] Apr 20 22:24:20.541785 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.541766 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fzjvb" Apr 20 22:24:20.544145 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.544118 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 20 22:24:20.544252 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.544117 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 20 22:24:20.544252 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.544210 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 20 22:24:20.544362 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.544281 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-tjpwn\"" Apr 20 22:24:20.544554 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.544537 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 20 22:24:20.553166 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.553146 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-649dd96b44-r7c7b"] Apr 20 22:24:20.553318 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.553297 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.556008 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.555985 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-wtg4p\"" Apr 20 22:24:20.556104 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.556062 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 22:24:20.556104 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.556071 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 20 22:24:20.556216 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.556122 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 20 22:24:20.556216 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.556137 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 22:24:20.560521 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.560501 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 20 22:24:20.566356 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.566334 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-84455b6c98-44svx"] Apr 20 22:24:20.566474 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.566452 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.580098 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.580071 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-g56dv"] Apr 20 22:24:20.580243 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.580224 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:20.583242 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.583223 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 20 22:24:20.583242 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.583236 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 20 22:24:20.583408 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.583245 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 22:24:20.583408 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.583248 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 22:24:20.583408 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.583304 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 20 22:24:20.583408 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.583255 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-rnl2b\"" Apr 20 22:24:20.583408 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.583327 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 20 22:24:20.592612 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.592592 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x"] Apr 20 22:24:20.592723 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.592688 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g56dv" Apr 20 22:24:20.592974 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.592889 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c475df6-d751-4f10-81c7-a1e56dec9176-serving-cert\") pod \"console-operator-9d4b6777b-xnj54\" (UID: \"4c475df6-d751-4f10-81c7-a1e56dec9176\") " pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" Apr 20 22:24:20.592974 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.592924 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-98hzg\" (UID: \"3a9bfbef-0d72-430e-b23e-bb50623a7093\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg" Apr 20 22:24:20.593128 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.592974 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjf82\" (UniqueName: \"kubernetes.io/projected/4c475df6-d751-4f10-81c7-a1e56dec9176-kube-api-access-cjf82\") pod \"console-operator-9d4b6777b-xnj54\" (UID: \"4c475df6-d751-4f10-81c7-a1e56dec9176\") " pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" Apr 20 22:24:20.593128 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.593016 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e184abd3-491a-42d4-baec-feffd1648520-image-registry-private-configuration\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.593128 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.593059 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c475df6-d751-4f10-81c7-a1e56dec9176-config\") pod \"console-operator-9d4b6777b-xnj54\" (UID: \"4c475df6-d751-4f10-81c7-a1e56dec9176\") " pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" Apr 20 22:24:20.593128 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.593082 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.593128 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.593101 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e184abd3-491a-42d4-baec-feffd1648520-registry-certificates\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.593390 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.593135 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/e184abd3-491a-42d4-baec-feffd1648520-ca-trust-extracted\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.593390 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.593164 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c06378d7-946b-49c3-ac21-44605e27cdd5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lxvk8\" (UID: \"c06378d7-946b-49c3-ac21-44605e27cdd5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8" Apr 20 22:24:20.593390 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.593200 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c475df6-d751-4f10-81c7-a1e56dec9176-trusted-ca\") pod \"console-operator-9d4b6777b-xnj54\" (UID: \"4c475df6-d751-4f10-81c7-a1e56dec9176\") " pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" Apr 20 22:24:20.593390 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.593223 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e184abd3-491a-42d4-baec-feffd1648520-trusted-ca\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.593390 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.593278 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-bound-sa-token\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.593390 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.593302 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m69f6\" (UniqueName: \"kubernetes.io/projected/c06378d7-946b-49c3-ac21-44605e27cdd5-kube-api-access-m69f6\") pod \"cluster-monitoring-operator-75587bd455-lxvk8\" (UID: \"c06378d7-946b-49c3-ac21-44605e27cdd5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8" Apr 20 22:24:20.593390 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.593368 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e184abd3-491a-42d4-baec-feffd1648520-installation-pull-secrets\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.593718 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.593398 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeb4d8a0-f553-47b6-8134-40d74089fd72-serving-cert\") pod \"service-ca-operator-d6fc45fc5-dmsjq\" (UID: \"eeb4d8a0-f553-47b6-8134-40d74089fd72\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dmsjq" Apr 20 22:24:20.593718 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.593425 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeb4d8a0-f553-47b6-8134-40d74089fd72-config\") pod \"service-ca-operator-d6fc45fc5-dmsjq\" (UID: \"eeb4d8a0-f553-47b6-8134-40d74089fd72\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dmsjq" Apr 20 22:24:20.593718 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.593473 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtsmk\" (UniqueName: \"kubernetes.io/projected/3a9bfbef-0d72-430e-b23e-bb50623a7093-kube-api-access-vtsmk\") pod \"cluster-samples-operator-6dc5bdb6b4-98hzg\" (UID: \"3a9bfbef-0d72-430e-b23e-bb50623a7093\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg" Apr 20 22:24:20.593718 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.593522 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq6d5\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-kube-api-access-nq6d5\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.593718 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.593547 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q7tt\" (UniqueName: \"kubernetes.io/projected/eeb4d8a0-f553-47b6-8134-40d74089fd72-kube-api-access-4q7tt\") pod \"service-ca-operator-d6fc45fc5-dmsjq\" (UID: \"eeb4d8a0-f553-47b6-8134-40d74089fd72\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dmsjq" Apr 20 22:24:20.593718 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.593573 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lxvk8\" (UID: \"c06378d7-946b-49c3-ac21-44605e27cdd5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8" Apr 20 22:24:20.595215 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.595194 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 22:24:20.595318 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.595219 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 22:24:20.595318 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.595282 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-6whsq\"" Apr 20 22:24:20.608747 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.608702 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk"] Apr 20 22:24:20.608881 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.608835 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x" Apr 20 22:24:20.611582 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.611561 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 20 22:24:20.611732 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.611581 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-5bcmr\"" Apr 20 22:24:20.611732 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.611561 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 22:24:20.612020 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.611997 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 22:24:20.612100 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.612024 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 22:24:20.626538 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.626495 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b966f6456-6r7jb"] Apr 20 22:24:20.626633 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.626619 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wb2c" Apr 20 22:24:20.626673 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.626626 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.626823 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.626807 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rl87j" Apr 20 22:24:20.629573 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.629544 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-66fzs\"" Apr 20 22:24:20.629688 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.629549 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 20 22:24:20.629688 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.629603 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 20 22:24:20.629688 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.629679 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 22:24:20.629843 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.629695 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 20 22:24:20.629985 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.629966 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 20 22:24:20.629985 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.629978 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ddtmq\"" Apr 20 22:24:20.642363 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.642258 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-xnj54"] Apr 20 22:24:20.642363 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.642285 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dmsjq"] Apr 20 22:24:20.642363 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.642299 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg"] Apr 20 22:24:20.642363 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.642315 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5c69c58687-c7dk5"] Apr 20 22:24:20.642363 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.642327 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rfgkr"] Apr 20 22:24:20.642363 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.642340 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-g56dv"] Apr 20 22:24:20.642363 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.642352 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk"] Apr 20 22:24:20.642363 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.642365 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-84455b6c98-44svx"] Apr 20 22:24:20.642779 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.642375 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-649dd96b44-r7c7b"] Apr 20 22:24:20.642779 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.642388 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-d5s8x"] Apr 20 22:24:20.642779 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.642642 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b966f6456-6r7jb" Apr 20 22:24:20.645100 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.645081 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 20 22:24:20.659943 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.659924 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x"] Apr 20 22:24:20.660057 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.659952 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-d5s8x"] Apr 20 22:24:20.660057 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.659965 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-7n7hh"] Apr 20 22:24:20.660057 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.659976 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8"] Apr 20 22:24:20.660057 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.659986 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fzjvb"] Apr 20 22:24:20.660057 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.659995 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b966f6456-6r7jb"] Apr 20 22:24:20.660057 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.660021 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fj4gp"] Apr 20 22:24:20.660370 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.660068 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-d5s8x" Apr 20 22:24:20.662682 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.662650 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 22:24:20.663036 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.662785 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 22:24:20.663302 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.663281 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bv5w5\"" Apr 20 22:24:20.663402 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.663310 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 22:24:20.678761 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.678743 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fj4gp"] Apr 20 22:24:20.678925 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.678910 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fj4gp" Apr 20 22:24:20.681237 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.681190 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 22:24:20.681411 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.681393 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 22:24:20.681498 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.681464 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bl4mg\"" Apr 20 22:24:20.694146 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.694125 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-trusted-ca\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.694241 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.694149 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:20.694241 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.694172 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c475df6-d751-4f10-81c7-a1e56dec9176-serving-cert\") pod \"console-operator-9d4b6777b-xnj54\" (UID: \"4c475df6-d751-4f10-81c7-a1e56dec9176\") " pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" Apr 20 22:24:20.694241 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.694224 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9tq4\" (UniqueName: \"kubernetes.io/projected/ea02661d-e4a4-469a-9451-7f11a7db90d2-kube-api-access-z9tq4\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-fzjvb\" (UID: \"ea02661d-e4a4-469a-9451-7f11a7db90d2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fzjvb" Apr 20 22:24:20.694369 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.694253 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-installation-pull-secrets\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.694369 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.694280 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zll55\" (UniqueName: \"kubernetes.io/projected/66ac7746-d497-453d-a7c7-08406f8f7baa-kube-api-access-zll55\") pod \"managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x\" (UID: \"66ac7746-d497-453d-a7c7-08406f8f7baa\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x" Apr 20 22:24:20.694369 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.694313 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-certificates\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.694369 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.694355 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea02661d-e4a4-469a-9451-7f11a7db90d2-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-fzjvb\" (UID: \"ea02661d-e4a4-469a-9451-7f11a7db90d2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fzjvb" Apr 20 22:24:20.694587 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.694433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c475df6-d751-4f10-81c7-a1e56dec9176-config\") pod \"console-operator-9d4b6777b-xnj54\" (UID: \"4c475df6-d751-4f10-81c7-a1e56dec9176\") " pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" Apr 20 22:24:20.694587 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.694476 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-bound-sa-token\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.694587 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.694506 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bb95d71c-3b6d-407a-9ff3-a70562af1b93-snapshots\") pod \"insights-operator-585dfdc468-7n7hh\" (UID: \"bb95d71c-3b6d-407a-9ff3-a70562af1b93\") " pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.694587 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.694547 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e184abd3-491a-42d4-baec-feffd1648520-ca-trust-extracted\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.694587 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.694574 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.694793 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.694631 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd65m\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-kube-api-access-pd65m\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.694793 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.694664 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb95d71c-3b6d-407a-9ff3-a70562af1b93-serving-cert\") pod \"insights-operator-585dfdc468-7n7hh\" (UID: \"bb95d71c-3b6d-407a-9ff3-a70562af1b93\") " pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.694793 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.694732 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-bound-sa-token\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.695015 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.694996 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e184abd3-491a-42d4-baec-feffd1648520-ca-trust-extracted\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.695015 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695008 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/66ac7746-d497-453d-a7c7-08406f8f7baa-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x\" (UID: \"66ac7746-d497-453d-a7c7-08406f8f7baa\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x" Apr 20 22:24:20.695127 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695059 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-default-certificate\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:20.695127 ip-10-0-130-91 kubenswrapper[2573]: 
I0420 22:24:20.695097 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nq6d5\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-kube-api-access-nq6d5\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.695127 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695121 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4q7tt\" (UniqueName: \"kubernetes.io/projected/eeb4d8a0-f553-47b6-8134-40d74089fd72-kube-api-access-4q7tt\") pod \"service-ca-operator-d6fc45fc5-dmsjq\" (UID: \"eeb4d8a0-f553-47b6-8134-40d74089fd72\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dmsjq" Apr 20 22:24:20.695262 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695145 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e184abd3-491a-42d4-baec-feffd1648520-installation-pull-secrets\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.695262 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695175 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtsmk\" (UniqueName: \"kubernetes.io/projected/3a9bfbef-0d72-430e-b23e-bb50623a7093-kube-api-access-vtsmk\") pod \"cluster-samples-operator-6dc5bdb6b4-98hzg\" (UID: \"3a9bfbef-0d72-430e-b23e-bb50623a7093\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg" Apr 20 22:24:20.695262 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695222 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lxvk8\" (UID: \"c06378d7-946b-49c3-ac21-44605e27cdd5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8" Apr 20 22:24:20.695262 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695250 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea02661d-e4a4-469a-9451-7f11a7db90d2-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-fzjvb\" (UID: \"ea02661d-e4a4-469a-9451-7f11a7db90d2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fzjvb" Apr 20 22:24:20.695385 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695264 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c475df6-d751-4f10-81c7-a1e56dec9176-config\") pod \"console-operator-9d4b6777b-xnj54\" (UID: \"4c475df6-d751-4f10-81c7-a1e56dec9176\") " pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" Apr 20 22:24:20.695385 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695276 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-ca-trust-extracted\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " 
pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.695385 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695302 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-stats-auth\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:20.695385 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695340 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bb95d71c-3b6d-407a-9ff3-a70562af1b93-tmp\") pod \"insights-operator-585dfdc468-7n7hh\" (UID: \"bb95d71c-3b6d-407a-9ff3-a70562af1b93\") " pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.695385 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.695354 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 22:24:20.695385 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695380 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-98hzg\" (UID: \"3a9bfbef-0d72-430e-b23e-bb50623a7093\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg" Apr 20 22:24:20.695554 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695409 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb95d71c-3b6d-407a-9ff3-a70562af1b93-service-ca-bundle\") pod \"insights-operator-585dfdc468-7n7hh\" (UID: \"bb95d71c-3b6d-407a-9ff3-a70562af1b93\") " pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.695554 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.695454 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls podName:c06378d7-946b-49c3-ac21-44605e27cdd5 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:21.195433072 +0000 UTC m=+34.148387677 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lxvk8" (UID: "c06378d7-946b-49c3-ac21-44605e27cdd5") : secret "cluster-monitoring-operator-tls" not found Apr 20 22:24:20.695554 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695474 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:20.695554 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695509 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4pzk\" (UniqueName: \"kubernetes.io/projected/06591b6d-348e-44ef-8a2d-4112e4d70e60-kube-api-access-z4pzk\") pod \"network-check-source-8894fc9bd-g56dv\" (UID: \"06591b6d-348e-44ef-8a2d-4112e4d70e60\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g56dv" Apr 20 22:24:20.695554 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.695511 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 22:24:20.695554 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695537 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkzf6\" (UniqueName: \"kubernetes.io/projected/bb95d71c-3b6d-407a-9ff3-a70562af1b93-kube-api-access-lkzf6\") pod \"insights-operator-585dfdc468-7n7hh\" (UID: \"bb95d71c-3b6d-407a-9ff3-a70562af1b93\") " pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.695825 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.695572 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls podName:3a9bfbef-0d72-430e-b23e-bb50623a7093 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:21.195555085 +0000 UTC m=+34.148509687 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-98hzg" (UID: "3a9bfbef-0d72-430e-b23e-bb50623a7093") : secret "samples-operator-tls" not found Apr 20 22:24:20.695825 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695604 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cjf82\" (UniqueName: \"kubernetes.io/projected/4c475df6-d751-4f10-81c7-a1e56dec9176-kube-api-access-cjf82\") pod \"console-operator-9d4b6777b-xnj54\" (UID: \"4c475df6-d751-4f10-81c7-a1e56dec9176\") " pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" Apr 20 22:24:20.695825 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695628 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e184abd3-491a-42d4-baec-feffd1648520-image-registry-private-configuration\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.695825 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695666 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v45zd\" (UniqueName: \"kubernetes.io/projected/7c99a639-1f48-429a-a14e-800ce227becb-kube-api-access-v45zd\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:20.695825 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695696 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.695825 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695736 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e184abd3-491a-42d4-baec-feffd1648520-registry-certificates\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.695825 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695803 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c06378d7-946b-49c3-ac21-44605e27cdd5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lxvk8\" (UID: \"c06378d7-946b-49c3-ac21-44605e27cdd5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8" Apr 20 22:24:20.696195 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695835 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c475df6-d751-4f10-81c7-a1e56dec9176-trusted-ca\") pod \"console-operator-9d4b6777b-xnj54\" (UID: \"4c475df6-d751-4f10-81c7-a1e56dec9176\") " pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" Apr 20 22:24:20.696195 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695878 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e184abd3-491a-42d4-baec-feffd1648520-trusted-ca\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.696195 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695932 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m69f6\" (UniqueName: \"kubernetes.io/projected/c06378d7-946b-49c3-ac21-44605e27cdd5-kube-api-access-m69f6\") pod \"cluster-monitoring-operator-75587bd455-lxvk8\" (UID: \"c06378d7-946b-49c3-ac21-44605e27cdd5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8" Apr 20 22:24:20.696195 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695966 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeb4d8a0-f553-47b6-8134-40d74089fd72-serving-cert\") pod \"service-ca-operator-d6fc45fc5-dmsjq\" (UID: \"eeb4d8a0-f553-47b6-8134-40d74089fd72\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dmsjq" Apr 20 22:24:20.696195 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.695995 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-image-registry-private-configuration\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.696195 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.696023 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeb4d8a0-f553-47b6-8134-40d74089fd72-config\") pod \"service-ca-operator-d6fc45fc5-dmsjq\" (UID: \"eeb4d8a0-f553-47b6-8134-40d74089fd72\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dmsjq" Apr 20 22:24:20.696195 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.696050 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb95d71c-3b6d-407a-9ff3-a70562af1b93-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-7n7hh\" (UID: \"bb95d71c-3b6d-407a-9ff3-a70562af1b93\") " pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.696195 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.696071 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56qg2\" (UniqueName: \"kubernetes.io/projected/7c73411b-82b9-42c4-bbc3-edc35e00606d-kube-api-access-56qg2\") pod \"volume-data-source-validator-7c6cbb6c87-rfgkr\" (UID: \"7c73411b-82b9-42c4-bbc3-edc35e00606d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rfgkr" Apr 20 22:24:20.696195 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.695809 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 22:24:20.696195 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.696131 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c69c58687-c7dk5: secret "image-registry-tls" not found Apr 20 22:24:20.696195 ip-10-0-130-91 
kubenswrapper[2573]: E0420 22:24:20.696171 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls podName:e184abd3-491a-42d4-baec-feffd1648520 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:21.196160489 +0000 UTC m=+34.149115086 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls") pod "image-registry-5c69c58687-c7dk5" (UID: "e184abd3-491a-42d4-baec-feffd1648520") : secret "image-registry-tls" not found Apr 20 22:24:20.696772 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.696298 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e184abd3-491a-42d4-baec-feffd1648520-registry-certificates\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.696772 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.696550 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c06378d7-946b-49c3-ac21-44605e27cdd5-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-lxvk8\" (UID: \"c06378d7-946b-49c3-ac21-44605e27cdd5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8" Apr 20 22:24:20.696909 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.696811 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eeb4d8a0-f553-47b6-8134-40d74089fd72-config\") pod \"service-ca-operator-d6fc45fc5-dmsjq\" (UID: \"eeb4d8a0-f553-47b6-8134-40d74089fd72\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dmsjq" Apr 20 22:24:20.697074 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.697030 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4c475df6-d751-4f10-81c7-a1e56dec9176-trusted-ca\") pod \"console-operator-9d4b6777b-xnj54\" (UID: \"4c475df6-d751-4f10-81c7-a1e56dec9176\") " pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" Apr 20 22:24:20.699729 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.699707 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e184abd3-491a-42d4-baec-feffd1648520-image-registry-private-configuration\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.700005 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.699982 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eeb4d8a0-f553-47b6-8134-40d74089fd72-serving-cert\") pod \"service-ca-operator-d6fc45fc5-dmsjq\" (UID: \"eeb4d8a0-f553-47b6-8134-40d74089fd72\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dmsjq" Apr 20 22:24:20.700236 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.700216 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c475df6-d751-4f10-81c7-a1e56dec9176-serving-cert\") pod \"console-operator-9d4b6777b-xnj54\" (UID: 
\"4c475df6-d751-4f10-81c7-a1e56dec9176\") " pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" Apr 20 22:24:20.700306 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.700234 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e184abd3-491a-42d4-baec-feffd1648520-installation-pull-secrets\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.705497 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.705451 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq6d5\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-kube-api-access-nq6d5\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.705601 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.705568 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjf82\" (UniqueName: \"kubernetes.io/projected/4c475df6-d751-4f10-81c7-a1e56dec9176-kube-api-access-cjf82\") pod \"console-operator-9d4b6777b-xnj54\" (UID: \"4c475df6-d751-4f10-81c7-a1e56dec9176\") " pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" Apr 20 22:24:20.706529 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.706506 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4q7tt\" (UniqueName: \"kubernetes.io/projected/eeb4d8a0-f553-47b6-8134-40d74089fd72-kube-api-access-4q7tt\") pod \"service-ca-operator-d6fc45fc5-dmsjq\" (UID: \"eeb4d8a0-f553-47b6-8134-40d74089fd72\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dmsjq" Apr 20 22:24:20.706627 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.706517 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtsmk\" (UniqueName: \"kubernetes.io/projected/3a9bfbef-0d72-430e-b23e-bb50623a7093-kube-api-access-vtsmk\") pod \"cluster-samples-operator-6dc5bdb6b4-98hzg\" (UID: \"3a9bfbef-0d72-430e-b23e-bb50623a7093\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg" Apr 20 22:24:20.706627 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.706576 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m69f6\" (UniqueName: \"kubernetes.io/projected/c06378d7-946b-49c3-ac21-44605e27cdd5-kube-api-access-m69f6\") pod \"cluster-monitoring-operator-75587bd455-lxvk8\" (UID: \"c06378d7-946b-49c3-ac21-44605e27cdd5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8" Apr 20 22:24:20.707496 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.707473 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-bound-sa-token\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.710263 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.710243 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e184abd3-491a-42d4-baec-feffd1648520-trusted-ca\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") 
" pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:20.767098 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.767066 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" Apr 20 22:24:20.796682 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.796642 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-trusted-ca\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.796832 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.796691 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:20.796832 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.796724 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert\") pod \"ingress-canary-d5s8x\" (UID: \"82357e1f-f9a8-4cf7-b3dd-fe77912c49a1\") " pod="openshift-ingress-canary/ingress-canary-d5s8x" Apr 20 22:24:20.796832 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.796754 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9tq4\" (UniqueName: \"kubernetes.io/projected/ea02661d-e4a4-469a-9451-7f11a7db90d2-kube-api-access-z9tq4\") pod \"kube-storage-version-migrator-operator-6769c5d45-fzjvb\" (UID: \"ea02661d-e4a4-469a-9451-7f11a7db90d2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fzjvb" Apr 20 22:24:20.796832 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.796783 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-installation-pull-secrets\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.797051 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.796873 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 22:24:20.797051 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.796900 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zll55\" (UniqueName: \"kubernetes.io/projected/66ac7746-d497-453d-a7c7-08406f8f7baa-kube-api-access-zll55\") pod \"managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x\" (UID: \"66ac7746-d497-453d-a7c7-08406f8f7baa\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x" Apr 20 22:24:20.797051 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.796945 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs podName:7c99a639-1f48-429a-a14e-800ce227becb nodeName:}" failed. 
No retries permitted until 2026-04-20 22:24:21.296923931 +0000 UTC m=+34.249878530 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs") pod "router-default-84455b6c98-44svx" (UID: "7c99a639-1f48-429a-a14e-800ce227becb") : secret "router-metrics-certs-default" not found Apr 20 22:24:20.798217 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.797714 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ddc6811c-1bae-4d19-8bc7-a04e9df30ebf-klusterlet-config\") pod \"klusterlet-addon-workmgr-7b966f6456-6r7jb\" (UID: \"ddc6811c-1bae-4d19-8bc7-a04e9df30ebf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b966f6456-6r7jb" Apr 20 22:24:20.798217 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.797784 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-certificates\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.798217 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.797879 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea02661d-e4a4-469a-9451-7f11a7db90d2-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-fzjvb\" (UID: \"ea02661d-e4a4-469a-9451-7f11a7db90d2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fzjvb" Apr 20 22:24:20.798217 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.797925 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/908bd97a-6313-4646-ab52-90bb7ffefdaa-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk\" (UID: \"908bd97a-6313-4646-ab52-90bb7ffefdaa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.798217 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.797965 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-bound-sa-token\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.798217 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.797999 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bb95d71c-3b6d-407a-9ff3-a70562af1b93-snapshots\") pod \"insights-operator-585dfdc468-7n7hh\" (UID: \"bb95d71c-3b6d-407a-9ff3-a70562af1b93\") " pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.798217 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.798033 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 
22:24:20.798217 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.798079 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsd94\" (UniqueName: \"kubernetes.io/projected/ddc6811c-1bae-4d19-8bc7-a04e9df30ebf-kube-api-access-wsd94\") pod \"klusterlet-addon-workmgr-7b966f6456-6r7jb\" (UID: \"ddc6811c-1bae-4d19-8bc7-a04e9df30ebf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b966f6456-6r7jb" Apr 20 22:24:20.798217 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.798121 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pd65m\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-kube-api-access-pd65m\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.798217 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.798160 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb95d71c-3b6d-407a-9ff3-a70562af1b93-serving-cert\") pod \"insights-operator-585dfdc468-7n7hh\" (UID: \"bb95d71c-3b6d-407a-9ff3-a70562af1b93\") " pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.798217 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.798194 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls\") pod \"dns-default-fj4gp\" (UID: \"e4ed632e-0c77-4b80-b076-66bdfd17da84\") " pod="openshift-dns/dns-default-fj4gp" Apr 20 22:24:20.799411 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.798927 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/66ac7746-d497-453d-a7c7-08406f8f7baa-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x\" (UID: \"66ac7746-d497-453d-a7c7-08406f8f7baa\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x" Apr 20 22:24:20.799411 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.798971 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-default-certificate\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:20.799411 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.798981 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dmsjq" Apr 20 22:24:20.799411 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.798986 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bb95d71c-3b6d-407a-9ff3-a70562af1b93-snapshots\") pod \"insights-operator-585dfdc468-7n7hh\" (UID: \"bb95d71c-3b6d-407a-9ff3-a70562af1b93\") " pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.799411 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.799016 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftz7n\" (UniqueName: \"kubernetes.io/projected/e4ed632e-0c77-4b80-b076-66bdfd17da84-kube-api-access-ftz7n\") pod \"dns-default-fj4gp\" (UID: \"e4ed632e-0c77-4b80-b076-66bdfd17da84\") " pod="openshift-dns/dns-default-fj4gp" Apr 20 22:24:20.799411 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.799160 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddc6811c-1bae-4d19-8bc7-a04e9df30ebf-tmp\") pod \"klusterlet-addon-workmgr-7b966f6456-6r7jb\" (UID: \"ddc6811c-1bae-4d19-8bc7-a04e9df30ebf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b966f6456-6r7jb" Apr 20 22:24:20.799411 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.799196 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/908bd97a-6313-4646-ab52-90bb7ffefdaa-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk\" (UID: \"908bd97a-6313-4646-ab52-90bb7ffefdaa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.799411 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.799248 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea02661d-e4a4-469a-9451-7f11a7db90d2-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-fzjvb\" (UID: \"ea02661d-e4a4-469a-9451-7f11a7db90d2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fzjvb" Apr 20 22:24:20.799411 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.798670 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-trusted-ca\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.799411 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.799283 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-ca-trust-extracted\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.799411 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.798705 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea02661d-e4a4-469a-9451-7f11a7db90d2-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-fzjvb\" (UID: 
\"ea02661d-e4a4-469a-9451-7f11a7db90d2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fzjvb" Apr 20 22:24:20.799411 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.799316 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-stats-auth\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:20.799411 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.799374 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bb95d71c-3b6d-407a-9ff3-a70562af1b93-tmp\") pod \"insights-operator-585dfdc468-7n7hh\" (UID: \"bb95d71c-3b6d-407a-9ff3-a70562af1b93\") " pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.800126 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.799424 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb95d71c-3b6d-407a-9ff3-a70562af1b93-service-ca-bundle\") pod \"insights-operator-585dfdc468-7n7hh\" (UID: \"bb95d71c-3b6d-407a-9ff3-a70562af1b93\") " pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.800126 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.798763 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 22:24:20.800126 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.799457 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-649dd96b44-r7c7b: secret "image-registry-tls" not found Apr 20 22:24:20.800126 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.799524 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls podName:bf5aed09-1fd8-4294-aef5-ee13e17b2bf3 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:21.299500124 +0000 UTC m=+34.252454722 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls") pod "image-registry-649dd96b44-r7c7b" (UID: "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3") : secret "image-registry-tls" not found Apr 20 22:24:20.800126 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.799801 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bb95d71c-3b6d-407a-9ff3-a70562af1b93-tmp\") pod \"insights-operator-585dfdc468-7n7hh\" (UID: \"bb95d71c-3b6d-407a-9ff3-a70562af1b93\") " pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.800377 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.800362 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb95d71c-3b6d-407a-9ff3-a70562af1b93-service-ca-bundle\") pod \"insights-operator-585dfdc468-7n7hh\" (UID: \"bb95d71c-3b6d-407a-9ff3-a70562af1b93\") " pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.801298 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.800660 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-ca-trust-extracted\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.801298 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.800720 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:20.801298 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.800763 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4pzk\" (UniqueName: \"kubernetes.io/projected/06591b6d-348e-44ef-8a2d-4112e4d70e60-kube-api-access-z4pzk\") pod \"network-check-source-8894fc9bd-g56dv\" (UID: \"06591b6d-348e-44ef-8a2d-4112e4d70e60\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g56dv" Apr 20 22:24:20.801298 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.800799 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkzf6\" (UniqueName: \"kubernetes.io/projected/bb95d71c-3b6d-407a-9ff3-a70562af1b93-kube-api-access-lkzf6\") pod \"insights-operator-585dfdc468-7n7hh\" (UID: \"bb95d71c-3b6d-407a-9ff3-a70562af1b93\") " pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.801298 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.800872 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v45zd\" (UniqueName: \"kubernetes.io/projected/7c99a639-1f48-429a-a14e-800ce227becb-kube-api-access-v45zd\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:20.801298 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.800913 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: 
\"kubernetes.io/secret/908bd97a-6313-4646-ab52-90bb7ffefdaa-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk\" (UID: \"908bd97a-6313-4646-ab52-90bb7ffefdaa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.801298 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.800976 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/908bd97a-6313-4646-ab52-90bb7ffefdaa-ca\") pod \"cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk\" (UID: \"908bd97a-6313-4646-ab52-90bb7ffefdaa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.801298 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.801034 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e4ed632e-0c77-4b80-b076-66bdfd17da84-tmp-dir\") pod \"dns-default-fj4gp\" (UID: \"e4ed632e-0c77-4b80-b076-66bdfd17da84\") " pod="openshift-dns/dns-default-fj4gp" Apr 20 22:24:20.801298 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.801114 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrpp9\" (UniqueName: \"kubernetes.io/projected/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-kube-api-access-hrpp9\") pod \"ingress-canary-d5s8x\" (UID: \"82357e1f-f9a8-4cf7-b3dd-fe77912c49a1\") " pod="openshift-ingress-canary/ingress-canary-d5s8x" Apr 20 22:24:20.801298 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.801147 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4ed632e-0c77-4b80-b076-66bdfd17da84-config-volume\") pod \"dns-default-fj4gp\" (UID: \"e4ed632e-0c77-4b80-b076-66bdfd17da84\") " pod="openshift-dns/dns-default-fj4gp" Apr 20 22:24:20.801298 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.801180 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/908bd97a-6313-4646-ab52-90bb7ffefdaa-hub\") pod \"cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk\" (UID: \"908bd97a-6313-4646-ab52-90bb7ffefdaa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.801298 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.801219 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-image-registry-private-configuration\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.801298 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.801251 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb95d71c-3b6d-407a-9ff3-a70562af1b93-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-7n7hh\" (UID: \"bb95d71c-3b6d-407a-9ff3-a70562af1b93\") " pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.801298 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.801283 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-56qg2\" (UniqueName: 
\"kubernetes.io/projected/7c73411b-82b9-42c4-bbc3-edc35e00606d-kube-api-access-56qg2\") pod \"volume-data-source-validator-7c6cbb6c87-rfgkr\" (UID: \"7c73411b-82b9-42c4-bbc3-edc35e00606d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rfgkr" Apr 20 22:24:20.806648 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.801310 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbtnh\" (UniqueName: \"kubernetes.io/projected/908bd97a-6313-4646-ab52-90bb7ffefdaa-kube-api-access-nbtnh\") pod \"cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk\" (UID: \"908bd97a-6313-4646-ab52-90bb7ffefdaa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.806648 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.801746 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle podName:7c99a639-1f48-429a-a14e-800ce227becb nodeName:}" failed. No retries permitted until 2026-04-20 22:24:21.301729093 +0000 UTC m=+34.254683710 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle") pod "router-default-84455b6c98-44svx" (UID: "7c99a639-1f48-429a-a14e-800ce227becb") : configmap references non-existent config key: service-ca.crt Apr 20 22:24:20.806648 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.801803 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-installation-pull-secrets\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.806648 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.801802 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb95d71c-3b6d-407a-9ff3-a70562af1b93-serving-cert\") pod \"insights-operator-585dfdc468-7n7hh\" (UID: \"bb95d71c-3b6d-407a-9ff3-a70562af1b93\") " pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.806648 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.798487 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-certificates\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.806648 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.802570 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bb95d71c-3b6d-407a-9ff3-a70562af1b93-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-7n7hh\" (UID: \"bb95d71c-3b6d-407a-9ff3-a70562af1b93\") " pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.806648 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.803982 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-default-certificate\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " 
pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:20.806648 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.805620 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-image-registry-private-configuration\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.806973 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.806892 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-stats-auth\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:20.807063 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.807045 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/66ac7746-d497-453d-a7c7-08406f8f7baa-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x\" (UID: \"66ac7746-d497-453d-a7c7-08406f8f7baa\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x" Apr 20 22:24:20.808728 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.808334 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea02661d-e4a4-469a-9451-7f11a7db90d2-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-fzjvb\" (UID: \"ea02661d-e4a4-469a-9451-7f11a7db90d2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fzjvb" Apr 20 22:24:20.810969 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.810641 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9tq4\" (UniqueName: \"kubernetes.io/projected/ea02661d-e4a4-469a-9451-7f11a7db90d2-kube-api-access-z9tq4\") pod \"kube-storage-version-migrator-operator-6769c5d45-fzjvb\" (UID: \"ea02661d-e4a4-469a-9451-7f11a7db90d2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fzjvb" Apr 20 22:24:20.810969 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.810718 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zll55\" (UniqueName: \"kubernetes.io/projected/66ac7746-d497-453d-a7c7-08406f8f7baa-kube-api-access-zll55\") pod \"managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x\" (UID: \"66ac7746-d497-453d-a7c7-08406f8f7baa\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x" Apr 20 22:24:20.811538 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.811278 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v45zd\" (UniqueName: \"kubernetes.io/projected/7c99a639-1f48-429a-a14e-800ce227becb-kube-api-access-v45zd\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:20.811538 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.811489 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-bound-sa-token\") 
pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.812460 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.811878 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-56qg2\" (UniqueName: \"kubernetes.io/projected/7c73411b-82b9-42c4-bbc3-edc35e00606d-kube-api-access-56qg2\") pod \"volume-data-source-validator-7c6cbb6c87-rfgkr\" (UID: \"7c73411b-82b9-42c4-bbc3-edc35e00606d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rfgkr" Apr 20 22:24:20.812460 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.812414 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkzf6\" (UniqueName: \"kubernetes.io/projected/bb95d71c-3b6d-407a-9ff3-a70562af1b93-kube-api-access-lkzf6\") pod \"insights-operator-585dfdc468-7n7hh\" (UID: \"bb95d71c-3b6d-407a-9ff3-a70562af1b93\") " pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.812667 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.812627 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd65m\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-kube-api-access-pd65m\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:20.814078 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.814059 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4pzk\" (UniqueName: \"kubernetes.io/projected/06591b6d-348e-44ef-8a2d-4112e4d70e60-kube-api-access-z4pzk\") pod \"network-check-source-8894fc9bd-g56dv\" (UID: \"06591b6d-348e-44ef-8a2d-4112e4d70e60\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g56dv" Apr 20 22:24:20.835868 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.835825 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rfgkr" Apr 20 22:24:20.851772 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.851735 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fzjvb" Apr 20 22:24:20.863574 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.863547 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-7n7hh" Apr 20 22:24:20.902699 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.902628 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/908bd97a-6313-4646-ab52-90bb7ffefdaa-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk\" (UID: \"908bd97a-6313-4646-ab52-90bb7ffefdaa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.902699 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.902694 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wsd94\" (UniqueName: \"kubernetes.io/projected/ddc6811c-1bae-4d19-8bc7-a04e9df30ebf-kube-api-access-wsd94\") pod \"klusterlet-addon-workmgr-7b966f6456-6r7jb\" (UID: \"ddc6811c-1bae-4d19-8bc7-a04e9df30ebf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b966f6456-6r7jb" Apr 20 22:24:20.902936 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.902719 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls\") pod \"dns-default-fj4gp\" (UID: \"e4ed632e-0c77-4b80-b076-66bdfd17da84\") " pod="openshift-dns/dns-default-fj4gp" Apr 20 22:24:20.902936 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.902741 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftz7n\" (UniqueName: \"kubernetes.io/projected/e4ed632e-0c77-4b80-b076-66bdfd17da84-kube-api-access-ftz7n\") pod \"dns-default-fj4gp\" (UID: \"e4ed632e-0c77-4b80-b076-66bdfd17da84\") " pod="openshift-dns/dns-default-fj4gp" Apr 20 22:24:20.902936 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.902765 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddc6811c-1bae-4d19-8bc7-a04e9df30ebf-tmp\") pod \"klusterlet-addon-workmgr-7b966f6456-6r7jb\" (UID: \"ddc6811c-1bae-4d19-8bc7-a04e9df30ebf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b966f6456-6r7jb" Apr 20 22:24:20.902936 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.902790 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/908bd97a-6313-4646-ab52-90bb7ffefdaa-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk\" (UID: \"908bd97a-6313-4646-ab52-90bb7ffefdaa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.902936 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.902879 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/908bd97a-6313-4646-ab52-90bb7ffefdaa-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk\" (UID: \"908bd97a-6313-4646-ab52-90bb7ffefdaa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.902936 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.902884 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 22:24:20.902936 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.902924 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" 
(UniqueName: \"kubernetes.io/secret/908bd97a-6313-4646-ab52-90bb7ffefdaa-ca\") pod \"cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk\" (UID: \"908bd97a-6313-4646-ab52-90bb7ffefdaa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.903252 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.902956 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls podName:e4ed632e-0c77-4b80-b076-66bdfd17da84 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:21.402934683 +0000 UTC m=+34.355889288 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls") pod "dns-default-fj4gp" (UID: "e4ed632e-0c77-4b80-b076-66bdfd17da84") : secret "dns-default-metrics-tls" not found Apr 20 22:24:20.903252 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.902996 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e4ed632e-0c77-4b80-b076-66bdfd17da84-tmp-dir\") pod \"dns-default-fj4gp\" (UID: \"e4ed632e-0c77-4b80-b076-66bdfd17da84\") " pod="openshift-dns/dns-default-fj4gp" Apr 20 22:24:20.903252 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.903044 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hrpp9\" (UniqueName: \"kubernetes.io/projected/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-kube-api-access-hrpp9\") pod \"ingress-canary-d5s8x\" (UID: \"82357e1f-f9a8-4cf7-b3dd-fe77912c49a1\") " pod="openshift-ingress-canary/ingress-canary-d5s8x" Apr 20 22:24:20.903252 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.903070 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4ed632e-0c77-4b80-b076-66bdfd17da84-config-volume\") pod \"dns-default-fj4gp\" (UID: \"e4ed632e-0c77-4b80-b076-66bdfd17da84\") " pod="openshift-dns/dns-default-fj4gp" Apr 20 22:24:20.903252 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.903097 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/908bd97a-6313-4646-ab52-90bb7ffefdaa-hub\") pod \"cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk\" (UID: \"908bd97a-6313-4646-ab52-90bb7ffefdaa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.903252 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.903132 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbtnh\" (UniqueName: \"kubernetes.io/projected/908bd97a-6313-4646-ab52-90bb7ffefdaa-kube-api-access-nbtnh\") pod \"cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk\" (UID: \"908bd97a-6313-4646-ab52-90bb7ffefdaa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.903252 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.903177 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert\") pod \"ingress-canary-d5s8x\" (UID: \"82357e1f-f9a8-4cf7-b3dd-fe77912c49a1\") " pod="openshift-ingress-canary/ingress-canary-d5s8x" Apr 20 22:24:20.903252 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.903209 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ddc6811c-1bae-4d19-8bc7-a04e9df30ebf-klusterlet-config\") pod \"klusterlet-addon-workmgr-7b966f6456-6r7jb\" (UID: \"ddc6811c-1bae-4d19-8bc7-a04e9df30ebf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b966f6456-6r7jb" Apr 20 22:24:20.903646 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.903602 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e4ed632e-0c77-4b80-b076-66bdfd17da84-tmp-dir\") pod \"dns-default-fj4gp\" (UID: \"e4ed632e-0c77-4b80-b076-66bdfd17da84\") " pod="openshift-dns/dns-default-fj4gp" Apr 20 22:24:20.903758 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.903738 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e4ed632e-0c77-4b80-b076-66bdfd17da84-config-volume\") pod \"dns-default-fj4gp\" (UID: \"e4ed632e-0c77-4b80-b076-66bdfd17da84\") " pod="openshift-dns/dns-default-fj4gp" Apr 20 22:24:20.903827 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.903775 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/908bd97a-6313-4646-ab52-90bb7ffefdaa-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk\" (UID: \"908bd97a-6313-4646-ab52-90bb7ffefdaa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.903896 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.903886 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 22:24:20.903958 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:20.903935 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert podName:82357e1f-f9a8-4cf7-b3dd-fe77912c49a1 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:21.403919003 +0000 UTC m=+34.356873611 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert") pod "ingress-canary-d5s8x" (UID: "82357e1f-f9a8-4cf7-b3dd-fe77912c49a1") : secret "canary-serving-cert" not found Apr 20 22:24:20.904467 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.904414 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddc6811c-1bae-4d19-8bc7-a04e9df30ebf-tmp\") pod \"klusterlet-addon-workmgr-7b966f6456-6r7jb\" (UID: \"ddc6811c-1bae-4d19-8bc7-a04e9df30ebf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b966f6456-6r7jb" Apr 20 22:24:20.905527 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.905206 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g56dv" Apr 20 22:24:20.906450 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.906408 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/908bd97a-6313-4646-ab52-90bb7ffefdaa-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk\" (UID: \"908bd97a-6313-4646-ab52-90bb7ffefdaa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.906558 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.906520 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ddc6811c-1bae-4d19-8bc7-a04e9df30ebf-klusterlet-config\") pod \"klusterlet-addon-workmgr-7b966f6456-6r7jb\" (UID: \"ddc6811c-1bae-4d19-8bc7-a04e9df30ebf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b966f6456-6r7jb" Apr 20 22:24:20.906724 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.906695 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/908bd97a-6313-4646-ab52-90bb7ffefdaa-ca\") pod \"cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk\" (UID: \"908bd97a-6313-4646-ab52-90bb7ffefdaa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.906963 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.906940 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/908bd97a-6313-4646-ab52-90bb7ffefdaa-hub\") pod \"cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk\" (UID: \"908bd97a-6313-4646-ab52-90bb7ffefdaa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.907048 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.907037 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/908bd97a-6313-4646-ab52-90bb7ffefdaa-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk\" (UID: \"908bd97a-6313-4646-ab52-90bb7ffefdaa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.913979 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.913929 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftz7n\" (UniqueName: \"kubernetes.io/projected/e4ed632e-0c77-4b80-b076-66bdfd17da84-kube-api-access-ftz7n\") pod \"dns-default-fj4gp\" (UID: \"e4ed632e-0c77-4b80-b076-66bdfd17da84\") " pod="openshift-dns/dns-default-fj4gp" Apr 20 22:24:20.913979 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.913929 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsd94\" (UniqueName: \"kubernetes.io/projected/ddc6811c-1bae-4d19-8bc7-a04e9df30ebf-kube-api-access-wsd94\") pod \"klusterlet-addon-workmgr-7b966f6456-6r7jb\" (UID: \"ddc6811c-1bae-4d19-8bc7-a04e9df30ebf\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b966f6456-6r7jb" Apr 20 22:24:20.914656 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.914639 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrpp9\" (UniqueName: \"kubernetes.io/projected/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-kube-api-access-hrpp9\") pod \"ingress-canary-d5s8x\" (UID: \"82357e1f-f9a8-4cf7-b3dd-fe77912c49a1\") " 
pod="openshift-ingress-canary/ingress-canary-d5s8x" Apr 20 22:24:20.914752 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.914734 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbtnh\" (UniqueName: \"kubernetes.io/projected/908bd97a-6313-4646-ab52-90bb7ffefdaa-kube-api-access-nbtnh\") pod \"cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk\" (UID: \"908bd97a-6313-4646-ab52-90bb7ffefdaa\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.926713 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.926690 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x" Apr 20 22:24:20.945453 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.945431 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:24:20.972886 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:20.972839 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b966f6456-6r7jb" Apr 20 22:24:21.206405 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.206319 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:21.206571 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:21.206478 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 22:24:21.206571 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:21.206499 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c69c58687-c7dk5: secret "image-registry-tls" not found Apr 20 22:24:21.206667 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:21.206593 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls podName:e184abd3-491a-42d4-baec-feffd1648520 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:22.206572395 +0000 UTC m=+35.159526993 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls") pod "image-registry-5c69c58687-c7dk5" (UID: "e184abd3-491a-42d4-baec-feffd1648520") : secret "image-registry-tls" not found Apr 20 22:24:21.206797 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.206768 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lxvk8\" (UID: \"c06378d7-946b-49c3-ac21-44605e27cdd5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8" Apr 20 22:24:21.206934 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.206821 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-98hzg\" (UID: \"3a9bfbef-0d72-430e-b23e-bb50623a7093\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg" Apr 20 22:24:21.206934 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:21.206922 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 22:24:21.207052 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:21.206971 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 22:24:21.207052 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:21.206992 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls podName:c06378d7-946b-49c3-ac21-44605e27cdd5 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:22.206971656 +0000 UTC m=+35.159926256 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lxvk8" (UID: "c06378d7-946b-49c3-ac21-44605e27cdd5") : secret "cluster-monitoring-operator-tls" not found Apr 20 22:24:21.207052 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:21.207031 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls podName:3a9bfbef-0d72-430e-b23e-bb50623a7093 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:22.207017057 +0000 UTC m=+35.159971659 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-98hzg" (UID: "3a9bfbef-0d72-430e-b23e-bb50623a7093") : secret "samples-operator-tls" not found Apr 20 22:24:21.307652 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.307614 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:21.307874 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.307686 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:21.307874 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.307778 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:21.307874 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:21.307788 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 22:24:21.307874 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:21.307841 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 22:24:21.307874 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:21.307872 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs podName:7c99a639-1f48-429a-a14e-800ce227becb nodeName:}" failed. No retries permitted until 2026-04-20 22:24:22.307836667 +0000 UTC m=+35.260791287 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs") pod "router-default-84455b6c98-44svx" (UID: "7c99a639-1f48-429a-a14e-800ce227becb") : secret "router-metrics-certs-default" not found Apr 20 22:24:21.307874 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:21.307875 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-649dd96b44-r7c7b: secret "image-registry-tls" not found Apr 20 22:24:21.308115 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:21.307930 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls podName:bf5aed09-1fd8-4294-aef5-ee13e17b2bf3 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:22.307914783 +0000 UTC m=+35.260869383 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls") pod "image-registry-649dd96b44-r7c7b" (UID: "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3") : secret "image-registry-tls" not found Apr 20 22:24:21.308115 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:21.307951 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle podName:7c99a639-1f48-429a-a14e-800ce227becb nodeName:}" failed. No retries permitted until 2026-04-20 22:24:22.307942819 +0000 UTC m=+35.260897439 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle") pod "router-default-84455b6c98-44svx" (UID: "7c99a639-1f48-429a-a14e-800ce227becb") : configmap references non-existent config key: service-ca.crt Apr 20 22:24:21.408911 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.408872 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert\") pod \"ingress-canary-d5s8x\" (UID: \"82357e1f-f9a8-4cf7-b3dd-fe77912c49a1\") " pod="openshift-ingress-canary/ingress-canary-d5s8x" Apr 20 22:24:21.409471 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:21.409032 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 22:24:21.409471 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.409038 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls\") pod \"dns-default-fj4gp\" (UID: \"e4ed632e-0c77-4b80-b076-66bdfd17da84\") " pod="openshift-dns/dns-default-fj4gp" Apr 20 22:24:21.409471 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:21.409099 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert podName:82357e1f-f9a8-4cf7-b3dd-fe77912c49a1 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:22.409080251 +0000 UTC m=+35.362034868 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert") pod "ingress-canary-d5s8x" (UID: "82357e1f-f9a8-4cf7-b3dd-fe77912c49a1") : secret "canary-serving-cert" not found Apr 20 22:24:21.409471 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:21.409113 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 22:24:21.409471 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:21.409163 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls podName:e4ed632e-0c77-4b80-b076-66bdfd17da84 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:22.409149354 +0000 UTC m=+35.362103953 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls") pod "dns-default-fj4gp" (UID: "e4ed632e-0c77-4b80-b076-66bdfd17da84") : secret "dns-default-metrics-tls" not found Apr 20 22:24:21.688776 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.688530 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fzjvb"] Apr 20 22:24:21.691577 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:21.691545 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea02661d_e4a4_469a_9451_7f11a7db90d2.slice/crio-570251403f302fc8474391f02b664db48a0cc31517ca777bd6a0691d81870e39 WatchSource:0}: Error finding container 570251403f302fc8474391f02b664db48a0cc31517ca777bd6a0691d81870e39: Status 404 returned error can't find the container with id 570251403f302fc8474391f02b664db48a0cc31517ca777bd6a0691d81870e39 Apr 20 22:24:21.700026 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.699987 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rfgkr"] Apr 20 22:24:21.702331 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.701812 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-xnj54"] Apr 20 22:24:21.702331 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:21.702286 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c475df6_d751_4f10_81c7_a1e56dec9176.slice/crio-b1826b87827dd550740eb685a83f4cd397753673e5dcf404c6b064a7460d7bcf WatchSource:0}: Error finding container b1826b87827dd550740eb685a83f4cd397753673e5dcf404c6b064a7460d7bcf: Status 404 returned error can't find the container with id b1826b87827dd550740eb685a83f4cd397753673e5dcf404c6b064a7460d7bcf Apr 20 22:24:21.703160 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.703127 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b966f6456-6r7jb"] Apr 20 22:24:21.704295 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:21.704022 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c73411b_82b9_42c4_bbc3_edc35e00606d.slice/crio-cf7d89f584fcf51412e9301eb40de5441b963e350062cbf61e6ccbd711938dc2 WatchSource:0}: Error finding container cf7d89f584fcf51412e9301eb40de5441b963e350062cbf61e6ccbd711938dc2: Status 404 returned error can't find the container with id cf7d89f584fcf51412e9301eb40de5441b963e350062cbf61e6ccbd711938dc2 Apr 20 22:24:21.704565 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.704446 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk"] Apr 20 22:24:21.707402 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:21.707382 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod908bd97a_6313_4646_ab52_90bb7ffefdaa.slice/crio-357dd4183de47cf2e7dccd5c403f5b7e04d9b3ca758d17a8a938a38e5246c23f WatchSource:0}: Error finding container 357dd4183de47cf2e7dccd5c403f5b7e04d9b3ca758d17a8a938a38e5246c23f: Status 404 returned error can't find the container with id 
357dd4183de47cf2e7dccd5c403f5b7e04d9b3ca758d17a8a938a38e5246c23f Apr 20 22:24:21.708270 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:21.708215 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddc6811c_1bae_4d19_8bc7_a04e9df30ebf.slice/crio-752b92e1f4eb3a1aaedd696a4452cea6f6e23ac997acfb9c1180128fff465c4d WatchSource:0}: Error finding container 752b92e1f4eb3a1aaedd696a4452cea6f6e23ac997acfb9c1180128fff465c4d: Status 404 returned error can't find the container with id 752b92e1f4eb3a1aaedd696a4452cea6f6e23ac997acfb9c1180128fff465c4d Apr 20 22:24:21.716266 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.716248 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-g56dv"] Apr 20 22:24:21.718704 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:21.718673 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06591b6d_348e_44ef_8a2d_4112e4d70e60.slice/crio-8fa508bcc30fcd2c2f70898c5ef98b7a3f25bd82ffed1bccbc3ecf4c8db18754 WatchSource:0}: Error finding container 8fa508bcc30fcd2c2f70898c5ef98b7a3f25bd82ffed1bccbc3ecf4c8db18754: Status 404 returned error can't find the container with id 8fa508bcc30fcd2c2f70898c5ef98b7a3f25bd82ffed1bccbc3ecf4c8db18754 Apr 20 22:24:21.835888 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.835836 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b966f6456-6r7jb" event={"ID":"ddc6811c-1bae-4d19-8bc7-a04e9df30ebf","Type":"ContainerStarted","Data":"752b92e1f4eb3a1aaedd696a4452cea6f6e23ac997acfb9c1180128fff465c4d"} Apr 20 22:24:21.837146 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.837119 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fzjvb" event={"ID":"ea02661d-e4a4-469a-9451-7f11a7db90d2","Type":"ContainerStarted","Data":"570251403f302fc8474391f02b664db48a0cc31517ca777bd6a0691d81870e39"} Apr 20 22:24:21.838976 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.838941 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rfgkr" event={"ID":"7c73411b-82b9-42c4-bbc3-edc35e00606d","Type":"ContainerStarted","Data":"cf7d89f584fcf51412e9301eb40de5441b963e350062cbf61e6ccbd711938dc2"} Apr 20 22:24:21.839972 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.839954 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" event={"ID":"4c475df6-d751-4f10-81c7-a1e56dec9176","Type":"ContainerStarted","Data":"b1826b87827dd550740eb685a83f4cd397753673e5dcf404c6b064a7460d7bcf"} Apr 20 22:24:21.840861 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.840824 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g56dv" event={"ID":"06591b6d-348e-44ef-8a2d-4112e4d70e60","Type":"ContainerStarted","Data":"8fa508bcc30fcd2c2f70898c5ef98b7a3f25bd82ffed1bccbc3ecf4c8db18754"} Apr 20 22:24:21.841704 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.841688 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" 
event={"ID":"908bd97a-6313-4646-ab52-90bb7ffefdaa","Type":"ContainerStarted","Data":"357dd4183de47cf2e7dccd5c403f5b7e04d9b3ca758d17a8a938a38e5246c23f"} Apr 20 22:24:21.900020 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.899991 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-7n7hh"] Apr 20 22:24:21.903478 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.903448 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x"] Apr 20 22:24:21.904787 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:21.904764 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dmsjq"] Apr 20 22:24:21.908172 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:21.908144 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66ac7746_d497_453d_a7c7_08406f8f7baa.slice/crio-74a3e63b5a0744323ae4432390ed180ccf6eeb34338fcf5a392f27d88a9a7cbc WatchSource:0}: Error finding container 74a3e63b5a0744323ae4432390ed180ccf6eeb34338fcf5a392f27d88a9a7cbc: Status 404 returned error can't find the container with id 74a3e63b5a0744323ae4432390ed180ccf6eeb34338fcf5a392f27d88a9a7cbc Apr 20 22:24:21.909497 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:21.909444 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb95d71c_3b6d_407a_9ff3_a70562af1b93.slice/crio-cea2a1461779a978275eb41f154b9659bd561fcdbf2dd5809b6f682b46c7ba08 WatchSource:0}: Error finding container cea2a1461779a978275eb41f154b9659bd561fcdbf2dd5809b6f682b46c7ba08: Status 404 returned error can't find the container with id cea2a1461779a978275eb41f154b9659bd561fcdbf2dd5809b6f682b46c7ba08 Apr 20 22:24:21.910478 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:21.910440 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeeb4d8a0_f553_47b6_8134_40d74089fd72.slice/crio-f70a7f505b491f9c9633fb731294562ec39741daac90342a7aea357ed15ef1b1 WatchSource:0}: Error finding container f70a7f505b491f9c9633fb731294562ec39741daac90342a7aea357ed15ef1b1: Status 404 returned error can't find the container with id f70a7f505b491f9c9633fb731294562ec39741daac90342a7aea357ed15ef1b1 Apr 20 22:24:22.218522 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:22.218441 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lxvk8\" (UID: \"c06378d7-946b-49c3-ac21-44605e27cdd5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8" Apr 20 22:24:22.218522 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:22.218495 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-98hzg\" (UID: \"3a9bfbef-0d72-430e-b23e-bb50623a7093\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg" Apr 20 22:24:22.218742 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:22.218552 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:22.218742 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:22.218596 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 22:24:22.218742 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:22.218672 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls podName:c06378d7-946b-49c3-ac21-44605e27cdd5 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:24.218649678 +0000 UTC m=+37.171604278 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lxvk8" (UID: "c06378d7-946b-49c3-ac21-44605e27cdd5") : secret "cluster-monitoring-operator-tls" not found Apr 20 22:24:22.218742 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:22.218739 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 22:24:22.218963 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:22.218753 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c69c58687-c7dk5: secret "image-registry-tls" not found Apr 20 22:24:22.218963 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:22.218802 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls podName:e184abd3-491a-42d4-baec-feffd1648520 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:24.218786113 +0000 UTC m=+37.171740713 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls") pod "image-registry-5c69c58687-c7dk5" (UID: "e184abd3-491a-42d4-baec-feffd1648520") : secret "image-registry-tls" not found Apr 20 22:24:22.218963 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:22.218891 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 22:24:22.218963 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:22.218928 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls podName:3a9bfbef-0d72-430e-b23e-bb50623a7093 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:24.218916832 +0000 UTC m=+37.171871435 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-98hzg" (UID: "3a9bfbef-0d72-430e-b23e-bb50623a7093") : secret "samples-operator-tls" not found
Apr 20 22:24:22.319530 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:22.319391 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx"
Apr 20 22:24:22.319530 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:22.319479 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b"
Apr 20 22:24:22.319530 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:22.319513 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 22:24:22.319808 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:22.319560 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx"
Apr 20 22:24:22.319808 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:22.319585 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs podName:7c99a639-1f48-429a-a14e-800ce227becb nodeName:}" failed. No retries permitted until 2026-04-20 22:24:24.319565279 +0000 UTC m=+37.272519880 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs") pod "router-default-84455b6c98-44svx" (UID: "7c99a639-1f48-429a-a14e-800ce227becb") : secret "router-metrics-certs-default" not found
Apr 20 22:24:22.319808 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:22.319658 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle podName:7c99a639-1f48-429a-a14e-800ce227becb nodeName:}" failed. No retries permitted until 2026-04-20 22:24:24.319643319 +0000 UTC m=+37.272597916 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle") pod "router-default-84455b6c98-44svx" (UID: "7c99a639-1f48-429a-a14e-800ce227becb") : configmap references non-existent config key: service-ca.crt
Apr 20 22:24:22.319808 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:22.319777 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 22:24:22.319808 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:22.319790 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-649dd96b44-r7c7b: secret "image-registry-tls" not found
Apr 20 22:24:22.320030 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:22.319824 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls podName:bf5aed09-1fd8-4294-aef5-ee13e17b2bf3 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:24.319813222 +0000 UTC m=+37.272767823 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls") pod "image-registry-649dd96b44-r7c7b" (UID: "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3") : secret "image-registry-tls" not found
Apr 20 22:24:22.421410 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:22.420645 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert\") pod \"ingress-canary-d5s8x\" (UID: \"82357e1f-f9a8-4cf7-b3dd-fe77912c49a1\") " pod="openshift-ingress-canary/ingress-canary-d5s8x"
Apr 20 22:24:22.421410 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:22.420725 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls\") pod \"dns-default-fj4gp\" (UID: \"e4ed632e-0c77-4b80-b076-66bdfd17da84\") " pod="openshift-dns/dns-default-fj4gp"
Apr 20 22:24:22.421410 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:22.420955 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 22:24:22.421410 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:22.421019 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls podName:e4ed632e-0c77-4b80-b076-66bdfd17da84 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:24.420999871 +0000 UTC m=+37.373954470 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls") pod "dns-default-fj4gp" (UID: "e4ed632e-0c77-4b80-b076-66bdfd17da84") : secret "dns-default-metrics-tls" not found
Apr 20 22:24:22.421410 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:22.421278 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 22:24:22.421410 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:22.421340 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert podName:82357e1f-f9a8-4cf7-b3dd-fe77912c49a1 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:24.421321829 +0000 UTC m=+37.374276444 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert") pod "ingress-canary-d5s8x" (UID: "82357e1f-f9a8-4cf7-b3dd-fe77912c49a1") : secret "canary-serving-cert" not found
Apr 20 22:24:22.849388 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:22.849319 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x" event={"ID":"66ac7746-d497-453d-a7c7-08406f8f7baa","Type":"ContainerStarted","Data":"74a3e63b5a0744323ae4432390ed180ccf6eeb34338fcf5a392f27d88a9a7cbc"}
Apr 20 22:24:22.859016 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:22.858940 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-7n7hh" event={"ID":"bb95d71c-3b6d-407a-9ff3-a70562af1b93","Type":"ContainerStarted","Data":"cea2a1461779a978275eb41f154b9659bd561fcdbf2dd5809b6f682b46c7ba08"}
Apr 20 22:24:22.861363 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:22.861277 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dmsjq" event={"ID":"eeb4d8a0-f553-47b6-8134-40d74089fd72","Type":"ContainerStarted","Data":"f70a7f505b491f9c9633fb731294562ec39741daac90342a7aea357ed15ef1b1"}
Apr 20 22:24:22.870385 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:22.870354 2573 generic.go:358] "Generic (PLEG): container finished" podID="d7ecb730-4be8-4cc2-86d1-47a71c9e25e7" containerID="39859b6a00457ce1eaef9e42e71547841e4422f74519feee7671d12d39e90ead" exitCode=0
Apr 20 22:24:22.870502 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:22.870411 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvjgp" event={"ID":"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7","Type":"ContainerDied","Data":"39859b6a00457ce1eaef9e42e71547841e4422f74519feee7671d12d39e90ead"}
Apr 20 22:24:23.908406 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:23.908147 2573 generic.go:358] "Generic (PLEG): container finished" podID="d7ecb730-4be8-4cc2-86d1-47a71c9e25e7" containerID="02217956c9000e533fb2e645dfee407258f9d175693b581b31ccd7a6f537ece9" exitCode=0
Apr 20 22:24:23.908896 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:23.908300 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvjgp" event={"ID":"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7","Type":"ContainerDied","Data":"02217956c9000e533fb2e645dfee407258f9d175693b581b31ccd7a6f537ece9"}
Apr 20 22:24:24.245722 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:24.245661 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lxvk8\" (UID: \"c06378d7-946b-49c3-ac21-44605e27cdd5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8"
Apr 20 22:24:24.245722 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:24.245721 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-98hzg\" (UID: \"3a9bfbef-0d72-430e-b23e-bb50623a7093\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg"
Apr 20 22:24:24.245965 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:24.245780 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5"
Apr 20 22:24:24.246023 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:24.245965 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 22:24:24.246023 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:24.245995 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 22:24:24.246115 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:24.246032 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls podName:c06378d7-946b-49c3-ac21-44605e27cdd5 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:28.246011346 +0000 UTC m=+41.198965946 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lxvk8" (UID: "c06378d7-946b-49c3-ac21-44605e27cdd5") : secret "cluster-monitoring-operator-tls" not found
Apr 20 22:24:24.246115 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:24.246060 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls podName:3a9bfbef-0d72-430e-b23e-bb50623a7093 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:28.246041904 +0000 UTC m=+41.198996503 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-98hzg" (UID: "3a9bfbef-0d72-430e-b23e-bb50623a7093") : secret "samples-operator-tls" not found
Apr 20 22:24:24.247346 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:24.247257 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 22:24:24.247346 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:24.247277 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c69c58687-c7dk5: secret "image-registry-tls" not found
Apr 20 22:24:24.247346 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:24.247327 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls podName:e184abd3-491a-42d4-baec-feffd1648520 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:28.247311429 +0000 UTC m=+41.200266028 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls") pod "image-registry-5c69c58687-c7dk5" (UID: "e184abd3-491a-42d4-baec-feffd1648520") : secret "image-registry-tls" not found
Apr 20 22:24:24.347944 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:24.346954 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx"
Apr 20 22:24:24.347944 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:24.347050 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b"
Apr 20 22:24:24.347944 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:24.347145 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx"
Apr 20 22:24:24.347944 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:24.347328 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle podName:7c99a639-1f48-429a-a14e-800ce227becb nodeName:}" failed. No retries permitted until 2026-04-20 22:24:28.347308217 +0000 UTC m=+41.300262817 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle") pod "router-default-84455b6c98-44svx" (UID: "7c99a639-1f48-429a-a14e-800ce227becb") : configmap references non-existent config key: service-ca.crt
Apr 20 22:24:24.347944 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:24.347736 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 22:24:24.347944 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:24.347788 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs podName:7c99a639-1f48-429a-a14e-800ce227becb nodeName:}" failed. No retries permitted until 2026-04-20 22:24:28.347771941 +0000 UTC m=+41.300726538 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs") pod "router-default-84455b6c98-44svx" (UID: "7c99a639-1f48-429a-a14e-800ce227becb") : secret "router-metrics-certs-default" not found
Apr 20 22:24:24.347944 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:24.347866 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 22:24:24.347944 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:24.347878 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-649dd96b44-r7c7b: secret "image-registry-tls" not found
Apr 20 22:24:24.347944 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:24.347910 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls podName:bf5aed09-1fd8-4294-aef5-ee13e17b2bf3 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:28.347899657 +0000 UTC m=+41.300854260 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls") pod "image-registry-649dd96b44-r7c7b" (UID: "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3") : secret "image-registry-tls" not found
Apr 20 22:24:24.449301 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:24.448253 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls\") pod \"dns-default-fj4gp\" (UID: \"e4ed632e-0c77-4b80-b076-66bdfd17da84\") " pod="openshift-dns/dns-default-fj4gp"
Apr 20 22:24:24.449301 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:24.448529 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert\") pod \"ingress-canary-d5s8x\" (UID: \"82357e1f-f9a8-4cf7-b3dd-fe77912c49a1\") " pod="openshift-ingress-canary/ingress-canary-d5s8x"
Apr 20 22:24:24.449301 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:24.448697 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 22:24:24.449301 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:24.448759 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert podName:82357e1f-f9a8-4cf7-b3dd-fe77912c49a1 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:28.448741054 +0000 UTC m=+41.401695653 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert") pod "ingress-canary-d5s8x" (UID: "82357e1f-f9a8-4cf7-b3dd-fe77912c49a1") : secret "canary-serving-cert" not found
Apr 20 22:24:24.449301 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:24.449193 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 22:24:24.449301 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:24.449265 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls podName:e4ed632e-0c77-4b80-b076-66bdfd17da84 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:28.449250471 +0000 UTC m=+41.402205088 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls") pod "dns-default-fj4gp" (UID: "e4ed632e-0c77-4b80-b076-66bdfd17da84") : secret "dns-default-metrics-tls" not found
Apr 20 22:24:25.392716 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:25.391562 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s"]
Apr 20 22:24:25.413101 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:25.411988 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s"]
Apr 20 22:24:25.413101 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:25.412126 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s"
Apr 20 22:24:25.416332 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:25.415677 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-g99rh\""
Apr 20 22:24:25.416332 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:25.415937 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 20 22:24:25.416332 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:25.416133 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 20 22:24:25.460183 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:25.460015 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4fd8s\" (UID: \"51e93c8f-5d21-4964-8b01-3ddb5f2e5c86\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s"
Apr 20 22:24:25.460183 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:25.460071 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-4fd8s\" (UID: \"51e93c8f-5d21-4964-8b01-3ddb5f2e5c86\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s"
Apr 20 22:24:25.561806 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:25.560768 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4fd8s\" (UID: \"51e93c8f-5d21-4964-8b01-3ddb5f2e5c86\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s"
Apr 20 22:24:25.561806 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:25.560824 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-4fd8s\" (UID: \"51e93c8f-5d21-4964-8b01-3ddb5f2e5c86\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s"
Apr 20 22:24:25.561806 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:25.561742 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-4fd8s\" (UID: \"51e93c8f-5d21-4964-8b01-3ddb5f2e5c86\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s"
Apr 20 22:24:25.561806 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:25.561770 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 22:24:25.562127 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:25.561843 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert podName:51e93c8f-5d21-4964-8b01-3ddb5f2e5c86 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:26.061823926 +0000 UTC m=+39.014778526 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4fd8s" (UID: "51e93c8f-5d21-4964-8b01-3ddb5f2e5c86") : secret "networking-console-plugin-cert" not found
Apr 20 22:24:26.066473 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:26.066428 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4fd8s\" (UID: \"51e93c8f-5d21-4964-8b01-3ddb5f2e5c86\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s"
Apr 20 22:24:26.066690 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:26.066594 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 22:24:26.066690 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:26.066671 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert podName:51e93c8f-5d21-4964-8b01-3ddb5f2e5c86 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:27.066651585 +0000 UTC m=+40.019606197 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4fd8s" (UID: "51e93c8f-5d21-4964-8b01-3ddb5f2e5c86") : secret "networking-console-plugin-cert" not found
Apr 20 22:24:27.075258 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:27.075213 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4fd8s\" (UID: \"51e93c8f-5d21-4964-8b01-3ddb5f2e5c86\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s"
Apr 20 22:24:27.075818 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:27.075355 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 22:24:27.075818 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:27.075432 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert podName:51e93c8f-5d21-4964-8b01-3ddb5f2e5c86 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:29.075411224 +0000 UTC m=+42.028365829 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4fd8s" (UID: "51e93c8f-5d21-4964-8b01-3ddb5f2e5c86") : secret "networking-console-plugin-cert" not found
Apr 20 22:24:28.287526 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:28.287475 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lxvk8\" (UID: \"c06378d7-946b-49c3-ac21-44605e27cdd5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8"
Apr 20 22:24:28.287526 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:28.287533 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-98hzg\" (UID: \"3a9bfbef-0d72-430e-b23e-bb50623a7093\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg"
Apr 20 22:24:28.288075 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:28.287585 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5"
Apr 20 22:24:28.288075 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:28.287643 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 22:24:28.288075 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:28.287717 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls podName:c06378d7-946b-49c3-ac21-44605e27cdd5 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:36.287698375 +0000 UTC m=+49.240652971 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lxvk8" (UID: "c06378d7-946b-49c3-ac21-44605e27cdd5") : secret "cluster-monitoring-operator-tls" not found
Apr 20 22:24:28.288075 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:28.287724 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 22:24:28.288075 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:28.287739 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c69c58687-c7dk5: secret "image-registry-tls" not found
Apr 20 22:24:28.288075 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:28.287790 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls podName:e184abd3-491a-42d4-baec-feffd1648520 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:36.287774059 +0000 UTC m=+49.240728658 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls") pod "image-registry-5c69c58687-c7dk5" (UID: "e184abd3-491a-42d4-baec-feffd1648520") : secret "image-registry-tls" not found
Apr 20 22:24:28.288075 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:28.287714 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 22:24:28.288075 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:28.287841 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls podName:3a9bfbef-0d72-430e-b23e-bb50623a7093 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:36.287825678 +0000 UTC m=+49.240780289 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-98hzg" (UID: "3a9bfbef-0d72-430e-b23e-bb50623a7093") : secret "samples-operator-tls" not found
Apr 20 22:24:28.388526 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:28.388494 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b"
Apr 20 22:24:28.388718 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:28.388600 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx"
Apr 20 22:24:28.388718 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:28.388667 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 22:24:28.388718 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:28.388691 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-649dd96b44-r7c7b: secret "image-registry-tls" not found
Apr 20 22:24:28.388718 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:28.388698 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx"
Apr 20 22:24:28.388946 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:28.388756 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls podName:bf5aed09-1fd8-4294-aef5-ee13e17b2bf3 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:36.388733583 +0000 UTC m=+49.341688191 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls") pod "image-registry-649dd96b44-r7c7b" (UID: "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3") : secret "image-registry-tls" not found
Apr 20 22:24:28.388946 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:28.388777 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle podName:7c99a639-1f48-429a-a14e-800ce227becb nodeName:}" failed. No retries permitted until 2026-04-20 22:24:36.388764457 +0000 UTC m=+49.341719059 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle") pod "router-default-84455b6c98-44svx" (UID: "7c99a639-1f48-429a-a14e-800ce227becb") : configmap references non-existent config key: service-ca.crt
Apr 20 22:24:28.388946 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:28.388819 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 22:24:28.388946 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:28.388877 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs podName:7c99a639-1f48-429a-a14e-800ce227becb nodeName:}" failed. No retries permitted until 2026-04-20 22:24:36.388863756 +0000 UTC m=+49.341818367 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs") pod "router-default-84455b6c98-44svx" (UID: "7c99a639-1f48-429a-a14e-800ce227becb") : secret "router-metrics-certs-default" not found
Apr 20 22:24:28.489625 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:28.489596 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert\") pod \"ingress-canary-d5s8x\" (UID: \"82357e1f-f9a8-4cf7-b3dd-fe77912c49a1\") " pod="openshift-ingress-canary/ingress-canary-d5s8x"
Apr 20 22:24:28.489806 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:28.489657 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls\") pod \"dns-default-fj4gp\" (UID: \"e4ed632e-0c77-4b80-b076-66bdfd17da84\") " pod="openshift-dns/dns-default-fj4gp"
Apr 20 22:24:28.489806 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:28.489752 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 22:24:28.489806 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:28.489761 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 22:24:28.489986 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:28.489812 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls podName:e4ed632e-0c77-4b80-b076-66bdfd17da84 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:36.489799187 +0000 UTC m=+49.442753789 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls") pod "dns-default-fj4gp" (UID: "e4ed632e-0c77-4b80-b076-66bdfd17da84") : secret "dns-default-metrics-tls" not found
Apr 20 22:24:28.489986 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:28.489829 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert podName:82357e1f-f9a8-4cf7-b3dd-fe77912c49a1 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:36.489821561 +0000 UTC m=+49.442776157 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert") pod "ingress-canary-d5s8x" (UID: "82357e1f-f9a8-4cf7-b3dd-fe77912c49a1") : secret "canary-serving-cert" not found
Apr 20 22:24:29.097509 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:29.097462 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4fd8s\" (UID: \"51e93c8f-5d21-4964-8b01-3ddb5f2e5c86\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s"
Apr 20 22:24:29.097696 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:29.097644 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 22:24:29.097768 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:29.097729 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert podName:51e93c8f-5d21-4964-8b01-3ddb5f2e5c86 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:33.097710629 +0000 UTC m=+46.050665229 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4fd8s" (UID: "51e93c8f-5d21-4964-8b01-3ddb5f2e5c86") : secret "networking-console-plugin-cert" not found
Apr 20 22:24:31.402706 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:31.402675 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-5m4dx"]
Apr 20 22:24:31.449908 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:31.449874 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5m4dx"]
Apr 20 22:24:31.450080 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:31.449984 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5m4dx"
Apr 20 22:24:31.452514 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:31.452490 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 22:24:31.518126 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:31.518054 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/29a8ab11-5eff-49a0-b910-dbb219ffd462-kubelet-config\") pod \"global-pull-secret-syncer-5m4dx\" (UID: \"29a8ab11-5eff-49a0-b910-dbb219ffd462\") " pod="kube-system/global-pull-secret-syncer-5m4dx"
Apr 20 22:24:31.518290 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:31.518203 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/29a8ab11-5eff-49a0-b910-dbb219ffd462-dbus\") pod \"global-pull-secret-syncer-5m4dx\" (UID: \"29a8ab11-5eff-49a0-b910-dbb219ffd462\") " pod="kube-system/global-pull-secret-syncer-5m4dx"
Apr 20 22:24:31.518290 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:31.518227 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/29a8ab11-5eff-49a0-b910-dbb219ffd462-original-pull-secret\") pod \"global-pull-secret-syncer-5m4dx\" (UID: \"29a8ab11-5eff-49a0-b910-dbb219ffd462\") " pod="kube-system/global-pull-secret-syncer-5m4dx"
Apr 20 22:24:31.619187 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:31.619146 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/29a8ab11-5eff-49a0-b910-dbb219ffd462-kubelet-config\") pod \"global-pull-secret-syncer-5m4dx\" (UID: \"29a8ab11-5eff-49a0-b910-dbb219ffd462\") " pod="kube-system/global-pull-secret-syncer-5m4dx"
Apr 20 22:24:31.619377 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:31.619290 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/29a8ab11-5eff-49a0-b910-dbb219ffd462-kubelet-config\") pod \"global-pull-secret-syncer-5m4dx\" (UID: \"29a8ab11-5eff-49a0-b910-dbb219ffd462\") " pod="kube-system/global-pull-secret-syncer-5m4dx"
Apr 20 22:24:31.619377 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:31.619313 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/29a8ab11-5eff-49a0-b910-dbb219ffd462-dbus\") pod \"global-pull-secret-syncer-5m4dx\" (UID: \"29a8ab11-5eff-49a0-b910-dbb219ffd462\") " pod="kube-system/global-pull-secret-syncer-5m4dx"
Apr 20 22:24:31.619377 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:31.619342 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/29a8ab11-5eff-49a0-b910-dbb219ffd462-original-pull-secret\") pod \"global-pull-secret-syncer-5m4dx\" (UID: \"29a8ab11-5eff-49a0-b910-dbb219ffd462\") " pod="kube-system/global-pull-secret-syncer-5m4dx"
Apr 20 22:24:31.619544 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:31.619497 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/29a8ab11-5eff-49a0-b910-dbb219ffd462-dbus\") pod \"global-pull-secret-syncer-5m4dx\" (UID: \"29a8ab11-5eff-49a0-b910-dbb219ffd462\") " pod="kube-system/global-pull-secret-syncer-5m4dx"
Apr 20 22:24:31.623462 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:31.623441 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/29a8ab11-5eff-49a0-b910-dbb219ffd462-original-pull-secret\") pod \"global-pull-secret-syncer-5m4dx\" (UID: \"29a8ab11-5eff-49a0-b910-dbb219ffd462\") " pod="kube-system/global-pull-secret-syncer-5m4dx"
Apr 20 22:24:31.760407 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:31.760327 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-5m4dx"
Apr 20 22:24:33.133194 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:33.133153 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4fd8s\" (UID: \"51e93c8f-5d21-4964-8b01-3ddb5f2e5c86\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s"
Apr 20 22:24:33.133681 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:33.133348 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 22:24:33.133681 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:33.133435 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert podName:51e93c8f-5d21-4964-8b01-3ddb5f2e5c86 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:41.133412343 +0000 UTC m=+54.086366966 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4fd8s" (UID: "51e93c8f-5d21-4964-8b01-3ddb5f2e5c86") : secret "networking-console-plugin-cert" not found
Apr 20 22:24:34.317769 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.317725 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-5m4dx"]
Apr 20 22:24:34.939394 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.939352 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b966f6456-6r7jb" event={"ID":"ddc6811c-1bae-4d19-8bc7-a04e9df30ebf","Type":"ContainerStarted","Data":"3d0ffe8a889ea5cbd3fd14c454137ad7de3da1e0391a877985739c4d9277c7a6"}
Apr 20 22:24:34.941693 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.941657 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b966f6456-6r7jb"
Apr 20 22:24:34.942463 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.942445 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b966f6456-6r7jb"
Apr 20 22:24:34.944520 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.944341 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fzjvb" event={"ID":"ea02661d-e4a4-469a-9451-7f11a7db90d2","Type":"ContainerStarted","Data":"1dcb1bc608878af06bdf39147b04d23e584fcbaa9e89fb928b1006811f405819"}
Apr 20 22:24:34.946721 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.946599 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dmsjq" event={"ID":"eeb4d8a0-f553-47b6-8134-40d74089fd72","Type":"ContainerStarted","Data":"0d53cb35592b675d97da657b4c1d2deddd4c8367ece8b4858b2fb5e921d4d269"}
Apr 20 22:24:34.948572 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.948525 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5m4dx" event={"ID":"29a8ab11-5eff-49a0-b910-dbb219ffd462","Type":"ContainerStarted","Data":"0fa6b7e8206a0e775ff024464e562c10e40215240205b418478108f5ff2cb67f"}
Apr 20 22:24:34.955917 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.954268 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qvjgp" event={"ID":"d7ecb730-4be8-4cc2-86d1-47a71c9e25e7","Type":"ContainerStarted","Data":"0f682a0a1b08f0034363841ace387ff29e65f82a41b4dabbacd52732bdd9bb7e"}
Apr 20 22:24:34.957484 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.956940 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b966f6456-6r7jb" podStartSLOduration=28.525584667 podStartE2EDuration="40.95692621s" podCreationTimestamp="2026-04-20 22:23:54 +0000 UTC" firstStartedPulling="2026-04-20 22:24:21.710149116 +0000 UTC m=+34.663103713" lastFinishedPulling="2026-04-20 22:24:34.141490645 +0000 UTC m=+47.094445256" observedRunningTime="2026-04-20 22:24:34.955064501 +0000 UTC m=+47.908019123" watchObservedRunningTime="2026-04-20 22:24:34.95692621 +0000 UTC m=+47.909880830"
Apr 20 22:24:34.958759 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.958381 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rfgkr" event={"ID":"7c73411b-82b9-42c4-bbc3-edc35e00606d","Type":"ContainerStarted","Data":"2889cd48587b057f2cf952e3a1e9b87a85f3187718fb3d5526da589f0cba5f2f"}
Apr 20 22:24:34.963084 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.962702 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xnj54_4c475df6-d751-4f10-81c7-a1e56dec9176/console-operator/0.log"
Apr 20 22:24:34.963084 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.962736 2573 generic.go:358] "Generic (PLEG): container finished" podID="4c475df6-d751-4f10-81c7-a1e56dec9176" containerID="231740ad276b02201a2f6d8f4d10d8250faba5cb6d57b98b7583e7e4559e1ea3" exitCode=255
Apr 20 22:24:34.963084 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.962787 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" event={"ID":"4c475df6-d751-4f10-81c7-a1e56dec9176","Type":"ContainerDied","Data":"231740ad276b02201a2f6d8f4d10d8250faba5cb6d57b98b7583e7e4559e1ea3"}
Apr 20 22:24:34.963084 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.963083 2573 scope.go:117] "RemoveContainer" containerID="231740ad276b02201a2f6d8f4d10d8250faba5cb6d57b98b7583e7e4559e1ea3"
Apr 20 22:24:34.967890 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.967813 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g56dv" event={"ID":"06591b6d-348e-44ef-8a2d-4112e4d70e60","Type":"ContainerStarted","Data":"80c863d511b20b2caea94c4affafcb5daaaad234998778f1c6f4da56f0082fca"}
Apr 20 22:24:34.969784 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.969725 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" event={"ID":"908bd97a-6313-4646-ab52-90bb7ffefdaa","Type":"ContainerStarted","Data":"b2d3eca8d2c9cc2fba53e35e5dd4b2e70fb1cf595e74220bef01c0d979f48f1f"}
Apr 20 22:24:34.971214 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.971170 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x" event={"ID":"66ac7746-d497-453d-a7c7-08406f8f7baa","Type":"ContainerStarted","Data":"18ca08c8145e11b038d98b26db5d0b3e12b07d7c4a77e631515eadeeee5eb860"}
Apr 20 22:24:34.980914 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.979535 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qvjgp" podStartSLOduration=15.166521829 podStartE2EDuration="47.979520881s" podCreationTimestamp="2026-04-20 22:23:47 +0000 UTC" firstStartedPulling="2026-04-20 22:23:48.917768173 +0000 UTC m=+1.870722770" lastFinishedPulling="2026-04-20 22:24:21.730767223 +0000 UTC m=+34.683721822" observedRunningTime="2026-04-20 22:24:34.977102012 +0000 UTC m=+47.930056632" watchObservedRunningTime="2026-04-20 22:24:34.979520881 +0000 UTC m=+47.932475504"
Apr 20 22:24:34.980914 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.979993 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-7n7hh" event={"ID":"bb95d71c-3b6d-407a-9ff3-a70562af1b93","Type":"ContainerStarted","Data":"c08422570a15d84cda8161a375c8edbeed27f5065237660dd51832c38b1228ab"}
Apr 20 22:24:34.999154 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:34.998682 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dmsjq" podStartSLOduration=13.750899783 podStartE2EDuration="25.998665482s" podCreationTimestamp="2026-04-20 22:24:09 +0000 UTC" firstStartedPulling="2026-04-20 22:24:21.912933128 +0000 UTC m=+34.865887724" lastFinishedPulling="2026-04-20 22:24:34.160698813 +0000 UTC m=+47.113653423" observedRunningTime="2026-04-20 22:24:34.996272824 +0000 UTC m=+47.949227444" watchObservedRunningTime="2026-04-20 22:24:34.998665482 +0000 UTC m=+47.951620100"
Apr 20 22:24:35.045044 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:35.040167 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fzjvb" podStartSLOduration=12.59886891 podStartE2EDuration="25.040150264s" podCreationTimestamp="2026-04-20 22:24:10 +0000 UTC" firstStartedPulling="2026-04-20 22:24:21.701462015 +0000 UTC m=+34.654416626" lastFinishedPulling="2026-04-20 22:24:34.142743369 +0000 UTC m=+47.095697980" observedRunningTime="2026-04-20 22:24:35.038266264 +0000 UTC m=+47.991220881" watchObservedRunningTime="2026-04-20 22:24:35.040150264 +0000 UTC m=+47.993104882"
Apr 20 22:24:35.076481 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:35.076055 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-rfgkr" podStartSLOduration=13.193729047 podStartE2EDuration="25.076037107s" podCreationTimestamp="2026-04-20 22:24:10 +0000 UTC" firstStartedPulling="2026-04-20 22:24:21.708256516 +0000 UTC m=+34.661211117" lastFinishedPulling="2026-04-20 22:24:33.59056458 +0000 UTC m=+46.543519177" observedRunningTime="2026-04-20 22:24:35.074335379 +0000 UTC m=+48.027290010" watchObservedRunningTime="2026-04-20 22:24:35.076037107 +0000 UTC m=+48.028991727"
Apr 20 22:24:35.091960 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:35.091910 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-7n7hh" podStartSLOduration=13.860411451000001 podStartE2EDuration="26.091846585s" podCreationTimestamp="2026-04-20 22:24:09 +0000 UTC" firstStartedPulling="2026-04-20 22:24:21.911692688 +0000 UTC m=+34.864647289" lastFinishedPulling="2026-04-20 22:24:34.143127822 +0000 UTC m=+47.096082423" observedRunningTime="2026-04-20 22:24:35.091105381 +0000 UTC m=+48.044060001" watchObservedRunningTime="2026-04-20 22:24:35.091846585 +0000 UTC m=+48.044801206"
Apr 20 22:24:35.111521 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:35.111467 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g56dv" podStartSLOduration=13.671878895999999 podStartE2EDuration="26.111449935s" podCreationTimestamp="2026-04-20 22:24:09 +0000 UTC" firstStartedPulling="2026-04-20 22:24:21.720485869 +0000 UTC m=+34.673440466" lastFinishedPulling="2026-04-20 22:24:34.160056896 +0000 UTC m=+47.113011505" observedRunningTime="2026-04-20 22:24:35.110125387 +0000 UTC m=+48.063080007" watchObservedRunningTime="2026-04-20 22:24:35.111449935 +0000 UTC m=+48.064404556"
Apr 20 22:24:35.986872 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:35.986764 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xnj54_4c475df6-d751-4f10-81c7-a1e56dec9176/console-operator/1.log"
Apr 20 22:24:35.988163 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:35.987544 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xnj54_4c475df6-d751-4f10-81c7-a1e56dec9176/console-operator/0.log"
Apr 20 22:24:35.988163 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:35.987665 2573 generic.go:358] "Generic (PLEG): container finished" podID="4c475df6-d751-4f10-81c7-a1e56dec9176" containerID="cf6b097107973e8c412c796852f65e9438ea4e18a6db82613b2858ecac5434d6" exitCode=255
Apr 20 22:24:35.988163 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:35.988104 2573 scope.go:117] "RemoveContainer" containerID="cf6b097107973e8c412c796852f65e9438ea4e18a6db82613b2858ecac5434d6"
Apr 20 22:24:35.988383 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:35.988317 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-xnj54_openshift-console-operator(4c475df6-d751-4f10-81c7-a1e56dec9176)\"" pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" podUID="4c475df6-d751-4f10-81c7-a1e56dec9176"
Apr 20 22:24:35.988908 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:35.988731 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" event={"ID":"4c475df6-d751-4f10-81c7-a1e56dec9176","Type":"ContainerDied","Data":"cf6b097107973e8c412c796852f65e9438ea4e18a6db82613b2858ecac5434d6"}
Apr 20 22:24:35.988908 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:35.988794 2573 scope.go:117] "RemoveContainer" containerID="231740ad276b02201a2f6d8f4d10d8250faba5cb6d57b98b7583e7e4559e1ea3"
Apr 20 22:24:36.011830 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.010770 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6c57ff7d45-ndk5x" podStartSLOduration=29.756195452 podStartE2EDuration="42.010754004s" podCreationTimestamp="2026-04-20 22:23:54 +0000 UTC" firstStartedPulling="2026-04-20 22:24:21.910724563 +0000 UTC m=+34.863679160" lastFinishedPulling="2026-04-20 22:24:34.165283111 +0000 UTC m=+47.118237712" observedRunningTime="2026-04-20 22:24:35.138135723 +0000 UTC m=+48.091090345" watchObservedRunningTime="2026-04-20 22:24:36.010754004 +0000 UTC m=+48.963708625"
Apr 20 22:24:36.366067 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.365517 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lxvk8\" (UID: \"c06378d7-946b-49c3-ac21-44605e27cdd5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8"
Apr 20 22:24:36.366067 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.365574 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-98hzg\" (UID: \"3a9bfbef-0d72-430e-b23e-bb50623a7093\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg"
Apr 20 22:24:36.366067 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.365639 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5"
Apr 20 22:24:36.366067 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:36.365774 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 22:24:36.366067 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:36.365790 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c69c58687-c7dk5: secret "image-registry-tls" not found
Apr 20 22:24:36.366067 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:36.365953 2573 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 22:24:36.366067 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:36.366029 2573 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 22:24:36.367362 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:36.365848 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls podName:e184abd3-491a-42d4-baec-feffd1648520 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:52.365829232 +0000 UTC m=+65.318783847 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls") pod "image-registry-5c69c58687-c7dk5" (UID: "e184abd3-491a-42d4-baec-feffd1648520") : secret "image-registry-tls" not found
Apr 20 22:24:36.367449 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:36.367388 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls podName:c06378d7-946b-49c3-ac21-44605e27cdd5 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:52.367366863 +0000 UTC m=+65.320321466 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-lxvk8" (UID: "c06378d7-946b-49c3-ac21-44605e27cdd5") : secret "cluster-monitoring-operator-tls" not found
Apr 20 22:24:36.367449 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:36.367407 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls podName:3a9bfbef-0d72-430e-b23e-bb50623a7093 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:52.367396396 +0000 UTC m=+65.320351002 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-98hzg" (UID: "3a9bfbef-0d72-430e-b23e-bb50623a7093") : secret "samples-operator-tls" not found
Apr 20 22:24:36.434011 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.433588 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-9qgmc"]
Apr 20 22:24:36.459177 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.459148 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-9qgmc"]
Apr 20 22:24:36.459353 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.459291 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9qgmc"
Apr 20 22:24:36.462185 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.462161 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-45v7h\""
Apr 20 22:24:36.462356 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.462306 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 20 22:24:36.462808 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.462621 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 20 22:24:36.466656 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.466634 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b"
Apr 20 22:24:36.466883 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:36.466800 2573 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 22:24:36.466883 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:36.466822 2573 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-649dd96b44-r7c7b: secret "image-registry-tls" not found
Apr 20 22:24:36.466883 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.466819 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx"
Apr 20 22:24:36.467529 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:36.466931 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls podName:bf5aed09-1fd8-4294-aef5-ee13e17b2bf3 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:52.466910644 +0000 UTC m=+65.419865245 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls") pod "image-registry-649dd96b44-r7c7b" (UID: "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3") : secret "image-registry-tls" not found
Apr 20 22:24:36.467529 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:36.466976 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle podName:7c99a639-1f48-429a-a14e-800ce227becb nodeName:}" failed. No retries permitted until 2026-04-20 22:24:52.466958323 +0000 UTC m=+65.419912934 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle") pod "router-default-84455b6c98-44svx" (UID: "7c99a639-1f48-429a-a14e-800ce227becb") : configmap references non-existent config key: service-ca.crt
Apr 20 22:24:36.467529 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.467106 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx"
Apr 20 22:24:36.467529 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:36.467217 2573 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 22:24:36.467529 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:36.467254 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs podName:7c99a639-1f48-429a-a14e-800ce227becb nodeName:}" failed. No retries permitted until 2026-04-20 22:24:52.467241354 +0000 UTC m=+65.420195970 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs") pod "router-default-84455b6c98-44svx" (UID: "7c99a639-1f48-429a-a14e-800ce227becb") : secret "router-metrics-certs-default" not found
Apr 20 22:24:36.567956 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.567912 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khl5z\" (UniqueName: \"kubernetes.io/projected/e37007b0-9ddd-4f6c-90e7-b3a1dd501568-kube-api-access-khl5z\") pod \"migrator-74bb7799d9-9qgmc\" (UID: \"e37007b0-9ddd-4f6c-90e7-b3a1dd501568\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9qgmc"
Apr 20 22:24:36.568132 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.567985 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert\") pod \"ingress-canary-d5s8x\" (UID: \"82357e1f-f9a8-4cf7-b3dd-fe77912c49a1\") " pod="openshift-ingress-canary/ingress-canary-d5s8x"
Apr 20 22:24:36.568132 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.568039 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls\") pod \"dns-default-fj4gp\" (UID: \"e4ed632e-0c77-4b80-b076-66bdfd17da84\") " pod="openshift-dns/dns-default-fj4gp"
Apr 20 22:24:36.568255 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:36.568193 2573 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 22:24:36.568255 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:36.568252 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls podName:e4ed632e-0c77-4b80-b076-66bdfd17da84 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:52.568236974 +0000 UTC m=+65.521191574 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls") pod "dns-default-fj4gp" (UID: "e4ed632e-0c77-4b80-b076-66bdfd17da84") : secret "dns-default-metrics-tls" not found
Apr 20 22:24:36.568475 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:36.568446 2573 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 22:24:36.568574 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:36.568517 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert podName:82357e1f-f9a8-4cf7-b3dd-fe77912c49a1 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:52.56850002 +0000 UTC m=+65.521454633 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert") pod "ingress-canary-d5s8x" (UID: "82357e1f-f9a8-4cf7-b3dd-fe77912c49a1") : secret "canary-serving-cert" not found Apr 20 22:24:36.668746 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.668645 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khl5z\" (UniqueName: \"kubernetes.io/projected/e37007b0-9ddd-4f6c-90e7-b3a1dd501568-kube-api-access-khl5z\") pod \"migrator-74bb7799d9-9qgmc\" (UID: \"e37007b0-9ddd-4f6c-90e7-b3a1dd501568\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9qgmc" Apr 20 22:24:36.678117 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.678085 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khl5z\" (UniqueName: \"kubernetes.io/projected/e37007b0-9ddd-4f6c-90e7-b3a1dd501568-kube-api-access-khl5z\") pod \"migrator-74bb7799d9-9qgmc\" (UID: \"e37007b0-9ddd-4f6c-90e7-b3a1dd501568\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9qgmc" Apr 20 22:24:36.772038 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.772004 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9qgmc" Apr 20 22:24:36.997010 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.996930 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xnj54_4c475df6-d751-4f10-81c7-a1e56dec9176/console-operator/1.log" Apr 20 22:24:36.997432 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:36.997348 2573 scope.go:117] "RemoveContainer" containerID="cf6b097107973e8c412c796852f65e9438ea4e18a6db82613b2858ecac5434d6" Apr 20 22:24:36.997601 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:36.997566 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-xnj54_openshift-console-operator(4c475df6-d751-4f10-81c7-a1e56dec9176)\"" pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" podUID="4c475df6-d751-4f10-81c7-a1e56dec9176" Apr 20 22:24:37.182906 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:37.182877 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-9qgmc"] Apr 20 22:24:37.186678 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:37.186647 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode37007b0_9ddd_4f6c_90e7_b3a1dd501568.slice/crio-2e6507df380bbfce39b573e10e225ce187a7f9c69bee56777d9e0f70b5427fc0 WatchSource:0}: Error finding container 2e6507df380bbfce39b573e10e225ce187a7f9c69bee56777d9e0f70b5427fc0: Status 404 returned error can't find the container with id 2e6507df380bbfce39b573e10e225ce187a7f9c69bee56777d9e0f70b5427fc0 Apr 20 22:24:37.286505 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:37.286484 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-56xnh_3533f633-4984-403c-9826-8812fe861cca/dns-node-resolver/0.log" Apr 20 22:24:38.002353 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.002241 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" 
event={"ID":"908bd97a-6313-4646-ab52-90bb7ffefdaa","Type":"ContainerStarted","Data":"cd27339081a619d08cb621f78de14f3e933198c4a9f1b3c49d83ce771df69802"} Apr 20 22:24:38.002353 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.002284 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" event={"ID":"908bd97a-6313-4646-ab52-90bb7ffefdaa","Type":"ContainerStarted","Data":"e492f296fcf18e41f074e4d81f75a6f1b9ccf1fb125419460795bb9cd9f5b838"} Apr 20 22:24:38.003844 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.003812 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9qgmc" event={"ID":"e37007b0-9ddd-4f6c-90e7-b3a1dd501568","Type":"ContainerStarted","Data":"2e6507df380bbfce39b573e10e225ce187a7f9c69bee56777d9e0f70b5427fc0"} Apr 20 22:24:38.020276 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.020228 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" podStartSLOduration=28.220899852 podStartE2EDuration="44.020213715s" podCreationTimestamp="2026-04-20 22:23:54 +0000 UTC" firstStartedPulling="2026-04-20 22:24:21.708546209 +0000 UTC m=+34.661500807" lastFinishedPulling="2026-04-20 22:24:37.507860057 +0000 UTC m=+50.460814670" observedRunningTime="2026-04-20 22:24:38.019216592 +0000 UTC m=+50.972171214" watchObservedRunningTime="2026-04-20 22:24:38.020213715 +0000 UTC m=+50.973168335" Apr 20 22:24:38.448655 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.448615 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jq9th"] Apr 20 22:24:38.468327 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.468297 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jq9th"] Apr 20 22:24:38.468480 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.468417 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-jq9th" Apr 20 22:24:38.471176 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.471152 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 20 22:24:38.471176 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.471163 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 20 22:24:38.472376 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.472354 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-9f9vx\"" Apr 20 22:24:38.472484 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.472410 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 20 22:24:38.472674 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.472656 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 20 22:24:38.486040 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.486019 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-m4npw_1d341b24-8cdf-4d59-a97c-54cecc195860/node-ca/0.log" Apr 20 22:24:38.487411 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.487389 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zql98\" (UniqueName: \"kubernetes.io/projected/0619557a-0b72-4b8b-981d-2f328ee552e6-kube-api-access-zql98\") pod \"service-ca-865cb79987-jq9th\" (UID: \"0619557a-0b72-4b8b-981d-2f328ee552e6\") " pod="openshift-service-ca/service-ca-865cb79987-jq9th" Apr 20 22:24:38.487619 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.487600 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0619557a-0b72-4b8b-981d-2f328ee552e6-signing-cabundle\") pod \"service-ca-865cb79987-jq9th\" (UID: \"0619557a-0b72-4b8b-981d-2f328ee552e6\") " pod="openshift-service-ca/service-ca-865cb79987-jq9th" Apr 20 22:24:38.487716 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.487693 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0619557a-0b72-4b8b-981d-2f328ee552e6-signing-key\") pod \"service-ca-865cb79987-jq9th\" (UID: \"0619557a-0b72-4b8b-981d-2f328ee552e6\") " pod="openshift-service-ca/service-ca-865cb79987-jq9th" Apr 20 22:24:38.589068 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.589031 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0619557a-0b72-4b8b-981d-2f328ee552e6-signing-cabundle\") pod \"service-ca-865cb79987-jq9th\" (UID: \"0619557a-0b72-4b8b-981d-2f328ee552e6\") " pod="openshift-service-ca/service-ca-865cb79987-jq9th" Apr 20 22:24:38.589251 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.589119 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0619557a-0b72-4b8b-981d-2f328ee552e6-signing-key\") pod \"service-ca-865cb79987-jq9th\" (UID: \"0619557a-0b72-4b8b-981d-2f328ee552e6\") " pod="openshift-service-ca/service-ca-865cb79987-jq9th" Apr 20 22:24:38.589251 ip-10-0-130-91 
kubenswrapper[2573]: I0420 22:24:38.589173 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zql98\" (UniqueName: \"kubernetes.io/projected/0619557a-0b72-4b8b-981d-2f328ee552e6-kube-api-access-zql98\") pod \"service-ca-865cb79987-jq9th\" (UID: \"0619557a-0b72-4b8b-981d-2f328ee552e6\") " pod="openshift-service-ca/service-ca-865cb79987-jq9th" Apr 20 22:24:38.589824 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.589797 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0619557a-0b72-4b8b-981d-2f328ee552e6-signing-cabundle\") pod \"service-ca-865cb79987-jq9th\" (UID: \"0619557a-0b72-4b8b-981d-2f328ee552e6\") " pod="openshift-service-ca/service-ca-865cb79987-jq9th" Apr 20 22:24:38.591943 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.591918 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0619557a-0b72-4b8b-981d-2f328ee552e6-signing-key\") pod \"service-ca-865cb79987-jq9th\" (UID: \"0619557a-0b72-4b8b-981d-2f328ee552e6\") " pod="openshift-service-ca/service-ca-865cb79987-jq9th" Apr 20 22:24:38.600828 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.600801 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zql98\" (UniqueName: \"kubernetes.io/projected/0619557a-0b72-4b8b-981d-2f328ee552e6-kube-api-access-zql98\") pod \"service-ca-865cb79987-jq9th\" (UID: \"0619557a-0b72-4b8b-981d-2f328ee552e6\") " pod="openshift-service-ca/service-ca-865cb79987-jq9th" Apr 20 22:24:38.778917 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:38.778884 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-jq9th" Apr 20 22:24:39.591825 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:39.591799 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-jq9th"] Apr 20 22:24:39.594314 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:39.594287 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0619557a_0b72_4b8b_981d_2f328ee552e6.slice/crio-e49c2ac82c1157f823457c02bbae4568a7b15ce39158d177fa4283530305a2c4 WatchSource:0}: Error finding container e49c2ac82c1157f823457c02bbae4568a7b15ce39158d177fa4283530305a2c4: Status 404 returned error can't find the container with id e49c2ac82c1157f823457c02bbae4568a7b15ce39158d177fa4283530305a2c4 Apr 20 22:24:40.011495 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:40.011455 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-5m4dx" event={"ID":"29a8ab11-5eff-49a0-b910-dbb219ffd462","Type":"ContainerStarted","Data":"2d65b2114af31d000d94f062423f6cbec361e039c5176b823c8f9724cf704009"} Apr 20 22:24:40.012947 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:40.012909 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9qgmc" event={"ID":"e37007b0-9ddd-4f6c-90e7-b3a1dd501568","Type":"ContainerStarted","Data":"dfd6b53f6f976e1fc191ef817cff8baeb3f1c6e57d57d442434dde5db4e7e39e"} Apr 20 22:24:40.012947 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:40.012935 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9qgmc" 
event={"ID":"e37007b0-9ddd-4f6c-90e7-b3a1dd501568","Type":"ContainerStarted","Data":"c8c8c93ef6e5cddf95409f39bf9ac1a6a4835a7d23beafee3d72dccf1f19538d"} Apr 20 22:24:40.014134 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:40.014112 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-jq9th" event={"ID":"0619557a-0b72-4b8b-981d-2f328ee552e6","Type":"ContainerStarted","Data":"350eac2e08296237f6799e060784bd93f3b8db9c0978d72433977ddef6600d64"} Apr 20 22:24:40.014219 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:40.014140 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-jq9th" event={"ID":"0619557a-0b72-4b8b-981d-2f328ee552e6","Type":"ContainerStarted","Data":"e49c2ac82c1157f823457c02bbae4568a7b15ce39158d177fa4283530305a2c4"} Apr 20 22:24:40.028232 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:40.028182 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-5m4dx" podStartSLOduration=3.888793756 podStartE2EDuration="9.028167318s" podCreationTimestamp="2026-04-20 22:24:31 +0000 UTC" firstStartedPulling="2026-04-20 22:24:34.331468635 +0000 UTC m=+47.284423235" lastFinishedPulling="2026-04-20 22:24:39.470842189 +0000 UTC m=+52.423796797" observedRunningTime="2026-04-20 22:24:40.027917585 +0000 UTC m=+52.980872205" watchObservedRunningTime="2026-04-20 22:24:40.028167318 +0000 UTC m=+52.981121936" Apr 20 22:24:40.045012 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:40.044964 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-9qgmc" podStartSLOduration=1.776707399 podStartE2EDuration="4.044950715s" podCreationTimestamp="2026-04-20 22:24:36 +0000 UTC" firstStartedPulling="2026-04-20 22:24:37.18903937 +0000 UTC m=+50.141993969" lastFinishedPulling="2026-04-20 22:24:39.457282674 +0000 UTC m=+52.410237285" observedRunningTime="2026-04-20 22:24:40.043992703 +0000 UTC m=+52.996947335" watchObservedRunningTime="2026-04-20 22:24:40.044950715 +0000 UTC m=+52.997905336" Apr 20 22:24:40.060234 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:40.060200 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-jq9th" podStartSLOduration=2.060189508 podStartE2EDuration="2.060189508s" podCreationTimestamp="2026-04-20 22:24:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:24:40.060000864 +0000 UTC m=+53.012955484" watchObservedRunningTime="2026-04-20 22:24:40.060189508 +0000 UTC m=+53.013144173" Apr 20 22:24:40.767867 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:40.767817 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" Apr 20 22:24:40.767867 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:40.767871 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" Apr 20 22:24:40.768266 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:40.768240 2573 scope.go:117] "RemoveContainer" containerID="cf6b097107973e8c412c796852f65e9438ea4e18a6db82613b2858ecac5434d6" Apr 20 22:24:40.768418 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:40.768400 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-xnj54_openshift-console-operator(4c475df6-d751-4f10-81c7-a1e56dec9176)\"" pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" podUID="4c475df6-d751-4f10-81c7-a1e56dec9176" Apr 20 22:24:41.215948 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:41.215848 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4fd8s\" (UID: \"51e93c8f-5d21-4964-8b01-3ddb5f2e5c86\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s" Apr 20 22:24:41.216089 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:41.216012 2573 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 22:24:41.216130 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:24:41.216090 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert podName:51e93c8f-5d21-4964-8b01-3ddb5f2e5c86 nodeName:}" failed. No retries permitted until 2026-04-20 22:24:57.216070226 +0000 UTC m=+70.169024828 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-4fd8s" (UID: "51e93c8f-5d21-4964-8b01-3ddb5f2e5c86") : secret "networking-console-plugin-cert" not found Apr 20 22:24:44.835088 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:44.835061 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2trzq" Apr 20 22:24:52.419191 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.419151 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lxvk8\" (UID: \"c06378d7-946b-49c3-ac21-44605e27cdd5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8" Apr 20 22:24:52.419191 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.419199 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-98hzg\" (UID: \"3a9bfbef-0d72-430e-b23e-bb50623a7093\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg" Apr 20 22:24:52.419656 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.419232 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:52.419656 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.419262 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs\") pod \"network-metrics-daemon-rl87j\" (UID: \"75df7794-7926-4023-a9fe-c8bb08e18219\") " pod="openshift-multus/network-metrics-daemon-rl87j" Apr 20 22:24:52.421705 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.421683 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c06378d7-946b-49c3-ac21-44605e27cdd5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-lxvk8\" (UID: \"c06378d7-946b-49c3-ac21-44605e27cdd5\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8" Apr 20 22:24:52.421815 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.421707 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3a9bfbef-0d72-430e-b23e-bb50623a7093-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-98hzg\" (UID: \"3a9bfbef-0d72-430e-b23e-bb50623a7093\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg" Apr 20 22:24:52.421815 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.421781 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls\") pod \"image-registry-5c69c58687-c7dk5\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:52.422029 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.422011 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 22:24:52.431916 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.431895 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75df7794-7926-4023-a9fe-c8bb08e18219-metrics-certs\") pod \"network-metrics-daemon-rl87j\" (UID: \"75df7794-7926-4023-a9fe-c8bb08e18219\") " pod="openshift-multus/network-metrics-daemon-rl87j" Apr 20 22:24:52.454272 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.454249 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-ddtmq\"" Apr 20 22:24:52.462259 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.462240 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-rl87j" Apr 20 22:24:52.519790 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.519751 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:52.519972 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.519883 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:52.519972 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.519947 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-svn47\" (UniqueName: \"kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47\") pod \"network-check-target-2wb2c\" (UID: \"601775e4-554d-4221-907f-4a5d646c32e4\") " pod="openshift-network-diagnostics/network-check-target-2wb2c" Apr 20 22:24:52.520054 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.520009 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:52.520551 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.520522 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c99a639-1f48-429a-a14e-800ce227becb-service-ca-bundle\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:52.522611 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.522586 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7c99a639-1f48-429a-a14e-800ce227becb-metrics-certs\") pod \"router-default-84455b6c98-44svx\" (UID: \"7c99a639-1f48-429a-a14e-800ce227becb\") " pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:52.522714 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.522622 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls\") pod \"image-registry-649dd96b44-r7c7b\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:52.522900 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.522878 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-svn47\" (UniqueName: \"kubernetes.io/projected/601775e4-554d-4221-907f-4a5d646c32e4-kube-api-access-svn47\") pod \"network-check-target-2wb2c\" (UID: \"601775e4-554d-4221-907f-4a5d646c32e4\") " pod="openshift-network-diagnostics/network-check-target-2wb2c" Apr 20 22:24:52.575866 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.575818 2573 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6fksq\"" Apr 20 22:24:52.583948 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.583912 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:52.595292 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.595266 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-rl87j"] Apr 20 22:24:52.597030 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.597007 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-bvhfk\"" Apr 20 22:24:52.598232 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:52.598210 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75df7794_7926_4023_a9fe_c8bb08e18219.slice/crio-15e9705596d892c7ee1c0f6289fb5bb822854a2e9a967c5594c46198c7023716 WatchSource:0}: Error finding container 15e9705596d892c7ee1c0f6289fb5bb822854a2e9a967c5594c46198c7023716: Status 404 returned error can't find the container with id 15e9705596d892c7ee1c0f6289fb5bb822854a2e9a967c5594c46198c7023716 Apr 20 22:24:52.605259 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.605240 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg" Apr 20 22:24:52.622223 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.622197 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-qxkk8\"" Apr 20 22:24:52.622391 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.622360 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert\") pod \"ingress-canary-d5s8x\" (UID: \"82357e1f-f9a8-4cf7-b3dd-fe77912c49a1\") " pod="openshift-ingress-canary/ingress-canary-d5s8x" Apr 20 22:24:52.622479 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.622428 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls\") pod \"dns-default-fj4gp\" (UID: \"e4ed632e-0c77-4b80-b076-66bdfd17da84\") " pod="openshift-dns/dns-default-fj4gp" Apr 20 22:24:52.624974 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.624953 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e4ed632e-0c77-4b80-b076-66bdfd17da84-metrics-tls\") pod \"dns-default-fj4gp\" (UID: \"e4ed632e-0c77-4b80-b076-66bdfd17da84\") " pod="openshift-dns/dns-default-fj4gp" Apr 20 22:24:52.625129 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.625113 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82357e1f-f9a8-4cf7-b3dd-fe77912c49a1-cert\") pod \"ingress-canary-d5s8x\" (UID: \"82357e1f-f9a8-4cf7-b3dd-fe77912c49a1\") " pod="openshift-ingress-canary/ingress-canary-d5s8x" Apr 20 22:24:52.630020 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.629968 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8" Apr 20 22:24:52.676922 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.676749 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:52.693443 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.693203 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-rnl2b\"" Apr 20 22:24:52.702391 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.701666 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:52.723088 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.723032 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5c69c58687-c7dk5"] Apr 20 22:24:52.729785 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:52.729746 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode184abd3_491a_42d4_baec_feffd1648520.slice/crio-4a3ee4fe7127a12e1ffa8b4031024ff49c84184d6e1d691a1edac0e0db652cb2 WatchSource:0}: Error finding container 4a3ee4fe7127a12e1ffa8b4031024ff49c84184d6e1d691a1edac0e0db652cb2: Status 404 returned error can't find the container with id 4a3ee4fe7127a12e1ffa8b4031024ff49c84184d6e1d691a1edac0e0db652cb2 Apr 20 22:24:52.740968 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.740938 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-66fzs\"" Apr 20 22:24:52.747923 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.747873 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg"] Apr 20 22:24:52.748506 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.748476 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2wb2c" Apr 20 22:24:52.777081 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.776814 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-bv5w5\"" Apr 20 22:24:52.788957 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.788630 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-d5s8x" Apr 20 22:24:52.803846 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.802614 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-bl4mg\"" Apr 20 22:24:52.803846 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.802896 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fj4gp" Apr 20 22:24:52.819926 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.819879 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8"] Apr 20 22:24:52.850793 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:52.850758 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc06378d7_946b_49c3_ac21_44605e27cdd5.slice/crio-e1f2fd16bc99522442627c2678186fc1d52a918de1ca7cce4502825fd0998389 WatchSource:0}: Error finding container e1f2fd16bc99522442627c2678186fc1d52a918de1ca7cce4502825fd0998389: Status 404 returned error can't find the container with id e1f2fd16bc99522442627c2678186fc1d52a918de1ca7cce4502825fd0998389 Apr 20 22:24:52.882975 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.882928 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-649dd96b44-r7c7b"] Apr 20 22:24:52.912926 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:52.904386 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf5aed09_1fd8_4294_aef5_ee13e17b2bf3.slice/crio-29d80b5aa1e4c6d93a9c84f7071dc7eab8ca1ea6878698787de345e389e170c1 WatchSource:0}: Error finding container 29d80b5aa1e4c6d93a9c84f7071dc7eab8ca1ea6878698787de345e389e170c1: Status 404 returned error can't find the container with id 29d80b5aa1e4c6d93a9c84f7071dc7eab8ca1ea6878698787de345e389e170c1 Apr 20 22:24:52.946244 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.944973 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-84455b6c98-44svx"] Apr 20 22:24:52.950154 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:52.950122 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c99a639_1f48_429a_a14e_800ce227becb.slice/crio-b8f44accff01aaefec4babce1271b5a025cd8ad6b2861ff5b32acdadf03ff963 WatchSource:0}: Error finding container b8f44accff01aaefec4babce1271b5a025cd8ad6b2861ff5b32acdadf03ff963: Status 404 returned error can't find the container with id b8f44accff01aaefec4babce1271b5a025cd8ad6b2861ff5b32acdadf03ff963 Apr 20 22:24:52.954580 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.954561 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2wb2c"] Apr 20 22:24:52.966094 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:52.966064 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod601775e4_554d_4221_907f_4a5d646c32e4.slice/crio-f64fa8c708483705a30f22616a2cee6f5feef6e9db1fb124d43640114f284888 WatchSource:0}: Error finding container f64fa8c708483705a30f22616a2cee6f5feef6e9db1fb124d43640114f284888: Status 404 returned error can't find the container with id f64fa8c708483705a30f22616a2cee6f5feef6e9db1fb124d43640114f284888 Apr 20 22:24:52.988430 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:52.988389 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fj4gp"] Apr 20 22:24:52.991318 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:52.991291 2573 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4ed632e_0c77_4b80_b076_66bdfd17da84.slice/crio-34f26672fcc83a5ef4df42261588379a49d05fb5769f67a410cc2d471778c50b WatchSource:0}: Error finding container 34f26672fcc83a5ef4df42261588379a49d05fb5769f67a410cc2d471778c50b: Status 404 returned error can't find the container with id 34f26672fcc83a5ef4df42261588379a49d05fb5769f67a410cc2d471778c50b Apr 20 22:24:53.008386 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:53.008363 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-d5s8x"] Apr 20 22:24:53.010907 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:53.010881 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82357e1f_f9a8_4cf7_b3dd_fe77912c49a1.slice/crio-c1b205c804dae1c8db0e30bfafd70e461007128e3272a46c74008e4ee08abaa0 WatchSource:0}: Error finding container c1b205c804dae1c8db0e30bfafd70e461007128e3272a46c74008e4ee08abaa0: Status 404 returned error can't find the container with id c1b205c804dae1c8db0e30bfafd70e461007128e3272a46c74008e4ee08abaa0 Apr 20 22:24:53.050549 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:53.050523 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2wb2c" event={"ID":"601775e4-554d-4221-907f-4a5d646c32e4","Type":"ContainerStarted","Data":"f64fa8c708483705a30f22616a2cee6f5feef6e9db1fb124d43640114f284888"} Apr 20 22:24:53.052223 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:53.052196 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-84455b6c98-44svx" event={"ID":"7c99a639-1f48-429a-a14e-800ce227becb","Type":"ContainerStarted","Data":"b8f44accff01aaefec4babce1271b5a025cd8ad6b2861ff5b32acdadf03ff963"} Apr 20 22:24:53.053627 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:53.053589 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" event={"ID":"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3","Type":"ContainerStarted","Data":"b7a960552e020ef63a6300953aa00ec51681e8a062118d215a63b4c0da8e7100"} Apr 20 22:24:53.053627 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:53.053617 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" event={"ID":"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3","Type":"ContainerStarted","Data":"29d80b5aa1e4c6d93a9c84f7071dc7eab8ca1ea6878698787de345e389e170c1"} Apr 20 22:24:53.053864 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:53.053830 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:24:53.055140 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:53.055111 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fj4gp" event={"ID":"e4ed632e-0c77-4b80-b076-66bdfd17da84","Type":"ContainerStarted","Data":"34f26672fcc83a5ef4df42261588379a49d05fb5769f67a410cc2d471778c50b"} Apr 20 22:24:53.056268 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:53.056244 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rl87j" event={"ID":"75df7794-7926-4023-a9fe-c8bb08e18219","Type":"ContainerStarted","Data":"15e9705596d892c7ee1c0f6289fb5bb822854a2e9a967c5594c46198c7023716"} Apr 20 22:24:53.057425 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:53.057402 2573 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8" event={"ID":"c06378d7-946b-49c3-ac21-44605e27cdd5","Type":"ContainerStarted","Data":"e1f2fd16bc99522442627c2678186fc1d52a918de1ca7cce4502825fd0998389"} Apr 20 22:24:53.058474 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:53.058442 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-d5s8x" event={"ID":"82357e1f-f9a8-4cf7-b3dd-fe77912c49a1","Type":"ContainerStarted","Data":"c1b205c804dae1c8db0e30bfafd70e461007128e3272a46c74008e4ee08abaa0"} Apr 20 22:24:53.059572 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:53.059540 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg" event={"ID":"3a9bfbef-0d72-430e-b23e-bb50623a7093","Type":"ContainerStarted","Data":"8ceecb49f6353084c1f4ec0ada173da497446a72b5229097f698026410f8a1c5"} Apr 20 22:24:53.061024 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:53.061004 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" event={"ID":"e184abd3-491a-42d4-baec-feffd1648520","Type":"ContainerStarted","Data":"1fa300847babf07a9fc0e126fd99c3c665326ca7700a647503da21c75e4197c7"} Apr 20 22:24:53.061120 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:53.061027 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" event={"ID":"e184abd3-491a-42d4-baec-feffd1648520","Type":"ContainerStarted","Data":"4a3ee4fe7127a12e1ffa8b4031024ff49c84184d6e1d691a1edac0e0db652cb2"} Apr 20 22:24:53.061414 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:53.061399 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:24:53.071807 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:53.071757 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" podStartSLOduration=43.071744944 podStartE2EDuration="43.071744944s" podCreationTimestamp="2026-04-20 22:24:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:24:53.070162829 +0000 UTC m=+66.023117471" watchObservedRunningTime="2026-04-20 22:24:53.071744944 +0000 UTC m=+66.024699563" Apr 20 22:24:54.083087 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:54.080817 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2wb2c" event={"ID":"601775e4-554d-4221-907f-4a5d646c32e4","Type":"ContainerStarted","Data":"b4328bbca384996b45e05773cffc013d9bb0a960284bcdf543552bcdc1d7d839"} Apr 20 22:24:54.083087 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:54.082156 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-2wb2c" Apr 20 22:24:54.085709 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:54.084877 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-84455b6c98-44svx" event={"ID":"7c99a639-1f48-429a-a14e-800ce227becb","Type":"ContainerStarted","Data":"e6a9e186bed4784062a751cc45de47fd64051d0d714ca1826baa0e66bf248398"} Apr 20 22:24:54.099604 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:54.097434 2573 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" podStartSLOduration=66.097418837 podStartE2EDuration="1m6.097418837s" podCreationTimestamp="2026-04-20 22:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:24:53.088570239 +0000 UTC m=+66.041524901" watchObservedRunningTime="2026-04-20 22:24:54.097418837 +0000 UTC m=+67.050373459" Apr 20 22:24:54.099604 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:54.097808 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-2wb2c" podStartSLOduration=66.097798847 podStartE2EDuration="1m6.097798847s" podCreationTimestamp="2026-04-20 22:23:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:24:54.095931216 +0000 UTC m=+67.048885836" watchObservedRunningTime="2026-04-20 22:24:54.097798847 +0000 UTC m=+67.050753467" Apr 20 22:24:54.702366 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:54.702335 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:54.705133 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:54.705112 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:54.725877 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:54.725803 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-84455b6c98-44svx" podStartSLOduration=44.725784706 podStartE2EDuration="44.725784706s" podCreationTimestamp="2026-04-20 22:24:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:24:54.115219593 +0000 UTC m=+67.068174223" watchObservedRunningTime="2026-04-20 22:24:54.725784706 +0000 UTC m=+67.678739326" Apr 20 22:24:55.088892 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:55.088843 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:55.090072 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:55.090043 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-84455b6c98-44svx" Apr 20 22:24:56.596809 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:56.596456 2573 scope.go:117] "RemoveContainer" containerID="cf6b097107973e8c412c796852f65e9438ea4e18a6db82613b2858ecac5434d6" Apr 20 22:24:56.827464 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:56.827385 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8bv96"] Apr 20 22:24:56.832688 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:56.832664 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8bv96" Apr 20 22:24:56.836375 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:56.836327 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 22:24:56.836493 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:56.836432 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 22:24:56.839666 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:56.839453 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rsxbh\"" Apr 20 22:24:56.841796 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:56.841623 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8bv96"] Apr 20 22:24:56.877270 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:56.877245 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-649dd96b44-r7c7b"] Apr 20 22:24:56.966773 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:56.966741 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ce2b0f82-932f-457f-bd81-3a5c0a321390-data-volume\") pod \"insights-runtime-extractor-8bv96\" (UID: \"ce2b0f82-932f-457f-bd81-3a5c0a321390\") " pod="openshift-insights/insights-runtime-extractor-8bv96" Apr 20 22:24:56.967014 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:56.966995 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ce2b0f82-932f-457f-bd81-3a5c0a321390-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8bv96\" (UID: \"ce2b0f82-932f-457f-bd81-3a5c0a321390\") " pod="openshift-insights/insights-runtime-extractor-8bv96" Apr 20 22:24:56.967224 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:56.967197 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql8vj\" (UniqueName: \"kubernetes.io/projected/ce2b0f82-932f-457f-bd81-3a5c0a321390-kube-api-access-ql8vj\") pod \"insights-runtime-extractor-8bv96\" (UID: \"ce2b0f82-932f-457f-bd81-3a5c0a321390\") " pod="openshift-insights/insights-runtime-extractor-8bv96" Apr 20 22:24:56.967289 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:56.967261 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ce2b0f82-932f-457f-bd81-3a5c0a321390-crio-socket\") pod \"insights-runtime-extractor-8bv96\" (UID: \"ce2b0f82-932f-457f-bd81-3a5c0a321390\") " pod="openshift-insights/insights-runtime-extractor-8bv96" Apr 20 22:24:56.967343 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:56.967297 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ce2b0f82-932f-457f-bd81-3a5c0a321390-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8bv96\" (UID: \"ce2b0f82-932f-457f-bd81-3a5c0a321390\") " pod="openshift-insights/insights-runtime-extractor-8bv96" Apr 20 22:24:57.030224 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.030192 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vqj8t"] Apr 20 22:24:57.033690 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.033672 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vqj8t" Apr 20 22:24:57.036127 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.036099 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 20 22:24:57.036242 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.036152 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-jdqb2\"" Apr 20 22:24:57.040739 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.040717 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vqj8t"] Apr 20 22:24:57.067927 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.067899 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ql8vj\" (UniqueName: \"kubernetes.io/projected/ce2b0f82-932f-457f-bd81-3a5c0a321390-kube-api-access-ql8vj\") pod \"insights-runtime-extractor-8bv96\" (UID: \"ce2b0f82-932f-457f-bd81-3a5c0a321390\") " pod="openshift-insights/insights-runtime-extractor-8bv96" Apr 20 22:24:57.068116 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.067942 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ce2b0f82-932f-457f-bd81-3a5c0a321390-crio-socket\") pod \"insights-runtime-extractor-8bv96\" (UID: \"ce2b0f82-932f-457f-bd81-3a5c0a321390\") " pod="openshift-insights/insights-runtime-extractor-8bv96" Apr 20 22:24:57.068116 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.067965 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ce2b0f82-932f-457f-bd81-3a5c0a321390-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8bv96\" (UID: \"ce2b0f82-932f-457f-bd81-3a5c0a321390\") " pod="openshift-insights/insights-runtime-extractor-8bv96" Apr 20 22:24:57.068116 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.068023 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ce2b0f82-932f-457f-bd81-3a5c0a321390-data-volume\") pod \"insights-runtime-extractor-8bv96\" (UID: \"ce2b0f82-932f-457f-bd81-3a5c0a321390\") " pod="openshift-insights/insights-runtime-extractor-8bv96" Apr 20 22:24:57.068116 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.068047 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ce2b0f82-932f-457f-bd81-3a5c0a321390-crio-socket\") pod \"insights-runtime-extractor-8bv96\" (UID: \"ce2b0f82-932f-457f-bd81-3a5c0a321390\") " pod="openshift-insights/insights-runtime-extractor-8bv96" Apr 20 22:24:57.068116 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.068052 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ce2b0f82-932f-457f-bd81-3a5c0a321390-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8bv96\" (UID: \"ce2b0f82-932f-457f-bd81-3a5c0a321390\") " 
pod="openshift-insights/insights-runtime-extractor-8bv96" Apr 20 22:24:57.068377 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.068357 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ce2b0f82-932f-457f-bd81-3a5c0a321390-data-volume\") pod \"insights-runtime-extractor-8bv96\" (UID: \"ce2b0f82-932f-457f-bd81-3a5c0a321390\") " pod="openshift-insights/insights-runtime-extractor-8bv96" Apr 20 22:24:57.068648 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.068627 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ce2b0f82-932f-457f-bd81-3a5c0a321390-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8bv96\" (UID: \"ce2b0f82-932f-457f-bd81-3a5c0a321390\") " pod="openshift-insights/insights-runtime-extractor-8bv96" Apr 20 22:24:57.070532 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.070504 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ce2b0f82-932f-457f-bd81-3a5c0a321390-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8bv96\" (UID: \"ce2b0f82-932f-457f-bd81-3a5c0a321390\") " pod="openshift-insights/insights-runtime-extractor-8bv96" Apr 20 22:24:57.077141 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.077115 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql8vj\" (UniqueName: \"kubernetes.io/projected/ce2b0f82-932f-457f-bd81-3a5c0a321390-kube-api-access-ql8vj\") pod \"insights-runtime-extractor-8bv96\" (UID: \"ce2b0f82-932f-457f-bd81-3a5c0a321390\") " pod="openshift-insights/insights-runtime-extractor-8bv96" Apr 20 22:24:57.097047 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.096991 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xnj54_4c475df6-d751-4f10-81c7-a1e56dec9176/console-operator/1.log" Apr 20 22:24:57.097133 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.097062 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" event={"ID":"4c475df6-d751-4f10-81c7-a1e56dec9176","Type":"ContainerStarted","Data":"122aa72072adf7b46b925e3e76691ac6cc2a29acaf3690968fad50c5d4d165c4"} Apr 20 22:24:57.097386 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.097365 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" Apr 20 22:24:57.098763 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.098738 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-d5s8x" event={"ID":"82357e1f-f9a8-4cf7-b3dd-fe77912c49a1","Type":"ContainerStarted","Data":"daf3f539741effb6911cf25faf0ce713ad807ebb42b305036b23ffd81cca4dee"} Apr 20 22:24:57.100420 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.100398 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg" event={"ID":"3a9bfbef-0d72-430e-b23e-bb50623a7093","Type":"ContainerStarted","Data":"ac1c5b865a6f5c978f5d53cf6898ba32c4520c11a3fa2a425f695753661b170b"} Apr 20 22:24:57.100521 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.100425 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg" 
event={"ID":"3a9bfbef-0d72-430e-b23e-bb50623a7093","Type":"ContainerStarted","Data":"193637c0b2bc1430b7aa5ded87d8c0bcb43decb8365ad35cfc9c247819b6b037"} Apr 20 22:24:57.102015 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.101996 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fj4gp" event={"ID":"e4ed632e-0c77-4b80-b076-66bdfd17da84","Type":"ContainerStarted","Data":"4e2bc5b5e4768742cf48199830d0e0053500261f3c40666e15745f52a860a196"} Apr 20 22:24:57.102115 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.102018 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fj4gp" event={"ID":"e4ed632e-0c77-4b80-b076-66bdfd17da84","Type":"ContainerStarted","Data":"1b341d842d345a317387362d3b1e742f81ac49a7cf89a9e798411ef3642cd93d"} Apr 20 22:24:57.102181 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.102125 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-fj4gp" Apr 20 22:24:57.103634 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.103606 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rl87j" event={"ID":"75df7794-7926-4023-a9fe-c8bb08e18219","Type":"ContainerStarted","Data":"edb73eeffd6bbd58e75de240506c116dbc3f4e75f6f01dcd6c9dd6505b67b406"} Apr 20 22:24:57.103634 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.103633 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-rl87j" event={"ID":"75df7794-7926-4023-a9fe-c8bb08e18219","Type":"ContainerStarted","Data":"190a9d0474b2c5d25770952e0eebd173030acba0671baf7c03336db8c4858f42"} Apr 20 22:24:57.104996 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.104975 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8" event={"ID":"c06378d7-946b-49c3-ac21-44605e27cdd5","Type":"ContainerStarted","Data":"2123bb17e5132b297ede954bc128e53a56953341578ef1677c171484cec0f2e7"} Apr 20 22:24:57.121012 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.120971 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" podStartSLOduration=36.235071616 podStartE2EDuration="48.120960129s" podCreationTimestamp="2026-04-20 22:24:09 +0000 UTC" firstStartedPulling="2026-04-20 22:24:21.704674883 +0000 UTC m=+34.657629486" lastFinishedPulling="2026-04-20 22:24:33.590563399 +0000 UTC m=+46.543517999" observedRunningTime="2026-04-20 22:24:57.119427813 +0000 UTC m=+70.072382455" watchObservedRunningTime="2026-04-20 22:24:57.120960129 +0000 UTC m=+70.073914748" Apr 20 22:24:57.145150 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.145120 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8bv96" Apr 20 22:24:57.147289 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.147252 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-rl87j" podStartSLOduration=66.459748129 podStartE2EDuration="1m10.147240447s" podCreationTimestamp="2026-04-20 22:23:47 +0000 UTC" firstStartedPulling="2026-04-20 22:24:52.600169484 +0000 UTC m=+65.553124082" lastFinishedPulling="2026-04-20 22:24:56.287661789 +0000 UTC m=+69.240616400" observedRunningTime="2026-04-20 22:24:57.146285881 +0000 UTC m=+70.099240501" watchObservedRunningTime="2026-04-20 22:24:57.147240447 +0000 UTC m=+70.100195067" Apr 20 22:24:57.169562 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.169532 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/790e03ca-74f0-4b4c-8111-f962f1503d6f-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-vqj8t\" (UID: \"790e03ca-74f0-4b4c-8111-f962f1503d6f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vqj8t" Apr 20 22:24:57.185579 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.185521 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fj4gp" podStartSLOduration=33.883896606 podStartE2EDuration="37.185502558s" podCreationTimestamp="2026-04-20 22:24:20 +0000 UTC" firstStartedPulling="2026-04-20 22:24:52.993685795 +0000 UTC m=+65.946640391" lastFinishedPulling="2026-04-20 22:24:56.295291744 +0000 UTC m=+69.248246343" observedRunningTime="2026-04-20 22:24:57.184887861 +0000 UTC m=+70.137842481" watchObservedRunningTime="2026-04-20 22:24:57.185502558 +0000 UTC m=+70.138457178" Apr 20 22:24:57.216846 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.216789 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-d5s8x" podStartSLOduration=33.927571436 podStartE2EDuration="37.216772616s" podCreationTimestamp="2026-04-20 22:24:20 +0000 UTC" firstStartedPulling="2026-04-20 22:24:53.012937983 +0000 UTC m=+65.965892581" lastFinishedPulling="2026-04-20 22:24:56.302139149 +0000 UTC m=+69.255093761" observedRunningTime="2026-04-20 22:24:57.214032868 +0000 UTC m=+70.166987488" watchObservedRunningTime="2026-04-20 22:24:57.216772616 +0000 UTC m=+70.169727233" Apr 20 22:24:57.238230 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.238165 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-lxvk8" podStartSLOduration=44.79667555 podStartE2EDuration="48.238147723s" podCreationTimestamp="2026-04-20 22:24:09 +0000 UTC" firstStartedPulling="2026-04-20 22:24:52.854395196 +0000 UTC m=+65.807349800" lastFinishedPulling="2026-04-20 22:24:56.29586736 +0000 UTC m=+69.248821973" observedRunningTime="2026-04-20 22:24:57.236873198 +0000 UTC m=+70.189827819" watchObservedRunningTime="2026-04-20 22:24:57.238147723 +0000 UTC m=+70.191102336" Apr 20 22:24:57.257738 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.257686 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-98hzg" podStartSLOduration=44.846745559 podStartE2EDuration="48.257671231s" podCreationTimestamp="2026-04-20 22:24:09 +0000 UTC" firstStartedPulling="2026-04-20 
22:24:52.885344465 +0000 UTC m=+65.838299076" lastFinishedPulling="2026-04-20 22:24:56.296270145 +0000 UTC m=+69.249224748" observedRunningTime="2026-04-20 22:24:57.256024981 +0000 UTC m=+70.208979601" watchObservedRunningTime="2026-04-20 22:24:57.257671231 +0000 UTC m=+70.210625847" Apr 20 22:24:57.270356 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.270331 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4fd8s\" (UID: \"51e93c8f-5d21-4964-8b01-3ddb5f2e5c86\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s" Apr 20 22:24:57.270517 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.270389 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/790e03ca-74f0-4b4c-8111-f962f1503d6f-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-vqj8t\" (UID: \"790e03ca-74f0-4b4c-8111-f962f1503d6f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vqj8t" Apr 20 22:24:57.273469 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.273441 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/790e03ca-74f0-4b4c-8111-f962f1503d6f-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-vqj8t\" (UID: \"790e03ca-74f0-4b4c-8111-f962f1503d6f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vqj8t" Apr 20 22:24:57.273569 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.273474 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/51e93c8f-5d21-4964-8b01-3ddb5f2e5c86-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-4fd8s\" (UID: \"51e93c8f-5d21-4964-8b01-3ddb5f2e5c86\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s" Apr 20 22:24:57.278103 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.278080 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8bv96"] Apr 20 22:24:57.283075 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:57.283051 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce2b0f82_932f_457f_bd81_3a5c0a321390.slice/crio-b09c3af100421267fdfc527a9a64b155be6e87d37c9d34c8e74c9fb5ce842a06 WatchSource:0}: Error finding container b09c3af100421267fdfc527a9a64b155be6e87d37c9d34c8e74c9fb5ce842a06: Status 404 returned error can't find the container with id b09c3af100421267fdfc527a9a64b155be6e87d37c9d34c8e74c9fb5ce842a06 Apr 20 22:24:57.344844 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.344808 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vqj8t" Apr 20 22:24:57.468983 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.468960 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vqj8t"] Apr 20 22:24:57.472187 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:57.472150 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod790e03ca_74f0_4b4c_8111_f962f1503d6f.slice/crio-8e45190b56603685036e0ff53877555bdf2fd7ff70aff58071212631da0d5ef0 WatchSource:0}: Error finding container 8e45190b56603685036e0ff53877555bdf2fd7ff70aff58071212631da0d5ef0: Status 404 returned error can't find the container with id 8e45190b56603685036e0ff53877555bdf2fd7ff70aff58071212631da0d5ef0 Apr 20 22:24:57.532905 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.532875 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-g99rh\"" Apr 20 22:24:57.540985 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.540966 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s" Apr 20 22:24:57.641164 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.641136 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-xnj54" Apr 20 22:24:57.670222 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.669888 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s"] Apr 20 22:24:57.817243 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.817208 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-h8q5v"] Apr 20 22:24:57.826061 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.826036 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-h8q5v" Apr 20 22:24:57.829307 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.829234 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-zt6hw\"" Apr 20 22:24:57.829424 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.829238 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 22:24:57.829494 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.829433 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 22:24:57.831241 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.831195 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-h8q5v"] Apr 20 22:24:57.977259 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:57.977173 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5nqh\" (UniqueName: \"kubernetes.io/projected/27356a39-9ad0-4501-9cca-fbecf0cc9aa9-kube-api-access-g5nqh\") pod \"downloads-6bcc868b7-h8q5v\" (UID: \"27356a39-9ad0-4501-9cca-fbecf0cc9aa9\") " pod="openshift-console/downloads-6bcc868b7-h8q5v" Apr 20 22:24:58.078679 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:58.078651 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5nqh\" (UniqueName: \"kubernetes.io/projected/27356a39-9ad0-4501-9cca-fbecf0cc9aa9-kube-api-access-g5nqh\") pod \"downloads-6bcc868b7-h8q5v\" (UID: \"27356a39-9ad0-4501-9cca-fbecf0cc9aa9\") " pod="openshift-console/downloads-6bcc868b7-h8q5v" Apr 20 22:24:58.086724 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:58.086689 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5nqh\" (UniqueName: \"kubernetes.io/projected/27356a39-9ad0-4501-9cca-fbecf0cc9aa9-kube-api-access-g5nqh\") pod \"downloads-6bcc868b7-h8q5v\" (UID: \"27356a39-9ad0-4501-9cca-fbecf0cc9aa9\") " pod="openshift-console/downloads-6bcc868b7-h8q5v" Apr 20 22:24:58.109821 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:58.109771 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vqj8t" event={"ID":"790e03ca-74f0-4b4c-8111-f962f1503d6f","Type":"ContainerStarted","Data":"8e45190b56603685036e0ff53877555bdf2fd7ff70aff58071212631da0d5ef0"} Apr 20 22:24:58.111679 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:58.111651 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8bv96" event={"ID":"ce2b0f82-932f-457f-bd81-3a5c0a321390","Type":"ContainerStarted","Data":"40538b6bc3b52b0e834d36c527d42e41637860e137d77f65da2696762a52776b"} Apr 20 22:24:58.111796 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:58.111690 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8bv96" event={"ID":"ce2b0f82-932f-457f-bd81-3a5c0a321390","Type":"ContainerStarted","Data":"b09c3af100421267fdfc527a9a64b155be6e87d37c9d34c8e74c9fb5ce842a06"} Apr 20 22:24:58.113030 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:58.112882 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s" 
event={"ID":"51e93c8f-5d21-4964-8b01-3ddb5f2e5c86","Type":"ContainerStarted","Data":"e61910619bfed0a8dd699589f5107a1c969d86b8ad9f2037046cd2092f6df87b"} Apr 20 22:24:58.136391 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:58.136366 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-h8q5v" Apr 20 22:24:58.289837 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:58.289793 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-h8q5v"] Apr 20 22:24:58.551142 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:24:58.551107 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27356a39_9ad0_4501_9cca_fbecf0cc9aa9.slice/crio-fb91f56ee2316839df26bf8d94454cd6d958f2e31c2868ba2646390419d3150d WatchSource:0}: Error finding container fb91f56ee2316839df26bf8d94454cd6d958f2e31c2868ba2646390419d3150d: Status 404 returned error can't find the container with id fb91f56ee2316839df26bf8d94454cd6d958f2e31c2868ba2646390419d3150d Apr 20 22:24:59.118567 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:59.118439 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vqj8t" event={"ID":"790e03ca-74f0-4b4c-8111-f962f1503d6f","Type":"ContainerStarted","Data":"db6a77c8d3b876f72799d5ef0c0cd7f6afdb97df78b7ab4e0ce6775dd8c1c87c"} Apr 20 22:24:59.119013 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:59.118956 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vqj8t" Apr 20 22:24:59.119924 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:59.119878 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-h8q5v" event={"ID":"27356a39-9ad0-4501-9cca-fbecf0cc9aa9","Type":"ContainerStarted","Data":"fb91f56ee2316839df26bf8d94454cd6d958f2e31c2868ba2646390419d3150d"} Apr 20 22:24:59.124233 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:59.124188 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8bv96" event={"ID":"ce2b0f82-932f-457f-bd81-3a5c0a321390","Type":"ContainerStarted","Data":"c89b03e1ec3878c64bffe5d0d9757bd010193d05ad1e1d992e67d7ab76a5b44b"} Apr 20 22:24:59.125360 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:59.125344 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vqj8t" Apr 20 22:24:59.126636 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:59.126600 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s" event={"ID":"51e93c8f-5d21-4964-8b01-3ddb5f2e5c86","Type":"ContainerStarted","Data":"8ee70421ec8496252d0e5980fe2662bd6c3bb1de825b92646eddf191cada9b11"} Apr 20 22:24:59.135753 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:59.135684 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-vqj8t" podStartSLOduration=0.78331418 podStartE2EDuration="2.135671717s" podCreationTimestamp="2026-04-20 22:24:57 +0000 UTC" firstStartedPulling="2026-04-20 22:24:57.474455834 +0000 UTC m=+70.427410431" lastFinishedPulling="2026-04-20 22:24:58.826813357 +0000 UTC m=+71.779767968" observedRunningTime="2026-04-20 
22:24:59.134326032 +0000 UTC m=+72.087280652" watchObservedRunningTime="2026-04-20 22:24:59.135671717 +0000 UTC m=+72.088626336" Apr 20 22:24:59.150925 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:24:59.150874 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-4fd8s" podStartSLOduration=33.000578761 podStartE2EDuration="34.150845103s" podCreationTimestamp="2026-04-20 22:24:25 +0000 UTC" firstStartedPulling="2026-04-20 22:24:57.678061275 +0000 UTC m=+70.631015875" lastFinishedPulling="2026-04-20 22:24:58.828327619 +0000 UTC m=+71.781282217" observedRunningTime="2026-04-20 22:24:59.148816757 +0000 UTC m=+72.101771378" watchObservedRunningTime="2026-04-20 22:24:59.150845103 +0000 UTC m=+72.103799723" Apr 20 22:25:00.091570 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.091472 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-djhcv"] Apr 20 22:25:00.100867 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.100792 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-djhcv"] Apr 20 22:25:00.101015 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.100953 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-djhcv" Apr 20 22:25:00.103773 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.103747 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 20 22:25:00.103966 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.103758 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 20 22:25:00.105257 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.105234 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 22:25:00.106169 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.105415 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-rbzqn\"" Apr 20 22:25:00.132076 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.132021 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8bv96" event={"ID":"ce2b0f82-932f-457f-bd81-3a5c0a321390","Type":"ContainerStarted","Data":"e968cfcbdb9de119811dc9d8db54624a4af4750a1dd2bb21d48498d2853a0445"} Apr 20 22:25:00.150205 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.150163 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8bv96" podStartSLOduration=1.7254737759999998 podStartE2EDuration="4.150150135s" podCreationTimestamp="2026-04-20 22:24:56 +0000 UTC" firstStartedPulling="2026-04-20 22:24:57.377143833 +0000 UTC m=+70.330098429" lastFinishedPulling="2026-04-20 22:24:59.801820186 +0000 UTC m=+72.754774788" observedRunningTime="2026-04-20 22:25:00.149874953 +0000 UTC m=+73.102829571" watchObservedRunningTime="2026-04-20 22:25:00.150150135 +0000 UTC m=+73.103104753" Apr 20 22:25:00.199757 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.199718 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/711c7d6d-c6dc-41fb-bd61-56110cca941e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-djhcv\" (UID: \"711c7d6d-c6dc-41fb-bd61-56110cca941e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-djhcv" Apr 20 22:25:00.199757 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.199761 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgfmh\" (UniqueName: \"kubernetes.io/projected/711c7d6d-c6dc-41fb-bd61-56110cca941e-kube-api-access-jgfmh\") pod \"prometheus-operator-5676c8c784-djhcv\" (UID: \"711c7d6d-c6dc-41fb-bd61-56110cca941e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-djhcv" Apr 20 22:25:00.200004 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.199811 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/711c7d6d-c6dc-41fb-bd61-56110cca941e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-djhcv\" (UID: \"711c7d6d-c6dc-41fb-bd61-56110cca941e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-djhcv" Apr 20 22:25:00.200004 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.199886 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/711c7d6d-c6dc-41fb-bd61-56110cca941e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-djhcv\" (UID: \"711c7d6d-c6dc-41fb-bd61-56110cca941e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-djhcv" Apr 20 22:25:00.301549 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.301012 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/711c7d6d-c6dc-41fb-bd61-56110cca941e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-djhcv\" (UID: \"711c7d6d-c6dc-41fb-bd61-56110cca941e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-djhcv" Apr 20 22:25:00.301549 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.301063 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jgfmh\" (UniqueName: \"kubernetes.io/projected/711c7d6d-c6dc-41fb-bd61-56110cca941e-kube-api-access-jgfmh\") pod \"prometheus-operator-5676c8c784-djhcv\" (UID: \"711c7d6d-c6dc-41fb-bd61-56110cca941e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-djhcv" Apr 20 22:25:00.301549 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.301168 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/711c7d6d-c6dc-41fb-bd61-56110cca941e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-djhcv\" (UID: \"711c7d6d-c6dc-41fb-bd61-56110cca941e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-djhcv" Apr 20 22:25:00.301549 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.301254 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/711c7d6d-c6dc-41fb-bd61-56110cca941e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-djhcv\" (UID: \"711c7d6d-c6dc-41fb-bd61-56110cca941e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-djhcv" Apr 20 22:25:00.301549 ip-10-0-130-91 
kubenswrapper[2573]: E0420 22:25:00.301521 2573 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 20 22:25:00.301922 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:25:00.301581 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/711c7d6d-c6dc-41fb-bd61-56110cca941e-prometheus-operator-tls podName:711c7d6d-c6dc-41fb-bd61-56110cca941e nodeName:}" failed. No retries permitted until 2026-04-20 22:25:00.801561073 +0000 UTC m=+73.754515671 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/711c7d6d-c6dc-41fb-bd61-56110cca941e-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-djhcv" (UID: "711c7d6d-c6dc-41fb-bd61-56110cca941e") : secret "prometheus-operator-tls" not found Apr 20 22:25:00.302070 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.302005 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/711c7d6d-c6dc-41fb-bd61-56110cca941e-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-djhcv\" (UID: \"711c7d6d-c6dc-41fb-bd61-56110cca941e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-djhcv" Apr 20 22:25:00.304081 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.304059 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/711c7d6d-c6dc-41fb-bd61-56110cca941e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-djhcv\" (UID: \"711c7d6d-c6dc-41fb-bd61-56110cca941e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-djhcv" Apr 20 22:25:00.312597 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.312571 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgfmh\" (UniqueName: \"kubernetes.io/projected/711c7d6d-c6dc-41fb-bd61-56110cca941e-kube-api-access-jgfmh\") pod \"prometheus-operator-5676c8c784-djhcv\" (UID: \"711c7d6d-c6dc-41fb-bd61-56110cca941e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-djhcv" Apr 20 22:25:00.806403 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.806358 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/711c7d6d-c6dc-41fb-bd61-56110cca941e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-djhcv\" (UID: \"711c7d6d-c6dc-41fb-bd61-56110cca941e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-djhcv" Apr 20 22:25:00.809152 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:00.809126 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/711c7d6d-c6dc-41fb-bd61-56110cca941e-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-djhcv\" (UID: \"711c7d6d-c6dc-41fb-bd61-56110cca941e\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-djhcv" Apr 20 22:25:01.013126 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:01.013091 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-djhcv" Apr 20 22:25:01.144525 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:01.144485 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-djhcv"] Apr 20 22:25:01.146487 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:25:01.146456 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod711c7d6d_c6dc_41fb_bd61_56110cca941e.slice/crio-5cd8be1e6d3f5c148e496f368169ce4f7dc6bd23bbaf5d6f55574aee5b2e97a4 WatchSource:0}: Error finding container 5cd8be1e6d3f5c148e496f368169ce4f7dc6bd23bbaf5d6f55574aee5b2e97a4: Status 404 returned error can't find the container with id 5cd8be1e6d3f5c148e496f368169ce4f7dc6bd23bbaf5d6f55574aee5b2e97a4 Apr 20 22:25:02.140778 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:02.140723 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-djhcv" event={"ID":"711c7d6d-c6dc-41fb-bd61-56110cca941e","Type":"ContainerStarted","Data":"5cd8be1e6d3f5c148e496f368169ce4f7dc6bd23bbaf5d6f55574aee5b2e97a4"} Apr 20 22:25:04.150645 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:04.150602 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-djhcv" event={"ID":"711c7d6d-c6dc-41fb-bd61-56110cca941e","Type":"ContainerStarted","Data":"ae28483e1c2898f669371c36427a9789023d8422d097e262028fc7434a7d154f"} Apr 20 22:25:04.150645 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:04.150647 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-djhcv" event={"ID":"711c7d6d-c6dc-41fb-bd61-56110cca941e","Type":"ContainerStarted","Data":"3f2a0606a6509d087251a175e7092e0e25d0f4ec364671cd2357b69155de699e"} Apr 20 22:25:04.167609 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:04.167548 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-djhcv" podStartSLOduration=2.1429737700000002 podStartE2EDuration="4.167528433s" podCreationTimestamp="2026-04-20 22:25:00 +0000 UTC" firstStartedPulling="2026-04-20 22:25:01.148672536 +0000 UTC m=+74.101627137" lastFinishedPulling="2026-04-20 22:25:03.173227202 +0000 UTC m=+76.126181800" observedRunningTime="2026-04-20 22:25:04.166839439 +0000 UTC m=+77.119794062" watchObservedRunningTime="2026-04-20 22:25:04.167528433 +0000 UTC m=+77.120483053" Apr 20 22:25:06.478235 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.477330 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-lrmr7"] Apr 20 22:25:06.501973 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.501173 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.506022 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.505993 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 22:25:06.506022 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.506013 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 22:25:06.506228 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.506069 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-5mrl8\"" Apr 20 22:25:06.506228 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.506202 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 22:25:06.660313 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.660186 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzd98\" (UniqueName: \"kubernetes.io/projected/607764aa-7c69-4da7-94af-dd7a16161e14-kube-api-access-pzd98\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.660313 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.660234 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/607764aa-7c69-4da7-94af-dd7a16161e14-node-exporter-textfile\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.660313 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.660265 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/607764aa-7c69-4da7-94af-dd7a16161e14-root\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.660621 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.660379 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/607764aa-7c69-4da7-94af-dd7a16161e14-metrics-client-ca\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.660621 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.660408 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/607764aa-7c69-4da7-94af-dd7a16161e14-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.660621 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.660457 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/607764aa-7c69-4da7-94af-dd7a16161e14-node-exporter-wtmp\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 
22:25:06.660621 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.660510 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/607764aa-7c69-4da7-94af-dd7a16161e14-node-exporter-tls\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.660621 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.660559 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/607764aa-7c69-4da7-94af-dd7a16161e14-sys\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.660621 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.660589 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/607764aa-7c69-4da7-94af-dd7a16161e14-node-exporter-accelerators-collector-config\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.761642 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.761561 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/607764aa-7c69-4da7-94af-dd7a16161e14-node-exporter-wtmp\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.761642 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.761602 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/607764aa-7c69-4da7-94af-dd7a16161e14-node-exporter-tls\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.761642 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.761630 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/607764aa-7c69-4da7-94af-dd7a16161e14-sys\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.761948 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.761661 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/607764aa-7c69-4da7-94af-dd7a16161e14-node-exporter-accelerators-collector-config\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.761948 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.761698 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzd98\" (UniqueName: \"kubernetes.io/projected/607764aa-7c69-4da7-94af-dd7a16161e14-kube-api-access-pzd98\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.761948 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.761708 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/607764aa-7c69-4da7-94af-dd7a16161e14-sys\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.761948 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.761731 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/607764aa-7c69-4da7-94af-dd7a16161e14-node-exporter-textfile\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.761948 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.761757 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/607764aa-7c69-4da7-94af-dd7a16161e14-root\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.761948 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.761839 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/607764aa-7c69-4da7-94af-dd7a16161e14-metrics-client-ca\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.761948 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.761894 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/607764aa-7c69-4da7-94af-dd7a16161e14-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.761948 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.761902 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/607764aa-7c69-4da7-94af-dd7a16161e14-node-exporter-wtmp\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.762303 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.762066 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/607764aa-7c69-4da7-94af-dd7a16161e14-root\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.762303 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.762220 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/607764aa-7c69-4da7-94af-dd7a16161e14-node-exporter-textfile\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.762387 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.762311 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/607764aa-7c69-4da7-94af-dd7a16161e14-node-exporter-accelerators-collector-config\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.764569 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.764548 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/607764aa-7c69-4da7-94af-dd7a16161e14-node-exporter-tls\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.764672 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.764634 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/607764aa-7c69-4da7-94af-dd7a16161e14-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.774520 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.774490 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/607764aa-7c69-4da7-94af-dd7a16161e14-metrics-client-ca\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.775169 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.775146 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzd98\" (UniqueName: \"kubernetes.io/projected/607764aa-7c69-4da7-94af-dd7a16161e14-kube-api-access-pzd98\") pod \"node-exporter-lrmr7\" (UID: \"607764aa-7c69-4da7-94af-dd7a16161e14\") " pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.817490 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:06.817460 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lrmr7" Apr 20 22:25:06.828642 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:25:06.828607 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod607764aa_7c69_4da7_94af_dd7a16161e14.slice/crio-8b76b75b017d1406092be71b1a41150f643d3b5e6befaba06f97def28a0d7662 WatchSource:0}: Error finding container 8b76b75b017d1406092be71b1a41150f643d3b5e6befaba06f97def28a0d7662: Status 404 returned error can't find the container with id 8b76b75b017d1406092be71b1a41150f643d3b5e6befaba06f97def28a0d7662 Apr 20 22:25:07.115256 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:07.115224 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fj4gp" Apr 20 22:25:07.165174 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:07.165138 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lrmr7" event={"ID":"607764aa-7c69-4da7-94af-dd7a16161e14","Type":"ContainerStarted","Data":"8b76b75b017d1406092be71b1a41150f643d3b5e6befaba06f97def28a0d7662"} Apr 20 22:25:09.175555 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:09.175512 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lrmr7" event={"ID":"607764aa-7c69-4da7-94af-dd7a16161e14","Type":"ContainerStarted","Data":"9a3d75fa38c2b6abd1fe0203c34237a80cb7499cd48e017b886079e753cd7bd6"} Apr 20 22:25:12.589570 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:12.589517 2573 patch_prober.go:28] interesting pod/image-registry-5c69c58687-c7dk5 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health 
check failed: please see /debug/health"}]} Apr 20 22:25:12.590066 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:12.589604 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" podUID="e184abd3-491a-42d4-baec-feffd1648520" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 22:25:15.094071 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:15.094038 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:25:16.207539 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:16.207503 2573 generic.go:358] "Generic (PLEG): container finished" podID="607764aa-7c69-4da7-94af-dd7a16161e14" containerID="9a3d75fa38c2b6abd1fe0203c34237a80cb7499cd48e017b886079e753cd7bd6" exitCode=0 Apr 20 22:25:16.208020 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:16.207582 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lrmr7" event={"ID":"607764aa-7c69-4da7-94af-dd7a16161e14","Type":"ContainerDied","Data":"9a3d75fa38c2b6abd1fe0203c34237a80cb7499cd48e017b886079e753cd7bd6"} Apr 20 22:25:16.209359 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:16.209327 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-h8q5v" event={"ID":"27356a39-9ad0-4501-9cca-fbecf0cc9aa9","Type":"ContainerStarted","Data":"1a586335c4cb1ed1c0d274167b9ab48f9181fbaecaa3972c1b58801037f5ee0f"} Apr 20 22:25:16.209589 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:16.209563 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-h8q5v" Apr 20 22:25:16.221472 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:16.221449 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-h8q5v" Apr 20 22:25:16.242213 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:16.242154 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-h8q5v" podStartSLOduration=2.099639517 podStartE2EDuration="19.242132272s" podCreationTimestamp="2026-04-20 22:24:57 +0000 UTC" firstStartedPulling="2026-04-20 22:24:58.553124438 +0000 UTC m=+71.506079043" lastFinishedPulling="2026-04-20 22:25:15.695617198 +0000 UTC m=+88.648571798" observedRunningTime="2026-04-20 22:25:16.240223574 +0000 UTC m=+89.193178194" watchObservedRunningTime="2026-04-20 22:25:16.242132272 +0000 UTC m=+89.195086893" Apr 20 22:25:16.885083 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:16.885050 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:25:17.216494 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:17.216396 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lrmr7" event={"ID":"607764aa-7c69-4da7-94af-dd7a16161e14","Type":"ContainerStarted","Data":"38c197fa7902959c8a69fb4b626c4fb2e99dd2dbefce031205b8e15c8f9da944"} Apr 20 22:25:17.216494 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:17.216446 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lrmr7" event={"ID":"607764aa-7c69-4da7-94af-dd7a16161e14","Type":"ContainerStarted","Data":"f0a6bc1e64ecfb4edb9f1f814fb6cee1f982d9bc22c7d5632fcc3a515cf40e38"} Apr 20 22:25:17.235506 ip-10-0-130-91 kubenswrapper[2573]: I0420 
22:25:17.235436 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-lrmr7" podStartSLOduration=9.70566952 podStartE2EDuration="11.23541832s" podCreationTimestamp="2026-04-20 22:25:06 +0000 UTC" firstStartedPulling="2026-04-20 22:25:06.830822361 +0000 UTC m=+79.783776973" lastFinishedPulling="2026-04-20 22:25:08.36057117 +0000 UTC m=+81.313525773" observedRunningTime="2026-04-20 22:25:17.233677252 +0000 UTC m=+90.186631873" watchObservedRunningTime="2026-04-20 22:25:17.23541832 +0000 UTC m=+90.188372940" Apr 20 22:25:21.897754 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:21.897703 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" podUID="bf5aed09-1fd8-4294-aef5-ee13e17b2bf3" containerName="registry" containerID="cri-o://b7a960552e020ef63a6300953aa00ec51681e8a062118d215a63b4c0da8e7100" gracePeriod=30 Apr 20 22:25:22.173437 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.173411 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" Apr 20 22:25:22.208011 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.207952 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-ca-trust-extracted\") pod \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " Apr 20 22:25:22.208194 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.208036 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-bound-sa-token\") pod \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " Apr 20 22:25:22.208194 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.208069 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls\") pod \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " Apr 20 22:25:22.208194 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.208129 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-trusted-ca\") pod \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " Apr 20 22:25:22.208194 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.208160 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-certificates\") pod \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " Apr 20 22:25:22.208407 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.208198 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd65m\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-kube-api-access-pd65m\") pod \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") " Apr 20 22:25:22.208407 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.208241 2573 
Apr 20 22:25:22.208407 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.208241 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-installation-pull-secrets\") pod \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") "
Apr 20 22:25:22.208407 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.208275 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-image-registry-private-configuration\") pod \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\" (UID: \"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3\") "
Apr 20 22:25:22.208810 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.208746 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3" (UID: "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 22:25:22.208978 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.208911 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3" (UID: "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 22:25:22.211286 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.211118 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-kube-api-access-pd65m" (OuterVolumeSpecName: "kube-api-access-pd65m") pod "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3" (UID: "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3"). InnerVolumeSpecName "kube-api-access-pd65m". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 22:25:22.211286 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.211270 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3" (UID: "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 22:25:22.211444 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.211396 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3" (UID: "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 22:25:22.211628 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.211595 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3" (UID: "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 22:25:22.211957 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.211779 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3" (UID: "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 22:25:22.219221 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.219193 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3" (UID: "bf5aed09-1fd8-4294-aef5-ee13e17b2bf3"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 22:25:22.235960 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.235926 2573 generic.go:358] "Generic (PLEG): container finished" podID="bf5aed09-1fd8-4294-aef5-ee13e17b2bf3" containerID="b7a960552e020ef63a6300953aa00ec51681e8a062118d215a63b4c0da8e7100" exitCode=0
Apr 20 22:25:22.236117 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.235998 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-649dd96b44-r7c7b"
Apr 20 22:25:22.236117 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.235997 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" event={"ID":"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3","Type":"ContainerDied","Data":"b7a960552e020ef63a6300953aa00ec51681e8a062118d215a63b4c0da8e7100"}
Apr 20 22:25:22.236117 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.236109 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-649dd96b44-r7c7b" event={"ID":"bf5aed09-1fd8-4294-aef5-ee13e17b2bf3","Type":"ContainerDied","Data":"29d80b5aa1e4c6d93a9c84f7071dc7eab8ca1ea6878698787de345e389e170c1"}
Apr 20 22:25:22.236244 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.236136 2573 scope.go:117] "RemoveContainer" containerID="b7a960552e020ef63a6300953aa00ec51681e8a062118d215a63b4c0da8e7100"
Apr 20 22:25:22.246204 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.246183 2573 scope.go:117] "RemoveContainer" containerID="b7a960552e020ef63a6300953aa00ec51681e8a062118d215a63b4c0da8e7100"
Apr 20 22:25:22.246527 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:25:22.246495 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7a960552e020ef63a6300953aa00ec51681e8a062118d215a63b4c0da8e7100\": container with ID starting with b7a960552e020ef63a6300953aa00ec51681e8a062118d215a63b4c0da8e7100 not found: ID does not exist" containerID="b7a960552e020ef63a6300953aa00ec51681e8a062118d215a63b4c0da8e7100"
Apr 20 22:25:22.246613 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.246543 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7a960552e020ef63a6300953aa00ec51681e8a062118d215a63b4c0da8e7100"} err="failed to get container status \"b7a960552e020ef63a6300953aa00ec51681e8a062118d215a63b4c0da8e7100\": rpc error: code = NotFound desc = could not find container \"b7a960552e020ef63a6300953aa00ec51681e8a062118d215a63b4c0da8e7100\": container with ID starting with b7a960552e020ef63a6300953aa00ec51681e8a062118d215a63b4c0da8e7100 not found: ID does not exist"
Apr 20 22:25:22.258843 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.258815 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-649dd96b44-r7c7b"]
Apr 20 22:25:22.262443 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.262419 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-649dd96b44-r7c7b"]
Apr 20 22:25:22.309687 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.309654 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-trusted-ca\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\""
Apr 20 22:25:22.309687 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.309685 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-certificates\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\""
Apr 20 22:25:22.309947 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.309698 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pd65m\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-kube-api-access-pd65m\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\""
Apr 20 22:25:22.309947 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.309708 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-installation-pull-secrets\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\""
Apr 20 22:25:22.309947 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.309719 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-image-registry-private-configuration\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\""
Apr 20 22:25:22.309947 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.309731 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-ca-trust-extracted\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\""
Apr 20 22:25:22.309947 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.309740 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-bound-sa-token\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\""
Apr 20 22:25:22.309947 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:22.309754 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3-registry-tls\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\""
Apr 20 22:25:23.602347 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:23.602309 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf5aed09-1fd8-4294-aef5-ee13e17b2bf3" path="/var/lib/kubelet/pods/bf5aed09-1fd8-4294-aef5-ee13e17b2bf3/volumes"
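Editor's note: the "RemoveContainer" / "ContainerStatus from runtime service failed ... NotFound" pair above is a benign race: the container was already gone by the time the second delete ran, and the kubelet logs rather than fails. A minimal sketch of NotFound-tolerant removal; the `runtime` interface is a hypothetical stand-in, not the real CRI client:

```go
package main

import (
	"context"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// runtime is a stand-in for a CRI runtime client (method name illustrative).
type runtime interface {
	RemoveContainer(ctx context.Context, id string) error
}

// removeIfPresent treats gRPC NotFound as success, mirroring the idempotent
// deletion behaviour visible in the log above.
func removeIfPresent(ctx context.Context, rt runtime, id string) error {
	err := rt.RemoveContainer(ctx, id)
	if err == nil || status.Code(err) == codes.NotFound {
		return nil // already gone: nothing left to delete
	}
	return err
}
```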
pod="openshift-network-diagnostics/network-check-target-2wb2c" Apr 20 22:25:26.240463 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:26.240431 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5c69c58687-c7dk5"] Apr 20 22:25:45.312030 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:45.311991 2573 generic.go:358] "Generic (PLEG): container finished" podID="eeb4d8a0-f553-47b6-8134-40d74089fd72" containerID="0d53cb35592b675d97da657b4c1d2deddd4c8367ece8b4858b2fb5e921d4d269" exitCode=0 Apr 20 22:25:45.312553 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:45.312060 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dmsjq" event={"ID":"eeb4d8a0-f553-47b6-8134-40d74089fd72","Type":"ContainerDied","Data":"0d53cb35592b675d97da657b4c1d2deddd4c8367ece8b4858b2fb5e921d4d269"} Apr 20 22:25:45.312553 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:45.312441 2573 scope.go:117] "RemoveContainer" containerID="0d53cb35592b675d97da657b4c1d2deddd4c8367ece8b4858b2fb5e921d4d269" Apr 20 22:25:46.315890 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:46.315832 2573 generic.go:358] "Generic (PLEG): container finished" podID="ea02661d-e4a4-469a-9451-7f11a7db90d2" containerID="1dcb1bc608878af06bdf39147b04d23e584fcbaa9e89fb928b1006811f405819" exitCode=0 Apr 20 22:25:46.316388 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:46.315906 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fzjvb" event={"ID":"ea02661d-e4a4-469a-9451-7f11a7db90d2","Type":"ContainerDied","Data":"1dcb1bc608878af06bdf39147b04d23e584fcbaa9e89fb928b1006811f405819"} Apr 20 22:25:46.316388 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:46.316277 2573 scope.go:117] "RemoveContainer" containerID="1dcb1bc608878af06bdf39147b04d23e584fcbaa9e89fb928b1006811f405819" Apr 20 22:25:46.317535 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:46.317512 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-dmsjq" event={"ID":"eeb4d8a0-f553-47b6-8134-40d74089fd72","Type":"ContainerStarted","Data":"88c93decf482d3e5f10474dd2e6560c2289ad8f211b28f83a1401945ab8d852c"} Apr 20 22:25:47.322562 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:47.322528 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-fzjvb" event={"ID":"ea02661d-e4a4-469a-9451-7f11a7db90d2","Type":"ContainerStarted","Data":"3d01649cafacf72aeba6a4634f2a23f4ff66562faa537303caad90b5f3909d31"} Apr 20 22:25:51.265026 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.264983 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" podUID="e184abd3-491a-42d4-baec-feffd1648520" containerName="registry" containerID="cri-o://1fa300847babf07a9fc0e126fd99c3c665326ca7700a647503da21c75e4197c7" gracePeriod=30 Apr 20 22:25:51.528904 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.528880 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:25:51.671055 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.671015 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq6d5\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-kube-api-access-nq6d5\") pod \"e184abd3-491a-42d4-baec-feffd1648520\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " Apr 20 22:25:51.671260 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.671093 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e184abd3-491a-42d4-baec-feffd1648520-trusted-ca\") pod \"e184abd3-491a-42d4-baec-feffd1648520\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " Apr 20 22:25:51.671260 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.671191 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-bound-sa-token\") pod \"e184abd3-491a-42d4-baec-feffd1648520\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " Apr 20 22:25:51.671393 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.671265 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls\") pod \"e184abd3-491a-42d4-baec-feffd1648520\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " Apr 20 22:25:51.671393 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.671290 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e184abd3-491a-42d4-baec-feffd1648520-registry-certificates\") pod \"e184abd3-491a-42d4-baec-feffd1648520\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " Apr 20 22:25:51.671393 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.671309 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e184abd3-491a-42d4-baec-feffd1648520-installation-pull-secrets\") pod \"e184abd3-491a-42d4-baec-feffd1648520\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " Apr 20 22:25:51.671393 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.671353 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e184abd3-491a-42d4-baec-feffd1648520-image-registry-private-configuration\") pod \"e184abd3-491a-42d4-baec-feffd1648520\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " Apr 20 22:25:51.671589 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.671407 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e184abd3-491a-42d4-baec-feffd1648520-ca-trust-extracted\") pod \"e184abd3-491a-42d4-baec-feffd1648520\" (UID: \"e184abd3-491a-42d4-baec-feffd1648520\") " Apr 20 22:25:51.671640 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.671602 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e184abd3-491a-42d4-baec-feffd1648520-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e184abd3-491a-42d4-baec-feffd1648520" (UID: "e184abd3-491a-42d4-baec-feffd1648520"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:25:51.671752 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.671709 2573 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e184abd3-491a-42d4-baec-feffd1648520-trusted-ca\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\"" Apr 20 22:25:51.671962 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.671905 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e184abd3-491a-42d4-baec-feffd1648520-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e184abd3-491a-42d4-baec-feffd1648520" (UID: "e184abd3-491a-42d4-baec-feffd1648520"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 22:25:51.673796 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.673754 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e184abd3-491a-42d4-baec-feffd1648520-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "e184abd3-491a-42d4-baec-feffd1648520" (UID: "e184abd3-491a-42d4-baec-feffd1648520"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:25:51.673976 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.673947 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e184abd3-491a-42d4-baec-feffd1648520" (UID: "e184abd3-491a-42d4-baec-feffd1648520"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:25:51.674094 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.673955 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e184abd3-491a-42d4-baec-feffd1648520-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e184abd3-491a-42d4-baec-feffd1648520" (UID: "e184abd3-491a-42d4-baec-feffd1648520"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:25:51.674094 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.673978 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e184abd3-491a-42d4-baec-feffd1648520" (UID: "e184abd3-491a-42d4-baec-feffd1648520"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:25:51.674094 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.674047 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-kube-api-access-nq6d5" (OuterVolumeSpecName: "kube-api-access-nq6d5") pod "e184abd3-491a-42d4-baec-feffd1648520" (UID: "e184abd3-491a-42d4-baec-feffd1648520"). InnerVolumeSpecName "kube-api-access-nq6d5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:25:51.679439 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.679418 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e184abd3-491a-42d4-baec-feffd1648520-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e184abd3-491a-42d4-baec-feffd1648520" (UID: "e184abd3-491a-42d4-baec-feffd1648520"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 22:25:51.772643 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.772562 2573 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e184abd3-491a-42d4-baec-feffd1648520-ca-trust-extracted\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\"" Apr 20 22:25:51.772643 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.772592 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nq6d5\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-kube-api-access-nq6d5\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\"" Apr 20 22:25:51.772643 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.772604 2573 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-bound-sa-token\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\"" Apr 20 22:25:51.772643 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.772614 2573 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e184abd3-491a-42d4-baec-feffd1648520-registry-tls\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\"" Apr 20 22:25:51.772643 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.772623 2573 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e184abd3-491a-42d4-baec-feffd1648520-registry-certificates\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\"" Apr 20 22:25:51.772643 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.772632 2573 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e184abd3-491a-42d4-baec-feffd1648520-installation-pull-secrets\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\"" Apr 20 22:25:51.772643 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:51.772643 2573 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e184abd3-491a-42d4-baec-feffd1648520-image-registry-private-configuration\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\"" Apr 20 22:25:52.352602 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:52.352565 2573 generic.go:358] "Generic (PLEG): container finished" podID="e184abd3-491a-42d4-baec-feffd1648520" containerID="1fa300847babf07a9fc0e126fd99c3c665326ca7700a647503da21c75e4197c7" exitCode=0 Apr 20 22:25:52.353072 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:52.352629 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" Apr 20 22:25:52.353072 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:52.352660 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" event={"ID":"e184abd3-491a-42d4-baec-feffd1648520","Type":"ContainerDied","Data":"1fa300847babf07a9fc0e126fd99c3c665326ca7700a647503da21c75e4197c7"} Apr 20 22:25:52.353072 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:52.352703 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c69c58687-c7dk5" event={"ID":"e184abd3-491a-42d4-baec-feffd1648520","Type":"ContainerDied","Data":"4a3ee4fe7127a12e1ffa8b4031024ff49c84184d6e1d691a1edac0e0db652cb2"} Apr 20 22:25:52.353072 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:52.352720 2573 scope.go:117] "RemoveContainer" containerID="1fa300847babf07a9fc0e126fd99c3c665326ca7700a647503da21c75e4197c7" Apr 20 22:25:52.361201 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:52.361188 2573 scope.go:117] "RemoveContainer" containerID="1fa300847babf07a9fc0e126fd99c3c665326ca7700a647503da21c75e4197c7" Apr 20 22:25:52.361461 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:25:52.361443 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fa300847babf07a9fc0e126fd99c3c665326ca7700a647503da21c75e4197c7\": container with ID starting with 1fa300847babf07a9fc0e126fd99c3c665326ca7700a647503da21c75e4197c7 not found: ID does not exist" containerID="1fa300847babf07a9fc0e126fd99c3c665326ca7700a647503da21c75e4197c7" Apr 20 22:25:52.361525 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:52.361468 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fa300847babf07a9fc0e126fd99c3c665326ca7700a647503da21c75e4197c7"} err="failed to get container status \"1fa300847babf07a9fc0e126fd99c3c665326ca7700a647503da21c75e4197c7\": rpc error: code = NotFound desc = could not find container \"1fa300847babf07a9fc0e126fd99c3c665326ca7700a647503da21c75e4197c7\": container with ID starting with 1fa300847babf07a9fc0e126fd99c3c665326ca7700a647503da21c75e4197c7 not found: ID does not exist" Apr 20 22:25:52.373493 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:52.373471 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5c69c58687-c7dk5"] Apr 20 22:25:52.377878 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:52.377839 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5c69c58687-c7dk5"] Apr 20 22:25:53.604870 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:25:53.604825 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e184abd3-491a-42d4-baec-feffd1648520" path="/var/lib/kubelet/pods/e184abd3-491a-42d4-baec-feffd1648520/volumes" Apr 20 22:26:05.394461 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:05.394426 2573 generic.go:358] "Generic (PLEG): container finished" podID="bb95d71c-3b6d-407a-9ff3-a70562af1b93" containerID="c08422570a15d84cda8161a375c8edbeed27f5065237660dd51832c38b1228ab" exitCode=0 Apr 20 22:26:05.394952 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:05.394473 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-7n7hh" 
event={"ID":"bb95d71c-3b6d-407a-9ff3-a70562af1b93","Type":"ContainerDied","Data":"c08422570a15d84cda8161a375c8edbeed27f5065237660dd51832c38b1228ab"} Apr 20 22:26:05.394952 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:05.394797 2573 scope.go:117] "RemoveContainer" containerID="c08422570a15d84cda8161a375c8edbeed27f5065237660dd51832c38b1228ab" Apr 20 22:26:06.398909 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:06.398870 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-7n7hh" event={"ID":"bb95d71c-3b6d-407a-9ff3-a70562af1b93","Type":"ContainerStarted","Data":"790652552fba8e342532790bc687c6d9faca932d0819d100f5a3881cffd1b936"} Apr 20 22:26:08.320889 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:08.320839 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-lxvk8_c06378d7-946b-49c3-ac21-44605e27cdd5/cluster-monitoring-operator/0.log" Apr 20 22:26:10.117441 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:10.117414 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lrmr7_607764aa-7c69-4da7-94af-dd7a16161e14/init-textfile/0.log" Apr 20 22:26:10.319333 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:10.319297 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lrmr7_607764aa-7c69-4da7-94af-dd7a16161e14/node-exporter/0.log" Apr 20 22:26:10.517450 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:10.517424 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lrmr7_607764aa-7c69-4da7-94af-dd7a16161e14/kube-rbac-proxy/0.log" Apr 20 22:26:10.947443 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:10.947338 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" podUID="908bd97a-6313-4646-ab52-90bb7ffefdaa" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 22:26:13.319935 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:13.319898 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-djhcv_711c7d6d-c6dc-41fb-bd61-56110cca941e/prometheus-operator/0.log" Apr 20 22:26:13.517825 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:13.517793 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-djhcv_711c7d6d-c6dc-41fb-bd61-56110cca941e/kube-rbac-proxy/0.log" Apr 20 22:26:13.718161 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:13.718080 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-vqj8t_790e03ca-74f0-4b4c-8111-f962f1503d6f/prometheus-operator-admission-webhook/0.log" Apr 20 22:26:15.718443 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:15.718408 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-4fd8s_51e93c8f-5d21-4964-8b01-3ddb5f2e5c86/networking-console-plugin/0.log" Apr 20 22:26:15.917679 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:15.917647 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xnj54_4c475df6-d751-4f10-81c7-a1e56dec9176/console-operator/1.log" Apr 20 22:26:16.121038 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:16.121013 2573 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xnj54_4c475df6-d751-4f10-81c7-a1e56dec9176/console-operator/2.log" Apr 20 22:26:16.719922 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:16.719892 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-h8q5v_27356a39-9ad0-4501-9cca-fbecf0cc9aa9/download-server/0.log" Apr 20 22:26:16.918881 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:16.918815 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-84455b6c98-44svx_7c99a639-1f48-429a-a14e-800ce227becb/router/0.log" Apr 20 22:26:17.517789 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:17.517758 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-d5s8x_82357e1f-f9a8-4cf7-b3dd-fe77912c49a1/serve-healthcheck-canary/0.log" Apr 20 22:26:20.946761 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:20.946722 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" podUID="908bd97a-6313-4646-ab52-90bb7ffefdaa" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 22:26:30.946558 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:30.946511 2573 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" podUID="908bd97a-6313-4646-ab52-90bb7ffefdaa" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 22:26:30.947048 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:30.946595 2573 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" Apr 20 22:26:30.947231 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:30.947208 2573 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"cd27339081a619d08cb621f78de14f3e933198c4a9f1b3c49d83ce771df69802"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 20 22:26:30.947304 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:30.947258 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" podUID="908bd97a-6313-4646-ab52-90bb7ffefdaa" containerName="service-proxy" containerID="cri-o://cd27339081a619d08cb621f78de14f3e933198c4a9f1b3c49d83ce771df69802" gracePeriod=30 Apr 20 22:26:31.500994 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:31.500960 2573 generic.go:358] "Generic (PLEG): container finished" podID="908bd97a-6313-4646-ab52-90bb7ffefdaa" containerID="cd27339081a619d08cb621f78de14f3e933198c4a9f1b3c49d83ce771df69802" exitCode=2 Apr 20 22:26:31.501189 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:31.501030 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" event={"ID":"908bd97a-6313-4646-ab52-90bb7ffefdaa","Type":"ContainerDied","Data":"cd27339081a619d08cb621f78de14f3e933198c4a9f1b3c49d83ce771df69802"} Apr 20 22:26:31.501189 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:26:31.501072 2573 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7cf55cf9d7-wz4qk" event={"ID":"908bd97a-6313-4646-ab52-90bb7ffefdaa","Type":"ContainerStarted","Data":"5a38766b4d29942af90c8b3af5de029c0ab1f5516f7535304a55664dcd3dd068"} Apr 20 22:28:47.545783 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:28:47.545749 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xnj54_4c475df6-d751-4f10-81c7-a1e56dec9176/console-operator/1.log" Apr 20 22:28:47.546426 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:28:47.546391 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xnj54_4c475df6-d751-4f10-81c7-a1e56dec9176/console-operator/1.log" Apr 20 22:28:47.557757 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:28:47.557736 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 22:29:30.214219 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.214187 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d8d569d47-w48jf"] Apr 20 22:29:30.214662 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.214537 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf5aed09-1fd8-4294-aef5-ee13e17b2bf3" containerName="registry" Apr 20 22:29:30.214662 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.214548 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5aed09-1fd8-4294-aef5-ee13e17b2bf3" containerName="registry" Apr 20 22:29:30.214662 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.214562 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e184abd3-491a-42d4-baec-feffd1648520" containerName="registry" Apr 20 22:29:30.214662 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.214567 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="e184abd3-491a-42d4-baec-feffd1648520" containerName="registry" Apr 20 22:29:30.214662 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.214621 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="e184abd3-491a-42d4-baec-feffd1648520" containerName="registry" Apr 20 22:29:30.214662 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.214639 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf5aed09-1fd8-4294-aef5-ee13e17b2bf3" containerName="registry" Apr 20 22:29:30.217378 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.217362 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-w48jf" Apr 20 22:29:30.219997 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.219950 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 22:29:30.220206 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.220184 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 22:29:30.220441 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.220267 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-q26g9\"" Apr 20 22:29:30.220577 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.220479 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 22:29:30.220577 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.220530 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 22:29:30.232593 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.232572 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d8d569d47-w48jf"] Apr 20 22:29:30.351511 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.351482 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl2gj\" (UniqueName: \"kubernetes.io/projected/965a673a-9e44-490c-8dfa-b522b1bebe78-kube-api-access-vl2gj\") pod \"opendatahub-operator-controller-manager-5d8d569d47-w48jf\" (UID: \"965a673a-9e44-490c-8dfa-b522b1bebe78\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-w48jf" Apr 20 22:29:30.351681 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.351523 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/965a673a-9e44-490c-8dfa-b522b1bebe78-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d8d569d47-w48jf\" (UID: \"965a673a-9e44-490c-8dfa-b522b1bebe78\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-w48jf" Apr 20 22:29:30.351681 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.351637 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/965a673a-9e44-490c-8dfa-b522b1bebe78-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d8d569d47-w48jf\" (UID: \"965a673a-9e44-490c-8dfa-b522b1bebe78\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-w48jf" Apr 20 22:29:30.452977 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.452940 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/965a673a-9e44-490c-8dfa-b522b1bebe78-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d8d569d47-w48jf\" (UID: \"965a673a-9e44-490c-8dfa-b522b1bebe78\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-w48jf" Apr 20 22:29:30.453140 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.452986 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vl2gj\" (UniqueName: 
\"kubernetes.io/projected/965a673a-9e44-490c-8dfa-b522b1bebe78-kube-api-access-vl2gj\") pod \"opendatahub-operator-controller-manager-5d8d569d47-w48jf\" (UID: \"965a673a-9e44-490c-8dfa-b522b1bebe78\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-w48jf" Apr 20 22:29:30.453140 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.453046 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/965a673a-9e44-490c-8dfa-b522b1bebe78-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d8d569d47-w48jf\" (UID: \"965a673a-9e44-490c-8dfa-b522b1bebe78\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-w48jf" Apr 20 22:29:30.455646 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.455614 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/965a673a-9e44-490c-8dfa-b522b1bebe78-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5d8d569d47-w48jf\" (UID: \"965a673a-9e44-490c-8dfa-b522b1bebe78\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-w48jf" Apr 20 22:29:30.455766 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.455665 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/965a673a-9e44-490c-8dfa-b522b1bebe78-webhook-cert\") pod \"opendatahub-operator-controller-manager-5d8d569d47-w48jf\" (UID: \"965a673a-9e44-490c-8dfa-b522b1bebe78\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-w48jf" Apr 20 22:29:30.464090 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.464066 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl2gj\" (UniqueName: \"kubernetes.io/projected/965a673a-9e44-490c-8dfa-b522b1bebe78-kube-api-access-vl2gj\") pod \"opendatahub-operator-controller-manager-5d8d569d47-w48jf\" (UID: \"965a673a-9e44-490c-8dfa-b522b1bebe78\") " pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-w48jf" Apr 20 22:29:30.527773 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.527740 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-w48jf" Apr 20 22:29:30.671349 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.671321 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5d8d569d47-w48jf"] Apr 20 22:29:30.674140 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:29:30.674113 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod965a673a_9e44_490c_8dfa_b522b1bebe78.slice/crio-9bcce25daff9a83013c74955e83d392a4c327b1ba40001b892dfe4b7bb5e4eeb WatchSource:0}: Error finding container 9bcce25daff9a83013c74955e83d392a4c327b1ba40001b892dfe4b7bb5e4eeb: Status 404 returned error can't find the container with id 9bcce25daff9a83013c74955e83d392a4c327b1ba40001b892dfe4b7bb5e4eeb Apr 20 22:29:30.675767 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:30.675750 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 22:29:31.056543 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:31.056510 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-w48jf" event={"ID":"965a673a-9e44-490c-8dfa-b522b1bebe78","Type":"ContainerStarted","Data":"9bcce25daff9a83013c74955e83d392a4c327b1ba40001b892dfe4b7bb5e4eeb"} Apr 20 22:29:34.067939 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:34.067895 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-w48jf" event={"ID":"965a673a-9e44-490c-8dfa-b522b1bebe78","Type":"ContainerStarted","Data":"2b80f0b37435c6064d96dacff477ba2585e05c7722597b8b18238ed3898d3acb"} Apr 20 22:29:34.068323 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:34.068026 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-w48jf" Apr 20 22:29:34.089545 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:34.089491 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-w48jf" podStartSLOduration=1.640936735 podStartE2EDuration="4.089478562s" podCreationTimestamp="2026-04-20 22:29:30 +0000 UTC" firstStartedPulling="2026-04-20 22:29:30.67589381 +0000 UTC m=+343.628848407" lastFinishedPulling="2026-04-20 22:29:33.124435634 +0000 UTC m=+346.077390234" observedRunningTime="2026-04-20 22:29:34.087800706 +0000 UTC m=+347.040755324" watchObservedRunningTime="2026-04-20 22:29:34.089478562 +0000 UTC m=+347.042433183" Apr 20 22:29:45.074415 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:45.074336 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5d8d569d47-w48jf" Apr 20 22:29:49.088383 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:49.088340 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-7755c94fdf-r68sk"] Apr 20 22:29:49.091695 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:49.091675 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7755c94fdf-r68sk" Apr 20 22:29:49.094382 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:49.094359 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 22:29:49.095373 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:49.095356 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-4xpcs\"" Apr 20 22:29:49.095449 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:49.095359 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 22:29:49.102103 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:49.102077 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7755c94fdf-r68sk"] Apr 20 22:29:49.213589 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:49.213547 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23c619f3-1c6c-439d-8c08-21fb43ee960e-tls-certs\") pod \"kube-auth-proxy-7755c94fdf-r68sk\" (UID: \"23c619f3-1c6c-439d-8c08-21fb43ee960e\") " pod="openshift-ingress/kube-auth-proxy-7755c94fdf-r68sk" Apr 20 22:29:49.213758 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:49.213599 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqtss\" (UniqueName: \"kubernetes.io/projected/23c619f3-1c6c-439d-8c08-21fb43ee960e-kube-api-access-zqtss\") pod \"kube-auth-proxy-7755c94fdf-r68sk\" (UID: \"23c619f3-1c6c-439d-8c08-21fb43ee960e\") " pod="openshift-ingress/kube-auth-proxy-7755c94fdf-r68sk" Apr 20 22:29:49.213758 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:49.213665 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23c619f3-1c6c-439d-8c08-21fb43ee960e-tmp\") pod \"kube-auth-proxy-7755c94fdf-r68sk\" (UID: \"23c619f3-1c6c-439d-8c08-21fb43ee960e\") " pod="openshift-ingress/kube-auth-proxy-7755c94fdf-r68sk" Apr 20 22:29:49.315039 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:49.314996 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqtss\" (UniqueName: \"kubernetes.io/projected/23c619f3-1c6c-439d-8c08-21fb43ee960e-kube-api-access-zqtss\") pod \"kube-auth-proxy-7755c94fdf-r68sk\" (UID: \"23c619f3-1c6c-439d-8c08-21fb43ee960e\") " pod="openshift-ingress/kube-auth-proxy-7755c94fdf-r68sk" Apr 20 22:29:49.315039 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:49.315042 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23c619f3-1c6c-439d-8c08-21fb43ee960e-tmp\") pod \"kube-auth-proxy-7755c94fdf-r68sk\" (UID: \"23c619f3-1c6c-439d-8c08-21fb43ee960e\") " pod="openshift-ingress/kube-auth-proxy-7755c94fdf-r68sk" Apr 20 22:29:49.315299 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:49.315108 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23c619f3-1c6c-439d-8c08-21fb43ee960e-tls-certs\") pod \"kube-auth-proxy-7755c94fdf-r68sk\" (UID: \"23c619f3-1c6c-439d-8c08-21fb43ee960e\") " pod="openshift-ingress/kube-auth-proxy-7755c94fdf-r68sk" Apr 20 22:29:49.317357 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:49.317332 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/23c619f3-1c6c-439d-8c08-21fb43ee960e-tmp\") pod \"kube-auth-proxy-7755c94fdf-r68sk\" (UID: \"23c619f3-1c6c-439d-8c08-21fb43ee960e\") " pod="openshift-ingress/kube-auth-proxy-7755c94fdf-r68sk" Apr 20 22:29:49.317576 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:49.317558 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23c619f3-1c6c-439d-8c08-21fb43ee960e-tls-certs\") pod \"kube-auth-proxy-7755c94fdf-r68sk\" (UID: \"23c619f3-1c6c-439d-8c08-21fb43ee960e\") " pod="openshift-ingress/kube-auth-proxy-7755c94fdf-r68sk" Apr 20 22:29:49.322910 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:49.322892 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqtss\" (UniqueName: \"kubernetes.io/projected/23c619f3-1c6c-439d-8c08-21fb43ee960e-kube-api-access-zqtss\") pod \"kube-auth-proxy-7755c94fdf-r68sk\" (UID: \"23c619f3-1c6c-439d-8c08-21fb43ee960e\") " pod="openshift-ingress/kube-auth-proxy-7755c94fdf-r68sk" Apr 20 22:29:49.402071 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:49.401985 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-7755c94fdf-r68sk" Apr 20 22:29:49.527454 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:49.527426 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-7755c94fdf-r68sk"] Apr 20 22:29:49.529696 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:29:49.529672 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23c619f3_1c6c_439d_8c08_21fb43ee960e.slice/crio-f4d6f93065d0abfa12de4022ef01baa59f314b60928ae443208f6a3de50e48c2 WatchSource:0}: Error finding container f4d6f93065d0abfa12de4022ef01baa59f314b60928ae443208f6a3de50e48c2: Status 404 returned error can't find the container with id f4d6f93065d0abfa12de4022ef01baa59f314b60928ae443208f6a3de50e48c2 Apr 20 22:29:50.120583 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:50.120536 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7755c94fdf-r68sk" event={"ID":"23c619f3-1c6c-439d-8c08-21fb43ee960e","Type":"ContainerStarted","Data":"f4d6f93065d0abfa12de4022ef01baa59f314b60928ae443208f6a3de50e48c2"} Apr 20 22:29:53.131976 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:53.131896 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-7755c94fdf-r68sk" event={"ID":"23c619f3-1c6c-439d-8c08-21fb43ee960e","Type":"ContainerStarted","Data":"517cff41cf0a8333e25358c48d1a2ad4d465911d35132de8cbfd5a15cefa5d4b"} Apr 20 22:29:53.149008 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:53.148959 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-7755c94fdf-r68sk" podStartSLOduration=0.835940197 podStartE2EDuration="4.148944279s" podCreationTimestamp="2026-04-20 22:29:49 +0000 UTC" firstStartedPulling="2026-04-20 22:29:49.531981218 +0000 UTC m=+362.484935815" lastFinishedPulling="2026-04-20 22:29:52.844985297 +0000 UTC m=+365.797939897" observedRunningTime="2026-04-20 22:29:53.147622422 +0000 UTC m=+366.100577041" watchObservedRunningTime="2026-04-20 22:29:53.148944279 +0000 UTC m=+366.101898897" Apr 20 22:29:56.972119 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:56.972084 2573 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-zjrpw"] Apr 20 22:29:56.975463 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:56.975448 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-zjrpw" Apr 20 22:29:56.978067 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:56.978043 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 20 22:29:56.978199 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:56.978112 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-nwglt\"" Apr 20 22:29:56.978601 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:56.978579 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e8fe991-f30a-4442-b41c-1e04e82e2fd8-cert\") pod \"kserve-controller-manager-856948b99f-zjrpw\" (UID: \"9e8fe991-f30a-4442-b41c-1e04e82e2fd8\") " pod="opendatahub/kserve-controller-manager-856948b99f-zjrpw" Apr 20 22:29:56.978672 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:56.978657 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtq5f\" (UniqueName: \"kubernetes.io/projected/9e8fe991-f30a-4442-b41c-1e04e82e2fd8-kube-api-access-qtq5f\") pod \"kserve-controller-manager-856948b99f-zjrpw\" (UID: \"9e8fe991-f30a-4442-b41c-1e04e82e2fd8\") " pod="opendatahub/kserve-controller-manager-856948b99f-zjrpw" Apr 20 22:29:56.986564 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:56.986542 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-zjrpw"] Apr 20 22:29:57.079125 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:57.079088 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtq5f\" (UniqueName: \"kubernetes.io/projected/9e8fe991-f30a-4442-b41c-1e04e82e2fd8-kube-api-access-qtq5f\") pod \"kserve-controller-manager-856948b99f-zjrpw\" (UID: \"9e8fe991-f30a-4442-b41c-1e04e82e2fd8\") " pod="opendatahub/kserve-controller-manager-856948b99f-zjrpw" Apr 20 22:29:57.079288 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:57.079135 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e8fe991-f30a-4442-b41c-1e04e82e2fd8-cert\") pod \"kserve-controller-manager-856948b99f-zjrpw\" (UID: \"9e8fe991-f30a-4442-b41c-1e04e82e2fd8\") " pod="opendatahub/kserve-controller-manager-856948b99f-zjrpw" Apr 20 22:29:57.079288 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:29:57.079258 2573 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 20 22:29:57.079366 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:29:57.079317 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e8fe991-f30a-4442-b41c-1e04e82e2fd8-cert podName:9e8fe991-f30a-4442-b41c-1e04e82e2fd8 nodeName:}" failed. No retries permitted until 2026-04-20 22:29:57.579300577 +0000 UTC m=+370.532255174 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9e8fe991-f30a-4442-b41c-1e04e82e2fd8-cert") pod "kserve-controller-manager-856948b99f-zjrpw" (UID: "9e8fe991-f30a-4442-b41c-1e04e82e2fd8") : secret "kserve-webhook-server-cert" not found Apr 20 22:29:57.098584 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:57.098550 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtq5f\" (UniqueName: \"kubernetes.io/projected/9e8fe991-f30a-4442-b41c-1e04e82e2fd8-kube-api-access-qtq5f\") pod \"kserve-controller-manager-856948b99f-zjrpw\" (UID: \"9e8fe991-f30a-4442-b41c-1e04e82e2fd8\") " pod="opendatahub/kserve-controller-manager-856948b99f-zjrpw" Apr 20 22:29:57.583457 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:57.583416 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e8fe991-f30a-4442-b41c-1e04e82e2fd8-cert\") pod \"kserve-controller-manager-856948b99f-zjrpw\" (UID: \"9e8fe991-f30a-4442-b41c-1e04e82e2fd8\") " pod="opendatahub/kserve-controller-manager-856948b99f-zjrpw" Apr 20 22:29:57.585691 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:57.585669 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e8fe991-f30a-4442-b41c-1e04e82e2fd8-cert\") pod \"kserve-controller-manager-856948b99f-zjrpw\" (UID: \"9e8fe991-f30a-4442-b41c-1e04e82e2fd8\") " pod="opendatahub/kserve-controller-manager-856948b99f-zjrpw" Apr 20 22:29:57.587613 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:57.587588 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-zjrpw" Apr 20 22:29:57.932168 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:57.932020 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-zjrpw"] Apr 20 22:29:57.934790 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:29:57.934742 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e8fe991_f30a_4442_b41c_1e04e82e2fd8.slice/crio-9ef25635e8a17667e94aaeabc6cf14c15cee3c24b5d34546924eb99bb9310006 WatchSource:0}: Error finding container 9ef25635e8a17667e94aaeabc6cf14c15cee3c24b5d34546924eb99bb9310006: Status 404 returned error can't find the container with id 9ef25635e8a17667e94aaeabc6cf14c15cee3c24b5d34546924eb99bb9310006 Apr 20 22:29:58.150660 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:29:58.150620 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-zjrpw" event={"ID":"9e8fe991-f30a-4442-b41c-1e04e82e2fd8","Type":"ContainerStarted","Data":"9ef25635e8a17667e94aaeabc6cf14c15cee3c24b5d34546924eb99bb9310006"} Apr 20 22:30:01.162946 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:01.162905 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-zjrpw" event={"ID":"9e8fe991-f30a-4442-b41c-1e04e82e2fd8","Type":"ContainerStarted","Data":"2f9097825671a296aac1e7847024300f8cd56691e76f713311c4f2b0bfae4689"} Apr 20 22:30:01.163378 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:01.162973 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-zjrpw" Apr 20 22:30:01.229594 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:01.229541 2573 pod_startup_latency_tracker.go:104] 
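Editor's note: the mount of "cert" fails at 22:29:57.079 because the secret does not exist yet, the operation is locked out for exactly the logged durationBeforeRetry of 500ms ("No retries permitted until ... 22:29:57.579"), and the retry at 22:29:57.583 succeeds once the secret has appeared. A sketch of that per-operation backoff; only the 500ms initial delay is visible in the log, so the 2x growth factor and the cap are assumptions:

```go
package main

import "time"

// backoff gates retries of one volume operation, as nestedpendingoperations
// does in the entries above.
type backoff struct {
	delay time.Duration
	next  time.Time // "No retries permitted until ..."
}

func (b *backoff) fail(now time.Time) {
	if b.delay == 0 {
		b.delay = 500 * time.Millisecond // matches durationBeforeRetry 500ms
	} else if b.delay < 2*time.Minute { // assumed cap
		b.delay *= 2 // assumed growth factor
	}
	b.next = now.Add(b.delay)
}

func (b *backoff) ready(now time.Time) bool { return !now.Before(b.next) }
```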
"Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-zjrpw" podStartSLOduration=2.595661549 podStartE2EDuration="5.229526608s" podCreationTimestamp="2026-04-20 22:29:56 +0000 UTC" firstStartedPulling="2026-04-20 22:29:57.936204628 +0000 UTC m=+370.889159224" lastFinishedPulling="2026-04-20 22:30:00.570069686 +0000 UTC m=+373.523024283" observedRunningTime="2026-04-20 22:30:01.228238274 +0000 UTC m=+374.181192893" watchObservedRunningTime="2026-04-20 22:30:01.229526608 +0000 UTC m=+374.182481268" Apr 20 22:30:03.398588 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:03.398553 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-jtrrq"] Apr 20 22:30:03.402107 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:03.402085 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jtrrq" Apr 20 22:30:03.405017 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:03.404994 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 20 22:30:03.405144 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:03.404998 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-bsf49\"" Apr 20 22:30:03.405144 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:03.405046 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 20 22:30:03.416480 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:03.416457 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-jtrrq"] Apr 20 22:30:03.425081 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:03.425054 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/aabb8992-960a-4dcf-954c-ccf691a6ecac-operator-config\") pod \"servicemesh-operator3-55f49c5f94-jtrrq\" (UID: \"aabb8992-960a-4dcf-954c-ccf691a6ecac\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-jtrrq" Apr 20 22:30:03.425207 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:03.425135 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcz8h\" (UniqueName: \"kubernetes.io/projected/aabb8992-960a-4dcf-954c-ccf691a6ecac-kube-api-access-pcz8h\") pod \"servicemesh-operator3-55f49c5f94-jtrrq\" (UID: \"aabb8992-960a-4dcf-954c-ccf691a6ecac\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-jtrrq" Apr 20 22:30:03.526129 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:03.526090 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcz8h\" (UniqueName: \"kubernetes.io/projected/aabb8992-960a-4dcf-954c-ccf691a6ecac-kube-api-access-pcz8h\") pod \"servicemesh-operator3-55f49c5f94-jtrrq\" (UID: \"aabb8992-960a-4dcf-954c-ccf691a6ecac\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-jtrrq" Apr 20 22:30:03.526129 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:03.526133 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/aabb8992-960a-4dcf-954c-ccf691a6ecac-operator-config\") pod \"servicemesh-operator3-55f49c5f94-jtrrq\" (UID: 
\"aabb8992-960a-4dcf-954c-ccf691a6ecac\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-jtrrq" Apr 20 22:30:03.528625 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:03.528602 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/aabb8992-960a-4dcf-954c-ccf691a6ecac-operator-config\") pod \"servicemesh-operator3-55f49c5f94-jtrrq\" (UID: \"aabb8992-960a-4dcf-954c-ccf691a6ecac\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-jtrrq" Apr 20 22:30:03.535722 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:03.535695 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcz8h\" (UniqueName: \"kubernetes.io/projected/aabb8992-960a-4dcf-954c-ccf691a6ecac-kube-api-access-pcz8h\") pod \"servicemesh-operator3-55f49c5f94-jtrrq\" (UID: \"aabb8992-960a-4dcf-954c-ccf691a6ecac\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-jtrrq" Apr 20 22:30:03.711703 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:03.711615 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jtrrq" Apr 20 22:30:04.058081 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:04.058056 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-jtrrq"] Apr 20 22:30:04.063325 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:30:04.063290 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaabb8992_960a_4dcf_954c_ccf691a6ecac.slice/crio-0d968063f7346e36b7499aa01b431bef24ff0bc13069e041cda4d4edd1a4bcc5 WatchSource:0}: Error finding container 0d968063f7346e36b7499aa01b431bef24ff0bc13069e041cda4d4edd1a4bcc5: Status 404 returned error can't find the container with id 0d968063f7346e36b7499aa01b431bef24ff0bc13069e041cda4d4edd1a4bcc5 Apr 20 22:30:04.174338 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:04.174303 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jtrrq" event={"ID":"aabb8992-960a-4dcf-954c-ccf691a6ecac","Type":"ContainerStarted","Data":"0d968063f7346e36b7499aa01b431bef24ff0bc13069e041cda4d4edd1a4bcc5"} Apr 20 22:30:11.199684 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:11.199644 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jtrrq" event={"ID":"aabb8992-960a-4dcf-954c-ccf691a6ecac","Type":"ContainerStarted","Data":"e58b011dbe260298c3185ff486b534cb72eefaf01d9c5414593b9a55dceb5f7e"} Apr 20 22:30:11.200103 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:11.199752 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jtrrq" Apr 20 22:30:11.225293 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:11.225244 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jtrrq" podStartSLOduration=2.152393983 podStartE2EDuration="8.225230408s" podCreationTimestamp="2026-04-20 22:30:03 +0000 UTC" firstStartedPulling="2026-04-20 22:30:04.066276905 +0000 UTC m=+377.019231502" lastFinishedPulling="2026-04-20 22:30:10.139113324 +0000 UTC m=+383.092067927" observedRunningTime="2026-04-20 22:30:11.222237857 +0000 UTC m=+384.175192476" watchObservedRunningTime="2026-04-20 22:30:11.225230408 +0000 UTC 
m=+384.178185027" Apr 20 22:30:22.205545 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:22.205510 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jtrrq" Apr 20 22:30:32.172000 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:32.171970 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-zjrpw" Apr 20 22:30:38.251888 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.251840 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4"] Apr 20 22:30:38.256084 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.256061 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.259061 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.259038 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-h9xm5\"" Apr 20 22:30:38.259182 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.259042 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 22:30:38.270234 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.270210 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4"] Apr 20 22:30:38.314744 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.314708 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twbd7\" (UniqueName: \"kubernetes.io/projected/2b080c9d-5122-42a5-bb9a-83082cadae1b-kube-api-access-twbd7\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.314906 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.314759 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2b080c9d-5122-42a5-bb9a-83082cadae1b-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.314906 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.314810 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2b080c9d-5122-42a5-bb9a-83082cadae1b-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.314906 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.314893 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2b080c9d-5122-42a5-bb9a-83082cadae1b-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.315059 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.314912 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2b080c9d-5122-42a5-bb9a-83082cadae1b-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.315059 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.314928 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2b080c9d-5122-42a5-bb9a-83082cadae1b-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.315059 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.314960 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2b080c9d-5122-42a5-bb9a-83082cadae1b-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.315059 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.315030 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2b080c9d-5122-42a5-bb9a-83082cadae1b-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.315231 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.315135 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2b080c9d-5122-42a5-bb9a-83082cadae1b-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.415987 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.415950 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2b080c9d-5122-42a5-bb9a-83082cadae1b-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.415987 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.415992 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twbd7\" (UniqueName: \"kubernetes.io/projected/2b080c9d-5122-42a5-bb9a-83082cadae1b-kube-api-access-twbd7\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.416199 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.416015 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2b080c9d-5122-42a5-bb9a-83082cadae1b-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.416199 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.416039 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2b080c9d-5122-42a5-bb9a-83082cadae1b-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.416199 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.416070 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2b080c9d-5122-42a5-bb9a-83082cadae1b-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.416199 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.416093 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2b080c9d-5122-42a5-bb9a-83082cadae1b-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.416199 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.416116 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2b080c9d-5122-42a5-bb9a-83082cadae1b-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.416440 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.416384 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2b080c9d-5122-42a5-bb9a-83082cadae1b-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.416498 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.416445 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2b080c9d-5122-42a5-bb9a-83082cadae1b-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.416556 ip-10-0-130-91 kubenswrapper[2573]: I0420 
22:30:38.416538 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/2b080c9d-5122-42a5-bb9a-83082cadae1b-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.416681 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.416657 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/2b080c9d-5122-42a5-bb9a-83082cadae1b-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.416785 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.416760 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/2b080c9d-5122-42a5-bb9a-83082cadae1b-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.416866 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.416794 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/2b080c9d-5122-42a5-bb9a-83082cadae1b-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.417299 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.417278 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/2b080c9d-5122-42a5-bb9a-83082cadae1b-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.418686 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.418660 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/2b080c9d-5122-42a5-bb9a-83082cadae1b-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.419155 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.419134 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2b080c9d-5122-42a5-bb9a-83082cadae1b-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.427382 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.427356 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twbd7\" (UniqueName: \"kubernetes.io/projected/2b080c9d-5122-42a5-bb9a-83082cadae1b-kube-api-access-twbd7\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.429358 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.429335 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/2b080c9d-5122-42a5-bb9a-83082cadae1b-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4\" (UID: \"2b080c9d-5122-42a5-bb9a-83082cadae1b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.566207 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.566165 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:38.714123 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:38.714097 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4"] Apr 20 22:30:38.716615 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:30:38.716591 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b080c9d_5122_42a5_bb9a_83082cadae1b.slice/crio-42efb9edd5d52bcbce1500b6cb609d4e271c4439327162e8f09efda84d70d545 WatchSource:0}: Error finding container 42efb9edd5d52bcbce1500b6cb609d4e271c4439327162e8f09efda84d70d545: Status 404 returned error can't find the container with id 42efb9edd5d52bcbce1500b6cb609d4e271c4439327162e8f09efda84d70d545 Apr 20 22:30:39.296612 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:39.296575 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" event={"ID":"2b080c9d-5122-42a5-bb9a-83082cadae1b","Type":"ContainerStarted","Data":"42efb9edd5d52bcbce1500b6cb609d4e271c4439327162e8f09efda84d70d545"} Apr 20 22:30:41.116683 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:41.116645 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 20 22:30:41.116996 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:41.116715 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 20 22:30:41.116996 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:41.116744 2573 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 20 22:30:41.304946 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:41.304899 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" event={"ID":"2b080c9d-5122-42a5-bb9a-83082cadae1b","Type":"ContainerStarted","Data":"b3015f4d98191d3905bcbaa6569e2882784827fe1bbac20ecc89e70c69e59792"} Apr 20 22:30:41.333431 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:41.333386 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" podStartSLOduration=0.935377914 podStartE2EDuration="3.333372245s" 
podCreationTimestamp="2026-04-20 22:30:38 +0000 UTC" firstStartedPulling="2026-04-20 22:30:38.718402713 +0000 UTC m=+411.671357310" lastFinishedPulling="2026-04-20 22:30:41.116397035 +0000 UTC m=+414.069351641" observedRunningTime="2026-04-20 22:30:41.330640025 +0000 UTC m=+414.283594645" watchObservedRunningTime="2026-04-20 22:30:41.333372245 +0000 UTC m=+414.286326864" Apr 20 22:30:41.566940 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:41.566904 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:42.570647 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:42.570618 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:43.312208 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:43.312179 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:43.313113 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:43.313089 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4" Apr 20 22:30:48.781101 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:48.781064 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-cbkgs"] Apr 20 22:30:48.783265 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:48.783249 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-cbkgs" Apr 20 22:30:48.786264 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:48.786242 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 22:30:48.786398 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:48.786290 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 22:30:48.787414 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:48.787395 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-mztdq\"" Apr 20 22:30:48.800022 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:48.799999 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-cbkgs"] Apr 20 22:30:48.903916 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:48.903880 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg7hc\" (UniqueName: \"kubernetes.io/projected/874cc89d-2c43-4380-8b01-b02a3ae56c66-kube-api-access-rg7hc\") pod \"kuadrant-operator-catalog-cbkgs\" (UID: \"874cc89d-2c43-4380-8b01-b02a3ae56c66\") " pod="kuadrant-system/kuadrant-operator-catalog-cbkgs" Apr 20 22:30:49.005290 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:49.005255 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rg7hc\" (UniqueName: \"kubernetes.io/projected/874cc89d-2c43-4380-8b01-b02a3ae56c66-kube-api-access-rg7hc\") pod \"kuadrant-operator-catalog-cbkgs\" (UID: \"874cc89d-2c43-4380-8b01-b02a3ae56c66\") " pod="kuadrant-system/kuadrant-operator-catalog-cbkgs" Apr 20 22:30:49.022673 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:49.022649 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg7hc\" (UniqueName: \"kubernetes.io/projected/874cc89d-2c43-4380-8b01-b02a3ae56c66-kube-api-access-rg7hc\") pod \"kuadrant-operator-catalog-cbkgs\" (UID: \"874cc89d-2c43-4380-8b01-b02a3ae56c66\") " pod="kuadrant-system/kuadrant-operator-catalog-cbkgs" Apr 20 22:30:49.092720 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:49.092658 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-cbkgs" Apr 20 22:30:49.108755 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:49.108730 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-cbkgs"] Apr 20 22:30:49.223809 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:49.223784 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-cbkgs"] Apr 20 22:30:49.226127 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:30:49.226094 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod874cc89d_2c43_4380_8b01_b02a3ae56c66.slice/crio-faf05ba8aedb6aa3b35c2771e687b45a0094474f7d749a1fb154fd3ea4c451ef WatchSource:0}: Error finding container faf05ba8aedb6aa3b35c2771e687b45a0094474f7d749a1fb154fd3ea4c451ef: Status 404 returned error can't find the container with id faf05ba8aedb6aa3b35c2771e687b45a0094474f7d749a1fb154fd3ea4c451ef Apr 20 22:30:49.321486 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:49.321452 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-xjm2q"] Apr 20 22:30:49.324149 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:49.324129 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-xjm2q" Apr 20 22:30:49.334015 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:49.333986 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-cbkgs" event={"ID":"874cc89d-2c43-4380-8b01-b02a3ae56c66","Type":"ContainerStarted","Data":"faf05ba8aedb6aa3b35c2771e687b45a0094474f7d749a1fb154fd3ea4c451ef"} Apr 20 22:30:49.335066 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:49.335046 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-xjm2q"] Apr 20 22:30:49.408969 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:49.408881 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqx6w\" (UniqueName: \"kubernetes.io/projected/f9614d14-fcc6-4518-8ae9-a96316e3b111-kube-api-access-gqx6w\") pod \"kuadrant-operator-catalog-xjm2q\" (UID: \"f9614d14-fcc6-4518-8ae9-a96316e3b111\") " pod="kuadrant-system/kuadrant-operator-catalog-xjm2q" Apr 20 22:30:49.510080 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:49.510042 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqx6w\" (UniqueName: \"kubernetes.io/projected/f9614d14-fcc6-4518-8ae9-a96316e3b111-kube-api-access-gqx6w\") pod \"kuadrant-operator-catalog-xjm2q\" (UID: \"f9614d14-fcc6-4518-8ae9-a96316e3b111\") " pod="kuadrant-system/kuadrant-operator-catalog-xjm2q" Apr 20 22:30:49.527283 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:49.527246 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqx6w\" (UniqueName: \"kubernetes.io/projected/f9614d14-fcc6-4518-8ae9-a96316e3b111-kube-api-access-gqx6w\") pod \"kuadrant-operator-catalog-xjm2q\" (UID: \"f9614d14-fcc6-4518-8ae9-a96316e3b111\") " pod="kuadrant-system/kuadrant-operator-catalog-xjm2q" Apr 20 22:30:49.634524 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:49.634489 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-xjm2q" Apr 20 22:30:49.783615 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:49.783586 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-xjm2q"] Apr 20 22:30:49.829771 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:30:49.829733 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9614d14_fcc6_4518_8ae9_a96316e3b111.slice/crio-d262fc6e61166c0500ad21ade040b925cb457faf39f281a7447c54bc960f9c8c WatchSource:0}: Error finding container d262fc6e61166c0500ad21ade040b925cb457faf39f281a7447c54bc960f9c8c: Status 404 returned error can't find the container with id d262fc6e61166c0500ad21ade040b925cb457faf39f281a7447c54bc960f9c8c Apr 20 22:30:50.343462 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:50.343412 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-xjm2q" event={"ID":"f9614d14-fcc6-4518-8ae9-a96316e3b111","Type":"ContainerStarted","Data":"d262fc6e61166c0500ad21ade040b925cb457faf39f281a7447c54bc960f9c8c"} Apr 20 22:30:52.352323 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:52.352282 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-xjm2q" event={"ID":"f9614d14-fcc6-4518-8ae9-a96316e3b111","Type":"ContainerStarted","Data":"87331248362ff6328195d6c7f8d0f9460a3df599cb2f0c4207c901030e638d3d"} Apr 20 22:30:52.353599 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:52.353576 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-cbkgs" event={"ID":"874cc89d-2c43-4380-8b01-b02a3ae56c66","Type":"ContainerStarted","Data":"b1c5385ed21cba063e2e75aa907f127bf3d0625fa813aab9b08ea1562c30f2d4"} Apr 20 22:30:52.353694 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:52.353623 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-cbkgs" podUID="874cc89d-2c43-4380-8b01-b02a3ae56c66" containerName="registry-server" containerID="cri-o://b1c5385ed21cba063e2e75aa907f127bf3d0625fa813aab9b08ea1562c30f2d4" gracePeriod=2 Apr 20 22:30:52.387967 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:52.387921 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-xjm2q" podStartSLOduration=1.92154917 podStartE2EDuration="3.387907038s" podCreationTimestamp="2026-04-20 22:30:49 +0000 UTC" firstStartedPulling="2026-04-20 22:30:49.831211968 +0000 UTC m=+422.784166566" lastFinishedPulling="2026-04-20 22:30:51.297569835 +0000 UTC m=+424.250524434" observedRunningTime="2026-04-20 22:30:52.3843376 +0000 UTC m=+425.337292219" watchObservedRunningTime="2026-04-20 22:30:52.387907038 +0000 UTC m=+425.340861657" Apr 20 22:30:52.413753 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:52.413702 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-cbkgs" podStartSLOduration=2.346657343 podStartE2EDuration="4.413685517s" podCreationTimestamp="2026-04-20 22:30:48 +0000 UTC" firstStartedPulling="2026-04-20 22:30:49.227495463 +0000 UTC m=+422.180450064" lastFinishedPulling="2026-04-20 22:30:51.294523639 +0000 UTC m=+424.247478238" observedRunningTime="2026-04-20 22:30:52.413439296 +0000 UTC m=+425.366393926" watchObservedRunningTime="2026-04-20 22:30:52.413685517 +0000 UTC m=+425.366640134" Apr 20 
22:30:52.588970 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:52.588941 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-cbkgs" Apr 20 22:30:52.739079 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:52.738993 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg7hc\" (UniqueName: \"kubernetes.io/projected/874cc89d-2c43-4380-8b01-b02a3ae56c66-kube-api-access-rg7hc\") pod \"874cc89d-2c43-4380-8b01-b02a3ae56c66\" (UID: \"874cc89d-2c43-4380-8b01-b02a3ae56c66\") " Apr 20 22:30:52.741195 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:52.741172 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/874cc89d-2c43-4380-8b01-b02a3ae56c66-kube-api-access-rg7hc" (OuterVolumeSpecName: "kube-api-access-rg7hc") pod "874cc89d-2c43-4380-8b01-b02a3ae56c66" (UID: "874cc89d-2c43-4380-8b01-b02a3ae56c66"). InnerVolumeSpecName "kube-api-access-rg7hc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:30:52.840609 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:52.840582 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rg7hc\" (UniqueName: \"kubernetes.io/projected/874cc89d-2c43-4380-8b01-b02a3ae56c66-kube-api-access-rg7hc\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\"" Apr 20 22:30:53.357626 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:53.357587 2573 generic.go:358] "Generic (PLEG): container finished" podID="874cc89d-2c43-4380-8b01-b02a3ae56c66" containerID="b1c5385ed21cba063e2e75aa907f127bf3d0625fa813aab9b08ea1562c30f2d4" exitCode=0 Apr 20 22:30:53.358059 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:53.357637 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-cbkgs" Apr 20 22:30:53.358059 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:53.357668 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-cbkgs" event={"ID":"874cc89d-2c43-4380-8b01-b02a3ae56c66","Type":"ContainerDied","Data":"b1c5385ed21cba063e2e75aa907f127bf3d0625fa813aab9b08ea1562c30f2d4"} Apr 20 22:30:53.358059 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:53.357708 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-cbkgs" event={"ID":"874cc89d-2c43-4380-8b01-b02a3ae56c66","Type":"ContainerDied","Data":"faf05ba8aedb6aa3b35c2771e687b45a0094474f7d749a1fb154fd3ea4c451ef"} Apr 20 22:30:53.358059 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:53.357727 2573 scope.go:117] "RemoveContainer" containerID="b1c5385ed21cba063e2e75aa907f127bf3d0625fa813aab9b08ea1562c30f2d4" Apr 20 22:30:53.366323 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:53.366289 2573 scope.go:117] "RemoveContainer" containerID="b1c5385ed21cba063e2e75aa907f127bf3d0625fa813aab9b08ea1562c30f2d4" Apr 20 22:30:53.366549 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:30:53.366531 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1c5385ed21cba063e2e75aa907f127bf3d0625fa813aab9b08ea1562c30f2d4\": container with ID starting with b1c5385ed21cba063e2e75aa907f127bf3d0625fa813aab9b08ea1562c30f2d4 not found: ID does not exist" containerID="b1c5385ed21cba063e2e75aa907f127bf3d0625fa813aab9b08ea1562c30f2d4" Apr 20 22:30:53.366621 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:53.366562 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1c5385ed21cba063e2e75aa907f127bf3d0625fa813aab9b08ea1562c30f2d4"} err="failed to get container status \"b1c5385ed21cba063e2e75aa907f127bf3d0625fa813aab9b08ea1562c30f2d4\": rpc error: code = NotFound desc = could not find container \"b1c5385ed21cba063e2e75aa907f127bf3d0625fa813aab9b08ea1562c30f2d4\": container with ID starting with b1c5385ed21cba063e2e75aa907f127bf3d0625fa813aab9b08ea1562c30f2d4 not found: ID does not exist" Apr 20 22:30:53.385111 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:53.385077 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-cbkgs"] Apr 20 22:30:53.393638 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:53.393618 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-cbkgs"] Apr 20 22:30:53.600770 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:53.600741 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="874cc89d-2c43-4380-8b01-b02a3ae56c66" path="/var/lib/kubelet/pods/874cc89d-2c43-4380-8b01-b02a3ae56c66/volumes" Apr 20 22:30:59.634670 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:59.634639 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-xjm2q" Apr 20 22:30:59.634670 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:59.634680 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-xjm2q" Apr 20 22:30:59.655995 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:30:59.655972 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-xjm2q" Apr 20 22:31:00.402341 
ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:00.402314 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-xjm2q" Apr 20 22:31:18.111092 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:18.111009 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-nb7xc"] Apr 20 22:31:18.111470 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:18.111350 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="874cc89d-2c43-4380-8b01-b02a3ae56c66" containerName="registry-server" Apr 20 22:31:18.111470 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:18.111361 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="874cc89d-2c43-4380-8b01-b02a3ae56c66" containerName="registry-server" Apr 20 22:31:18.111470 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:18.111427 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="874cc89d-2c43-4380-8b01-b02a3ae56c66" containerName="registry-server" Apr 20 22:31:18.119095 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:18.119076 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-nb7xc" Apr 20 22:31:18.124497 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:18.124476 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-7sk94\"" Apr 20 22:31:18.129285 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:18.129267 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7b7p\" (UniqueName: \"kubernetes.io/projected/b43b525d-634f-421c-9fc2-19aed38025cd-kube-api-access-w7b7p\") pod \"authorino-operator-657f44b778-nb7xc\" (UID: \"b43b525d-634f-421c-9fc2-19aed38025cd\") " pod="kuadrant-system/authorino-operator-657f44b778-nb7xc" Apr 20 22:31:18.195576 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:18.195548 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-nb7xc"] Apr 20 22:31:18.230534 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:18.230506 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7b7p\" (UniqueName: \"kubernetes.io/projected/b43b525d-634f-421c-9fc2-19aed38025cd-kube-api-access-w7b7p\") pod \"authorino-operator-657f44b778-nb7xc\" (UID: \"b43b525d-634f-421c-9fc2-19aed38025cd\") " pod="kuadrant-system/authorino-operator-657f44b778-nb7xc" Apr 20 22:31:18.261904 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:18.261879 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7b7p\" (UniqueName: \"kubernetes.io/projected/b43b525d-634f-421c-9fc2-19aed38025cd-kube-api-access-w7b7p\") pod \"authorino-operator-657f44b778-nb7xc\" (UID: \"b43b525d-634f-421c-9fc2-19aed38025cd\") " pod="kuadrant-system/authorino-operator-657f44b778-nb7xc" Apr 20 22:31:18.429379 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:18.429297 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-nb7xc" Apr 20 22:31:18.583556 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:18.583524 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-nb7xc"] Apr 20 22:31:18.586842 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:31:18.586811 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb43b525d_634f_421c_9fc2_19aed38025cd.slice/crio-5c4cf5de3a59a9a469e2f6deabe81f13a3470f07f4a8bc01be7e814211950c23 WatchSource:0}: Error finding container 5c4cf5de3a59a9a469e2f6deabe81f13a3470f07f4a8bc01be7e814211950c23: Status 404 returned error can't find the container with id 5c4cf5de3a59a9a469e2f6deabe81f13a3470f07f4a8bc01be7e814211950c23 Apr 20 22:31:19.453307 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:19.453261 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-nb7xc" event={"ID":"b43b525d-634f-421c-9fc2-19aed38025cd","Type":"ContainerStarted","Data":"5c4cf5de3a59a9a469e2f6deabe81f13a3470f07f4a8bc01be7e814211950c23"} Apr 20 22:31:20.459225 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:20.459189 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-nb7xc" event={"ID":"b43b525d-634f-421c-9fc2-19aed38025cd","Type":"ContainerStarted","Data":"b86bd90c5183cd1ad9731eb4c9713e4546112dfc966775c3c636a3d031829b48"} Apr 20 22:31:20.459614 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:20.459333 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-nb7xc" Apr 20 22:31:20.497430 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:20.497382 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-nb7xc" podStartSLOduration=0.764457229 podStartE2EDuration="2.497368462s" podCreationTimestamp="2026-04-20 22:31:18 +0000 UTC" firstStartedPulling="2026-04-20 22:31:18.588900141 +0000 UTC m=+451.541854752" lastFinishedPulling="2026-04-20 22:31:20.321811371 +0000 UTC m=+453.274765985" observedRunningTime="2026-04-20 22:31:20.495148158 +0000 UTC m=+453.448102776" watchObservedRunningTime="2026-04-20 22:31:20.497368462 +0000 UTC m=+453.450323082" Apr 20 22:31:20.805471 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:20.805435 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-zbj5h"] Apr 20 22:31:20.808835 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:20.808817 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zbj5h" Apr 20 22:31:20.814823 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:20.814802 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-mm87t\"" Apr 20 22:31:20.818805 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:20.818785 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 20 22:31:20.851347 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:20.851305 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h5xv\" (UniqueName: \"kubernetes.io/projected/97af97ae-77b9-4101-83c3-59bc855f235e-kube-api-access-9h5xv\") pod \"dns-operator-controller-manager-648d5c98bc-zbj5h\" (UID: \"97af97ae-77b9-4101-83c3-59bc855f235e\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zbj5h" Apr 20 22:31:20.855838 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:20.855810 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-zbj5h"] Apr 20 22:31:20.952038 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:20.952002 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9h5xv\" (UniqueName: \"kubernetes.io/projected/97af97ae-77b9-4101-83c3-59bc855f235e-kube-api-access-9h5xv\") pod \"dns-operator-controller-manager-648d5c98bc-zbj5h\" (UID: \"97af97ae-77b9-4101-83c3-59bc855f235e\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zbj5h" Apr 20 22:31:20.973365 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:20.973333 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h5xv\" (UniqueName: \"kubernetes.io/projected/97af97ae-77b9-4101-83c3-59bc855f235e-kube-api-access-9h5xv\") pod \"dns-operator-controller-manager-648d5c98bc-zbj5h\" (UID: \"97af97ae-77b9-4101-83c3-59bc855f235e\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zbj5h" Apr 20 22:31:21.119370 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:21.119276 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zbj5h" Apr 20 22:31:21.274326 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:21.274271 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-zbj5h"] Apr 20 22:31:21.277066 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:31:21.277031 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97af97ae_77b9_4101_83c3_59bc855f235e.slice/crio-268075ff1aed2401ccc96c73383fdedbfda61032000bc6831629c121bbda0256 WatchSource:0}: Error finding container 268075ff1aed2401ccc96c73383fdedbfda61032000bc6831629c121bbda0256: Status 404 returned error can't find the container with id 268075ff1aed2401ccc96c73383fdedbfda61032000bc6831629c121bbda0256 Apr 20 22:31:21.463934 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:21.463813 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zbj5h" event={"ID":"97af97ae-77b9-4101-83c3-59bc855f235e","Type":"ContainerStarted","Data":"268075ff1aed2401ccc96c73383fdedbfda61032000bc6831629c121bbda0256"} Apr 20 22:31:24.476315 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:24.476233 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zbj5h" event={"ID":"97af97ae-77b9-4101-83c3-59bc855f235e","Type":"ContainerStarted","Data":"8a8644685c7cb3985037606ef0b6926c33ae4de9dda44f50e9c9a05c3d6dd8ba"} Apr 20 22:31:24.476658 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:24.476357 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zbj5h" Apr 20 22:31:25.350436 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:25.350384 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zbj5h" podStartSLOduration=2.73510265 podStartE2EDuration="5.35036665s" podCreationTimestamp="2026-04-20 22:31:20 +0000 UTC" firstStartedPulling="2026-04-20 22:31:21.279047277 +0000 UTC m=+454.232001874" lastFinishedPulling="2026-04-20 22:31:23.894311265 +0000 UTC m=+456.847265874" observedRunningTime="2026-04-20 22:31:24.561466967 +0000 UTC m=+457.514421585" watchObservedRunningTime="2026-04-20 22:31:25.35036665 +0000 UTC m=+458.303321269" Apr 20 22:31:25.351441 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:25.351423 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz"] Apr 20 22:31:25.356409 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:25.356172 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz" Apr 20 22:31:25.360059 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:25.360039 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-zbxz8\"" Apr 20 22:31:25.382438 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:25.382412 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5s59\" (UniqueName: \"kubernetes.io/projected/a3780854-4f14-493e-8b2c-dc3127bbad7c-kube-api-access-k5s59\") pod \"limitador-operator-controller-manager-85c4996f8c-xjvcz\" (UID: \"a3780854-4f14-493e-8b2c-dc3127bbad7c\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz" Apr 20 22:31:25.389586 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:25.389562 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz"] Apr 20 22:31:25.483117 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:25.483082 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5s59\" (UniqueName: \"kubernetes.io/projected/a3780854-4f14-493e-8b2c-dc3127bbad7c-kube-api-access-k5s59\") pod \"limitador-operator-controller-manager-85c4996f8c-xjvcz\" (UID: \"a3780854-4f14-493e-8b2c-dc3127bbad7c\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz" Apr 20 22:31:25.505465 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:25.505432 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5s59\" (UniqueName: \"kubernetes.io/projected/a3780854-4f14-493e-8b2c-dc3127bbad7c-kube-api-access-k5s59\") pod \"limitador-operator-controller-manager-85c4996f8c-xjvcz\" (UID: \"a3780854-4f14-493e-8b2c-dc3127bbad7c\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz" Apr 20 22:31:25.668320 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:25.668216 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz" Apr 20 22:31:25.849204 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:25.849179 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz"] Apr 20 22:31:25.851476 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:31:25.851447 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3780854_4f14_493e_8b2c_dc3127bbad7c.slice/crio-3107fbe171cd7c730ced38eb82a6cdbc3c4ae29313d4f5dacb7d9777be61c05a WatchSource:0}: Error finding container 3107fbe171cd7c730ced38eb82a6cdbc3c4ae29313d4f5dacb7d9777be61c05a: Status 404 returned error can't find the container with id 3107fbe171cd7c730ced38eb82a6cdbc3c4ae29313d4f5dacb7d9777be61c05a Apr 20 22:31:26.483911 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:26.483879 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz" event={"ID":"a3780854-4f14-493e-8b2c-dc3127bbad7c","Type":"ContainerStarted","Data":"3107fbe171cd7c730ced38eb82a6cdbc3c4ae29313d4f5dacb7d9777be61c05a"} Apr 20 22:31:28.492365 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:28.492330 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz" event={"ID":"a3780854-4f14-493e-8b2c-dc3127bbad7c","Type":"ContainerStarted","Data":"59d79c61356af79d2c940a081e4e180c6444efa3fb428312034f2c8341778ba1"} Apr 20 22:31:28.492735 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:28.492436 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz" Apr 20 22:31:28.539547 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:28.539502 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz" podStartSLOduration=1.567368394 podStartE2EDuration="3.539487329s" podCreationTimestamp="2026-04-20 22:31:25 +0000 UTC" firstStartedPulling="2026-04-20 22:31:25.853376677 +0000 UTC m=+458.806331274" lastFinishedPulling="2026-04-20 22:31:27.825495604 +0000 UTC m=+460.778450209" observedRunningTime="2026-04-20 22:31:28.538592016 +0000 UTC m=+461.491546635" watchObservedRunningTime="2026-04-20 22:31:28.539487329 +0000 UTC m=+461.492441948" Apr 20 22:31:30.878759 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:30.878725 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh"] Apr 20 22:31:30.882247 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:30.882231 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" Apr 20 22:31:30.887202 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:30.887184 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-st7w6\"" Apr 20 22:31:30.916793 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:30.916757 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh"] Apr 20 22:31:30.927681 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:30.927651 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/370e1b6d-23d3-4e9d-a133-4617676d7869-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vwhwh\" (UID: \"370e1b6d-23d3-4e9d-a133-4617676d7869\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" Apr 20 22:31:30.927817 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:30.927697 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44cmv\" (UniqueName: \"kubernetes.io/projected/370e1b6d-23d3-4e9d-a133-4617676d7869-kube-api-access-44cmv\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vwhwh\" (UID: \"370e1b6d-23d3-4e9d-a133-4617676d7869\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" Apr 20 22:31:31.028480 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:31.028451 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/370e1b6d-23d3-4e9d-a133-4617676d7869-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vwhwh\" (UID: \"370e1b6d-23d3-4e9d-a133-4617676d7869\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" Apr 20 22:31:31.028480 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:31.028487 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44cmv\" (UniqueName: \"kubernetes.io/projected/370e1b6d-23d3-4e9d-a133-4617676d7869-kube-api-access-44cmv\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vwhwh\" (UID: \"370e1b6d-23d3-4e9d-a133-4617676d7869\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" Apr 20 22:31:31.028817 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:31.028799 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/370e1b6d-23d3-4e9d-a133-4617676d7869-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vwhwh\" (UID: \"370e1b6d-23d3-4e9d-a133-4617676d7869\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" Apr 20 22:31:31.053070 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:31.053039 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44cmv\" (UniqueName: \"kubernetes.io/projected/370e1b6d-23d3-4e9d-a133-4617676d7869-kube-api-access-44cmv\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-vwhwh\" (UID: \"370e1b6d-23d3-4e9d-a133-4617676d7869\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" Apr 20 22:31:31.191928 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:31.191818 2573 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" Apr 20 22:31:31.358351 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:31.358321 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh"] Apr 20 22:31:31.361917 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:31:31.361885 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod370e1b6d_23d3_4e9d_a133_4617676d7869.slice/crio-af0aeb42b1970fe7f450a008be217c9c24d29fa1b2f20292ef58a3dd347a3141 WatchSource:0}: Error finding container af0aeb42b1970fe7f450a008be217c9c24d29fa1b2f20292ef58a3dd347a3141: Status 404 returned error can't find the container with id af0aeb42b1970fe7f450a008be217c9c24d29fa1b2f20292ef58a3dd347a3141 Apr 20 22:31:31.466582 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:31.466500 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-nb7xc" Apr 20 22:31:31.504779 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:31.504742 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" event={"ID":"370e1b6d-23d3-4e9d-a133-4617676d7869","Type":"ContainerStarted","Data":"af0aeb42b1970fe7f450a008be217c9c24d29fa1b2f20292ef58a3dd347a3141"} Apr 20 22:31:35.483144 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:35.483112 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-zbj5h" Apr 20 22:31:36.527344 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:36.527310 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" event={"ID":"370e1b6d-23d3-4e9d-a133-4617676d7869","Type":"ContainerStarted","Data":"1b3a1d7de3e2ef7a44610b857b4df54f6bbc512eb29ae1049f7879f0b958a384"} Apr 20 22:31:36.527710 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:36.527367 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" Apr 20 22:31:36.581963 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:36.581912 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" podStartSLOduration=1.530128237 podStartE2EDuration="6.581896884s" podCreationTimestamp="2026-04-20 22:31:30 +0000 UTC" firstStartedPulling="2026-04-20 22:31:31.364505854 +0000 UTC m=+464.317460451" lastFinishedPulling="2026-04-20 22:31:36.41627449 +0000 UTC m=+469.369229098" observedRunningTime="2026-04-20 22:31:36.578813932 +0000 UTC m=+469.531768551" watchObservedRunningTime="2026-04-20 22:31:36.581896884 +0000 UTC m=+469.534851502" Apr 20 22:31:39.498207 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:39.498172 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz" Apr 20 22:31:47.534234 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:47.534199 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" Apr 20 22:31:49.743222 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.743176 2573 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh"] Apr 20 22:31:49.743712 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.743423 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" podUID="370e1b6d-23d3-4e9d-a133-4617676d7869" containerName="manager" containerID="cri-o://1b3a1d7de3e2ef7a44610b857b4df54f6bbc512eb29ae1049f7879f0b958a384" gracePeriod=2 Apr 20 22:31:49.753010 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.752976 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh"] Apr 20 22:31:49.769382 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.769356 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz"] Apr 20 22:31:49.769625 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.769600 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz" podUID="a3780854-4f14-493e-8b2c-dc3127bbad7c" containerName="manager" containerID="cri-o://59d79c61356af79d2c940a081e4e180c6444efa3fb428312034f2c8341778ba1" gracePeriod=2 Apr 20 22:31:49.788629 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.788597 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz"] Apr 20 22:31:49.789838 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.789793 2573 status_manager.go:895] "Failed to get status for pod" podUID="370e1b6d-23d3-4e9d-a133-4617676d7869" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vwhwh\" is forbidden: User \"system:node:ip-10-0-130-91.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-91.ec2.internal' and this object" Apr 20 22:31:49.790393 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.790365 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh"] Apr 20 22:31:49.790743 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.790719 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3780854-4f14-493e-8b2c-dc3127bbad7c" containerName="manager" Apr 20 22:31:49.790743 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.790737 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3780854-4f14-493e-8b2c-dc3127bbad7c" containerName="manager" Apr 20 22:31:49.790929 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.790765 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="370e1b6d-23d3-4e9d-a133-4617676d7869" containerName="manager" Apr 20 22:31:49.790929 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.790771 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="370e1b6d-23d3-4e9d-a133-4617676d7869" containerName="manager" Apr 20 22:31:49.790929 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.790842 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="370e1b6d-23d3-4e9d-a133-4617676d7869" containerName="manager" Apr 20 22:31:49.790929 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.790874 2573 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="a3780854-4f14-493e-8b2c-dc3127bbad7c" containerName="manager" Apr 20 22:31:49.793723 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.793706 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh" Apr 20 22:31:49.811498 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.811469 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh"] Apr 20 22:31:49.832076 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.832052 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j5dk5"] Apr 20 22:31:49.835367 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.835348 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j5dk5" Apr 20 22:31:49.859907 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.859833 2573 status_manager.go:895] "Failed to get status for pod" podUID="370e1b6d-23d3-4e9d-a133-4617676d7869" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vwhwh\" is forbidden: User \"system:node:ip-10-0-130-91.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-91.ec2.internal' and this object" Apr 20 22:31:49.862154 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.862130 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j5dk5"] Apr 20 22:31:49.862391 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.862362 2573 status_manager.go:895] "Failed to get status for pod" podUID="370e1b6d-23d3-4e9d-a133-4617676d7869" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vwhwh\" is forbidden: User \"system:node:ip-10-0-130-91.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-91.ec2.internal' and this object" Apr 20 22:31:49.890349 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.890321 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz649\" (UniqueName: \"kubernetes.io/projected/0cb057dc-2749-4a27-8f80-66b116ce4aea-kube-api-access-kz649\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-mnbrh\" (UID: \"0cb057dc-2749-4a27-8f80-66b116ce4aea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh" Apr 20 22:31:49.890440 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.890405 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0cb057dc-2749-4a27-8f80-66b116ce4aea-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-mnbrh\" (UID: \"0cb057dc-2749-4a27-8f80-66b116ce4aea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh" Apr 20 22:31:49.890507 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.890464 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mljvd\" (UniqueName: 
\"kubernetes.io/projected/f246becd-26e3-46c7-92a3-af2a74f5c1be-kube-api-access-mljvd\") pod \"limitador-operator-controller-manager-85c4996f8c-j5dk5\" (UID: \"f246becd-26e3-46c7-92a3-af2a74f5c1be\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j5dk5" Apr 20 22:31:49.991425 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.991397 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0cb057dc-2749-4a27-8f80-66b116ce4aea-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-mnbrh\" (UID: \"0cb057dc-2749-4a27-8f80-66b116ce4aea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh" Apr 20 22:31:49.991526 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.991434 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mljvd\" (UniqueName: \"kubernetes.io/projected/f246becd-26e3-46c7-92a3-af2a74f5c1be-kube-api-access-mljvd\") pod \"limitador-operator-controller-manager-85c4996f8c-j5dk5\" (UID: \"f246becd-26e3-46c7-92a3-af2a74f5c1be\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j5dk5" Apr 20 22:31:49.991526 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.991481 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kz649\" (UniqueName: \"kubernetes.io/projected/0cb057dc-2749-4a27-8f80-66b116ce4aea-kube-api-access-kz649\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-mnbrh\" (UID: \"0cb057dc-2749-4a27-8f80-66b116ce4aea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh" Apr 20 22:31:49.991832 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:49.991808 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0cb057dc-2749-4a27-8f80-66b116ce4aea-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-mnbrh\" (UID: \"0cb057dc-2749-4a27-8f80-66b116ce4aea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh" Apr 20 22:31:50.006167 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.006140 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz649\" (UniqueName: \"kubernetes.io/projected/0cb057dc-2749-4a27-8f80-66b116ce4aea-kube-api-access-kz649\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-mnbrh\" (UID: \"0cb057dc-2749-4a27-8f80-66b116ce4aea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh" Apr 20 22:31:50.006400 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.006378 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mljvd\" (UniqueName: \"kubernetes.io/projected/f246becd-26e3-46c7-92a3-af2a74f5c1be-kube-api-access-mljvd\") pod \"limitador-operator-controller-manager-85c4996f8c-j5dk5\" (UID: \"f246becd-26e3-46c7-92a3-af2a74f5c1be\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j5dk5" Apr 20 22:31:50.009846 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.009830 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" Apr 20 22:31:50.013203 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.013187 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz" Apr 20 22:31:50.013398 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.013377 2573 status_manager.go:895] "Failed to get status for pod" podUID="370e1b6d-23d3-4e9d-a133-4617676d7869" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vwhwh\" is forbidden: User \"system:node:ip-10-0-130-91.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-91.ec2.internal' and this object" Apr 20 22:31:50.016210 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.016188 2573 status_manager.go:895] "Failed to get status for pod" podUID="a3780854-4f14-493e-8b2c-dc3127bbad7c" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz" err="pods \"limitador-operator-controller-manager-85c4996f8c-xjvcz\" is forbidden: User \"system:node:ip-10-0-130-91.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-91.ec2.internal' and this object" Apr 20 22:31:50.018314 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.018295 2573 status_manager.go:895] "Failed to get status for pod" podUID="370e1b6d-23d3-4e9d-a133-4617676d7869" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vwhwh\" is forbidden: User \"system:node:ip-10-0-130-91.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-91.ec2.internal' and this object" Apr 20 22:31:50.092830 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.092800 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/370e1b6d-23d3-4e9d-a133-4617676d7869-extensions-socket-volume\") pod \"370e1b6d-23d3-4e9d-a133-4617676d7869\" (UID: \"370e1b6d-23d3-4e9d-a133-4617676d7869\") " Apr 20 22:31:50.093040 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.092848 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5s59\" (UniqueName: \"kubernetes.io/projected/a3780854-4f14-493e-8b2c-dc3127bbad7c-kube-api-access-k5s59\") pod \"a3780854-4f14-493e-8b2c-dc3127bbad7c\" (UID: \"a3780854-4f14-493e-8b2c-dc3127bbad7c\") " Apr 20 22:31:50.093040 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.092907 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44cmv\" (UniqueName: \"kubernetes.io/projected/370e1b6d-23d3-4e9d-a133-4617676d7869-kube-api-access-44cmv\") pod \"370e1b6d-23d3-4e9d-a133-4617676d7869\" (UID: \"370e1b6d-23d3-4e9d-a133-4617676d7869\") " Apr 20 22:31:50.093210 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.093180 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/370e1b6d-23d3-4e9d-a133-4617676d7869-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "370e1b6d-23d3-4e9d-a133-4617676d7869" (UID: "370e1b6d-23d3-4e9d-a133-4617676d7869"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 22:31:50.094904 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.094868 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/370e1b6d-23d3-4e9d-a133-4617676d7869-kube-api-access-44cmv" (OuterVolumeSpecName: "kube-api-access-44cmv") pod "370e1b6d-23d3-4e9d-a133-4617676d7869" (UID: "370e1b6d-23d3-4e9d-a133-4617676d7869"). InnerVolumeSpecName "kube-api-access-44cmv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:31:50.094904 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.094894 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3780854-4f14-493e-8b2c-dc3127bbad7c-kube-api-access-k5s59" (OuterVolumeSpecName: "kube-api-access-k5s59") pod "a3780854-4f14-493e-8b2c-dc3127bbad7c" (UID: "a3780854-4f14-493e-8b2c-dc3127bbad7c"). InnerVolumeSpecName "kube-api-access-k5s59". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:31:50.184380 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.184328 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh" Apr 20 22:31:50.191186 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.191160 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j5dk5" Apr 20 22:31:50.194300 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.194273 2573 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/370e1b6d-23d3-4e9d-a133-4617676d7869-extensions-socket-volume\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\"" Apr 20 22:31:50.194355 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.194309 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k5s59\" (UniqueName: \"kubernetes.io/projected/a3780854-4f14-493e-8b2c-dc3127bbad7c-kube-api-access-k5s59\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\"" Apr 20 22:31:50.194355 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.194328 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-44cmv\" (UniqueName: \"kubernetes.io/projected/370e1b6d-23d3-4e9d-a133-4617676d7869-kube-api-access-44cmv\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\"" Apr 20 22:31:50.342935 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.342906 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh"] Apr 20 22:31:50.344218 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:31:50.344190 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cb057dc_2749_4a27_8f80_66b116ce4aea.slice/crio-ddd11f7ed3fa3050de61c84030753d73d1e982167a0b45320580c2fcff97302b WatchSource:0}: Error finding container ddd11f7ed3fa3050de61c84030753d73d1e982167a0b45320580c2fcff97302b: Status 404 returned error can't find the container with id ddd11f7ed3fa3050de61c84030753d73d1e982167a0b45320580c2fcff97302b Apr 20 22:31:50.354308 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.354285 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j5dk5"] Apr 20 22:31:50.357876 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:31:50.357832 2573 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf246becd_26e3_46c7_92a3_af2a74f5c1be.slice/crio-8221de68950969c474f295afe4f5e23c18449da5e56f83672c02349acce6d27e WatchSource:0}: Error finding container 8221de68950969c474f295afe4f5e23c18449da5e56f83672c02349acce6d27e: Status 404 returned error can't find the container with id 8221de68950969c474f295afe4f5e23c18449da5e56f83672c02349acce6d27e Apr 20 22:31:50.580922 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.580820 2573 generic.go:358] "Generic (PLEG): container finished" podID="370e1b6d-23d3-4e9d-a133-4617676d7869" containerID="1b3a1d7de3e2ef7a44610b857b4df54f6bbc512eb29ae1049f7879f0b958a384" exitCode=0 Apr 20 22:31:50.580922 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.580889 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" Apr 20 22:31:50.580922 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.580920 2573 scope.go:117] "RemoveContainer" containerID="1b3a1d7de3e2ef7a44610b857b4df54f6bbc512eb29ae1049f7879f0b958a384" Apr 20 22:31:50.582462 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.582435 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh" event={"ID":"0cb057dc-2749-4a27-8f80-66b116ce4aea","Type":"ContainerStarted","Data":"32ddcc8601a682e35110c570b537017ce7a8aa988a79038fe30fbef154991dc8"} Apr 20 22:31:50.582582 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.582466 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh" event={"ID":"0cb057dc-2749-4a27-8f80-66b116ce4aea","Type":"ContainerStarted","Data":"ddd11f7ed3fa3050de61c84030753d73d1e982167a0b45320580c2fcff97302b"} Apr 20 22:31:50.582582 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.582574 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh" Apr 20 22:31:50.583807 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.583770 2573 generic.go:358] "Generic (PLEG): container finished" podID="a3780854-4f14-493e-8b2c-dc3127bbad7c" containerID="59d79c61356af79d2c940a081e4e180c6444efa3fb428312034f2c8341778ba1" exitCode=0 Apr 20 22:31:50.583917 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.583813 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz" Apr 20 22:31:50.585263 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.585242 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j5dk5" event={"ID":"f246becd-26e3-46c7-92a3-af2a74f5c1be","Type":"ContainerStarted","Data":"3ce9095d7fa0f56dc99f5340ce3d7207f2f56da4b1e07b71b9d1838e7f8f3ff5"} Apr 20 22:31:50.585439 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.585265 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j5dk5" event={"ID":"f246becd-26e3-46c7-92a3-af2a74f5c1be","Type":"ContainerStarted","Data":"8221de68950969c474f295afe4f5e23c18449da5e56f83672c02349acce6d27e"} Apr 20 22:31:50.585439 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.585367 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j5dk5" Apr 20 22:31:50.586704 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.586677 2573 status_manager.go:895] "Failed to get status for pod" podUID="a3780854-4f14-493e-8b2c-dc3127bbad7c" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz" err="pods \"limitador-operator-controller-manager-85c4996f8c-xjvcz\" is forbidden: User \"system:node:ip-10-0-130-91.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-91.ec2.internal' and this object" Apr 20 22:31:50.590063 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.590045 2573 scope.go:117] "RemoveContainer" containerID="1b3a1d7de3e2ef7a44610b857b4df54f6bbc512eb29ae1049f7879f0b958a384" Apr 20 22:31:50.590315 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:31:50.590296 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b3a1d7de3e2ef7a44610b857b4df54f6bbc512eb29ae1049f7879f0b958a384\": container with ID starting with 1b3a1d7de3e2ef7a44610b857b4df54f6bbc512eb29ae1049f7879f0b958a384 not found: ID does not exist" containerID="1b3a1d7de3e2ef7a44610b857b4df54f6bbc512eb29ae1049f7879f0b958a384" Apr 20 22:31:50.590405 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.590328 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b3a1d7de3e2ef7a44610b857b4df54f6bbc512eb29ae1049f7879f0b958a384"} err="failed to get container status \"1b3a1d7de3e2ef7a44610b857b4df54f6bbc512eb29ae1049f7879f0b958a384\": rpc error: code = NotFound desc = could not find container \"1b3a1d7de3e2ef7a44610b857b4df54f6bbc512eb29ae1049f7879f0b958a384\": container with ID starting with 1b3a1d7de3e2ef7a44610b857b4df54f6bbc512eb29ae1049f7879f0b958a384 not found: ID does not exist" Apr 20 22:31:50.590405 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.590351 2573 scope.go:117] "RemoveContainer" containerID="59d79c61356af79d2c940a081e4e180c6444efa3fb428312034f2c8341778ba1" Apr 20 22:31:50.598791 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.598776 2573 scope.go:117] "RemoveContainer" containerID="59d79c61356af79d2c940a081e4e180c6444efa3fb428312034f2c8341778ba1" Apr 20 22:31:50.599067 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:31:50.599049 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"59d79c61356af79d2c940a081e4e180c6444efa3fb428312034f2c8341778ba1\": container with ID starting with 59d79c61356af79d2c940a081e4e180c6444efa3fb428312034f2c8341778ba1 not found: ID does not exist" containerID="59d79c61356af79d2c940a081e4e180c6444efa3fb428312034f2c8341778ba1" Apr 20 22:31:50.599126 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.599074 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59d79c61356af79d2c940a081e4e180c6444efa3fb428312034f2c8341778ba1"} err="failed to get container status \"59d79c61356af79d2c940a081e4e180c6444efa3fb428312034f2c8341778ba1\": rpc error: code = NotFound desc = could not find container \"59d79c61356af79d2c940a081e4e180c6444efa3fb428312034f2c8341778ba1\": container with ID starting with 59d79c61356af79d2c940a081e4e180c6444efa3fb428312034f2c8341778ba1 not found: ID does not exist" Apr 20 22:31:50.603084 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.603063 2573 status_manager.go:895] "Failed to get status for pod" podUID="370e1b6d-23d3-4e9d-a133-4617676d7869" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vwhwh\" is forbidden: User \"system:node:ip-10-0-130-91.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-91.ec2.internal' and this object" Apr 20 22:31:50.605604 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.605581 2573 status_manager.go:895] "Failed to get status for pod" podUID="a3780854-4f14-493e-8b2c-dc3127bbad7c" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz" err="pods \"limitador-operator-controller-manager-85c4996f8c-xjvcz\" is forbidden: User \"system:node:ip-10-0-130-91.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-91.ec2.internal' and this object" Apr 20 22:31:50.609008 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.608986 2573 status_manager.go:895] "Failed to get status for pod" podUID="370e1b6d-23d3-4e9d-a133-4617676d7869" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vwhwh\" is forbidden: User \"system:node:ip-10-0-130-91.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-91.ec2.internal' and this object" Apr 20 22:31:50.640828 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.640782 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j5dk5" podStartSLOduration=1.640767811 podStartE2EDuration="1.640767811s" podCreationTimestamp="2026-04-20 22:31:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:31:50.639740126 +0000 UTC m=+483.592694744" watchObservedRunningTime="2026-04-20 22:31:50.640767811 +0000 UTC m=+483.593722430" Apr 20 22:31:50.707450 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:50.707404 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh" podStartSLOduration=1.707388834 podStartE2EDuration="1.707388834s" podCreationTimestamp="2026-04-20 22:31:49 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:31:50.706235813 +0000 UTC m=+483.659190432" watchObservedRunningTime="2026-04-20 22:31:50.707388834 +0000 UTC m=+483.660343452" Apr 20 22:31:51.492425 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:51.492386 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-t9fwn"] Apr 20 22:31:51.497250 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:51.497227 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-t9fwn" Apr 20 22:31:51.500138 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:51.500112 2573 status_manager.go:895] "Failed to get status for pod" podUID="370e1b6d-23d3-4e9d-a133-4617676d7869" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-vwhwh" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-vwhwh\" is forbidden: User \"system:node:ip-10-0-130-91.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-91.ec2.internal' and this object" Apr 20 22:31:51.512914 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:51.512890 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-t9fwn"] Apr 20 22:31:51.546160 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:51.546124 2573 status_manager.go:895] "Failed to get status for pod" podUID="a3780854-4f14-493e-8b2c-dc3127bbad7c" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-xjvcz" err="pods \"limitador-operator-controller-manager-85c4996f8c-xjvcz\" is forbidden: User \"system:node:ip-10-0-130-91.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-91.ec2.internal' and this object" Apr 20 22:31:51.600986 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:51.600957 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="370e1b6d-23d3-4e9d-a133-4617676d7869" path="/var/lib/kubelet/pods/370e1b6d-23d3-4e9d-a133-4617676d7869/volumes" Apr 20 22:31:51.601290 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:51.601278 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3780854-4f14-493e-8b2c-dc3127bbad7c" path="/var/lib/kubelet/pods/a3780854-4f14-493e-8b2c-dc3127bbad7c/volumes" Apr 20 22:31:51.607551 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:51.607532 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75rjj\" (UniqueName: \"kubernetes.io/projected/65057b6c-ab07-4e47-bddc-77b868f3d0bd-kube-api-access-75rjj\") pod \"kuadrant-operator-controller-manager-55c7f4c975-t9fwn\" (UID: \"65057b6c-ab07-4e47-bddc-77b868f3d0bd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-t9fwn" Apr 20 22:31:51.607607 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:51.607564 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/65057b6c-ab07-4e47-bddc-77b868f3d0bd-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-t9fwn\" (UID: \"65057b6c-ab07-4e47-bddc-77b868f3d0bd\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-t9fwn" Apr 20 22:31:51.708172 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:51.708130 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-75rjj\" (UniqueName: \"kubernetes.io/projected/65057b6c-ab07-4e47-bddc-77b868f3d0bd-kube-api-access-75rjj\") pod \"kuadrant-operator-controller-manager-55c7f4c975-t9fwn\" (UID: \"65057b6c-ab07-4e47-bddc-77b868f3d0bd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-t9fwn" Apr 20 22:31:51.708172 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:51.708171 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/65057b6c-ab07-4e47-bddc-77b868f3d0bd-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-t9fwn\" (UID: \"65057b6c-ab07-4e47-bddc-77b868f3d0bd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-t9fwn" Apr 20 22:31:51.708491 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:51.708473 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/65057b6c-ab07-4e47-bddc-77b868f3d0bd-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-t9fwn\" (UID: \"65057b6c-ab07-4e47-bddc-77b868f3d0bd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-t9fwn" Apr 20 22:31:51.728872 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:51.724712 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-75rjj\" (UniqueName: \"kubernetes.io/projected/65057b6c-ab07-4e47-bddc-77b868f3d0bd-kube-api-access-75rjj\") pod \"kuadrant-operator-controller-manager-55c7f4c975-t9fwn\" (UID: \"65057b6c-ab07-4e47-bddc-77b868f3d0bd\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-t9fwn" Apr 20 22:31:51.807716 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:51.807677 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-t9fwn" Apr 20 22:31:51.948737 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:51.948711 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-t9fwn"] Apr 20 22:31:51.950372 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:31:51.950335 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65057b6c_ab07_4e47_bddc_77b868f3d0bd.slice/crio-770c2f8e95901ba772fa5bccebdcc603e8721bdfd0fa12b69a1fc885055ee9d8 WatchSource:0}: Error finding container 770c2f8e95901ba772fa5bccebdcc603e8721bdfd0fa12b69a1fc885055ee9d8: Status 404 returned error can't find the container with id 770c2f8e95901ba772fa5bccebdcc603e8721bdfd0fa12b69a1fc885055ee9d8 Apr 20 22:31:52.595829 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:52.595795 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-t9fwn" event={"ID":"65057b6c-ab07-4e47-bddc-77b868f3d0bd","Type":"ContainerStarted","Data":"64d91171ea68a3bf9580c8e04a39476756957d4a96a5fd62bfb18bea494e0524"} Apr 20 22:31:52.595829 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:52.595832 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-t9fwn" event={"ID":"65057b6c-ab07-4e47-bddc-77b868f3d0bd","Type":"ContainerStarted","Data":"770c2f8e95901ba772fa5bccebdcc603e8721bdfd0fa12b69a1fc885055ee9d8"} Apr 20 22:31:52.596264 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:52.595999 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-t9fwn" Apr 20 22:31:52.624018 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:31:52.623966 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-t9fwn" podStartSLOduration=1.623951923 podStartE2EDuration="1.623951923s" podCreationTimestamp="2026-04-20 22:31:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:31:52.622693109 +0000 UTC m=+485.575647727" watchObservedRunningTime="2026-04-20 22:31:52.623951923 +0000 UTC m=+485.576906542" Apr 20 22:32:01.593900 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:01.593837 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-j5dk5" Apr 20 22:32:01.594274 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:01.594079 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh" Apr 20 22:32:03.604469 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:03.604444 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-t9fwn" Apr 20 22:32:03.657155 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:03.657123 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh"] Apr 20 22:32:03.657332 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:03.657304 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh" podUID="0cb057dc-2749-4a27-8f80-66b116ce4aea" containerName="manager" containerID="cri-o://32ddcc8601a682e35110c570b537017ce7a8aa988a79038fe30fbef154991dc8" gracePeriod=10 Apr 20 22:32:03.905944 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:03.905921 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh" Apr 20 22:32:04.014041 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:04.014010 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0cb057dc-2749-4a27-8f80-66b116ce4aea-extensions-socket-volume\") pod \"0cb057dc-2749-4a27-8f80-66b116ce4aea\" (UID: \"0cb057dc-2749-4a27-8f80-66b116ce4aea\") " Apr 20 22:32:04.014210 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:04.014064 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz649\" (UniqueName: \"kubernetes.io/projected/0cb057dc-2749-4a27-8f80-66b116ce4aea-kube-api-access-kz649\") pod \"0cb057dc-2749-4a27-8f80-66b116ce4aea\" (UID: \"0cb057dc-2749-4a27-8f80-66b116ce4aea\") " Apr 20 22:32:04.014392 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:04.014361 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cb057dc-2749-4a27-8f80-66b116ce4aea-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "0cb057dc-2749-4a27-8f80-66b116ce4aea" (UID: "0cb057dc-2749-4a27-8f80-66b116ce4aea"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 22:32:04.016268 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:04.016244 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cb057dc-2749-4a27-8f80-66b116ce4aea-kube-api-access-kz649" (OuterVolumeSpecName: "kube-api-access-kz649") pod "0cb057dc-2749-4a27-8f80-66b116ce4aea" (UID: "0cb057dc-2749-4a27-8f80-66b116ce4aea"). InnerVolumeSpecName "kube-api-access-kz649". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:32:04.115434 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:04.115406 2573 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/0cb057dc-2749-4a27-8f80-66b116ce4aea-extensions-socket-volume\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\"" Apr 20 22:32:04.115434 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:04.115430 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kz649\" (UniqueName: \"kubernetes.io/projected/0cb057dc-2749-4a27-8f80-66b116ce4aea-kube-api-access-kz649\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\"" Apr 20 22:32:04.643032 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:04.642997 2573 generic.go:358] "Generic (PLEG): container finished" podID="0cb057dc-2749-4a27-8f80-66b116ce4aea" containerID="32ddcc8601a682e35110c570b537017ce7a8aa988a79038fe30fbef154991dc8" exitCode=0 Apr 20 22:32:04.643495 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:04.643054 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh" Apr 20 22:32:04.643495 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:04.643088 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh" event={"ID":"0cb057dc-2749-4a27-8f80-66b116ce4aea","Type":"ContainerDied","Data":"32ddcc8601a682e35110c570b537017ce7a8aa988a79038fe30fbef154991dc8"} Apr 20 22:32:04.643495 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:04.643135 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh" event={"ID":"0cb057dc-2749-4a27-8f80-66b116ce4aea","Type":"ContainerDied","Data":"ddd11f7ed3fa3050de61c84030753d73d1e982167a0b45320580c2fcff97302b"} Apr 20 22:32:04.643495 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:04.643160 2573 scope.go:117] "RemoveContainer" containerID="32ddcc8601a682e35110c570b537017ce7a8aa988a79038fe30fbef154991dc8" Apr 20 22:32:04.652350 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:04.652334 2573 scope.go:117] "RemoveContainer" containerID="32ddcc8601a682e35110c570b537017ce7a8aa988a79038fe30fbef154991dc8" Apr 20 22:32:04.652587 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:32:04.652569 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32ddcc8601a682e35110c570b537017ce7a8aa988a79038fe30fbef154991dc8\": container with ID starting with 32ddcc8601a682e35110c570b537017ce7a8aa988a79038fe30fbef154991dc8 not found: ID does not exist" containerID="32ddcc8601a682e35110c570b537017ce7a8aa988a79038fe30fbef154991dc8" Apr 20 22:32:04.652650 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:04.652599 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32ddcc8601a682e35110c570b537017ce7a8aa988a79038fe30fbef154991dc8"} err="failed to get container status \"32ddcc8601a682e35110c570b537017ce7a8aa988a79038fe30fbef154991dc8\": rpc error: code = NotFound desc = could not find container \"32ddcc8601a682e35110c570b537017ce7a8aa988a79038fe30fbef154991dc8\": container with ID starting with 32ddcc8601a682e35110c570b537017ce7a8aa988a79038fe30fbef154991dc8 not found: ID does not exist" Apr 20 22:32:04.676624 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:04.676595 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh"] Apr 20 22:32:04.681474 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:04.681451 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mnbrh"] Apr 20 22:32:05.601107 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:05.601075 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cb057dc-2749-4a27-8f80-66b116ce4aea" path="/var/lib/kubelet/pods/0cb057dc-2749-4a27-8f80-66b116ce4aea/volumes" Apr 20 22:32:26.096327 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:26.096291 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-dpt58"] Apr 20 22:32:26.096744 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:26.096689 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0cb057dc-2749-4a27-8f80-66b116ce4aea" containerName="manager" Apr 20 22:32:26.096744 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:26.096704 2573 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0cb057dc-2749-4a27-8f80-66b116ce4aea" containerName="manager" Apr 20 22:32:26.096819 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:26.096773 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="0cb057dc-2749-4a27-8f80-66b116ce4aea" containerName="manager" Apr 20 22:32:26.100163 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:26.100148 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-dpt58" Apr 20 22:32:26.104542 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:26.104513 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-k66vw\"" Apr 20 22:32:26.110343 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:26.110318 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-dpt58"] Apr 20 22:32:26.208230 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:26.208189 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8db2\" (UniqueName: \"kubernetes.io/projected/6f32d626-913d-40d3-96a6-6ce4958c5ec5-kube-api-access-l8db2\") pod \"authorino-7498df8756-dpt58\" (UID: \"6f32d626-913d-40d3-96a6-6ce4958c5ec5\") " pod="kuadrant-system/authorino-7498df8756-dpt58" Apr 20 22:32:26.309561 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:26.309526 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l8db2\" (UniqueName: \"kubernetes.io/projected/6f32d626-913d-40d3-96a6-6ce4958c5ec5-kube-api-access-l8db2\") pod \"authorino-7498df8756-dpt58\" (UID: \"6f32d626-913d-40d3-96a6-6ce4958c5ec5\") " pod="kuadrant-system/authorino-7498df8756-dpt58" Apr 20 22:32:26.318019 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:26.317988 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8db2\" (UniqueName: \"kubernetes.io/projected/6f32d626-913d-40d3-96a6-6ce4958c5ec5-kube-api-access-l8db2\") pod \"authorino-7498df8756-dpt58\" (UID: \"6f32d626-913d-40d3-96a6-6ce4958c5ec5\") " pod="kuadrant-system/authorino-7498df8756-dpt58" Apr 20 22:32:26.409600 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:26.409526 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-dpt58" Apr 20 22:32:26.535134 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:26.535088 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-dpt58"] Apr 20 22:32:26.537760 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:32:26.537727 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f32d626_913d_40d3_96a6_6ce4958c5ec5.slice/crio-dafc0e6240417ef95255bf9c2d89945fec51b237c95778b20f79c4d2c0f02c71 WatchSource:0}: Error finding container dafc0e6240417ef95255bf9c2d89945fec51b237c95778b20f79c4d2c0f02c71: Status 404 returned error can't find the container with id dafc0e6240417ef95255bf9c2d89945fec51b237c95778b20f79c4d2c0f02c71 Apr 20 22:32:26.737814 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:26.737729 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-dpt58" event={"ID":"6f32d626-913d-40d3-96a6-6ce4958c5ec5","Type":"ContainerStarted","Data":"dafc0e6240417ef95255bf9c2d89945fec51b237c95778b20f79c4d2c0f02c71"} Apr 20 22:32:30.753757 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:30.753716 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-dpt58" event={"ID":"6f32d626-913d-40d3-96a6-6ce4958c5ec5","Type":"ContainerStarted","Data":"a7b07f2d2b6f9ffd7f835152c80eda4bc4a0ea6f2bb0c3c9a0f49b415899f80a"} Apr 20 22:32:30.770343 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:32:30.770285 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-dpt58" podStartSLOduration=1.6178947240000001 podStartE2EDuration="4.770265189s" podCreationTimestamp="2026-04-20 22:32:26 +0000 UTC" firstStartedPulling="2026-04-20 22:32:26.539156091 +0000 UTC m=+519.492110688" lastFinishedPulling="2026-04-20 22:32:29.691526556 +0000 UTC m=+522.644481153" observedRunningTime="2026-04-20 22:32:30.76825807 +0000 UTC m=+523.721212688" watchObservedRunningTime="2026-04-20 22:32:30.770265189 +0000 UTC m=+523.723219808" Apr 20 22:33:18.059898 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:18.059867 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 20 22:33:18.062333 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:18.062310 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 20 22:33:18.064908 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:18.064887 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 20 22:33:18.065010 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:18.064895 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-9kd65\"" Apr 20 22:33:18.065010 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:18.064937 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\"" Apr 20 22:33:18.065010 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:18.064960 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 20 22:33:18.070598 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:18.070256 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 20 22:33:18.144803 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:18.144776 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr22l\" (UniqueName: \"kubernetes.io/projected/d1ce6875-cf18-4745-9c97-0712ef017e22-kube-api-access-kr22l\") pod \"maas-keycloak-0\" (UID: \"d1ce6875-cf18-4745-9c97-0712ef017e22\") " pod="keycloak-system/maas-keycloak-0" Apr 20 22:33:18.245197 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:18.245168 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kr22l\" (UniqueName: \"kubernetes.io/projected/d1ce6875-cf18-4745-9c97-0712ef017e22-kube-api-access-kr22l\") pod \"maas-keycloak-0\" (UID: \"d1ce6875-cf18-4745-9c97-0712ef017e22\") " pod="keycloak-system/maas-keycloak-0" Apr 20 22:33:18.252667 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:18.252639 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr22l\" (UniqueName: \"kubernetes.io/projected/d1ce6875-cf18-4745-9c97-0712ef017e22-kube-api-access-kr22l\") pod \"maas-keycloak-0\" (UID: \"d1ce6875-cf18-4745-9c97-0712ef017e22\") " pod="keycloak-system/maas-keycloak-0" Apr 20 22:33:18.372888 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:18.372775 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 20 22:33:18.495288 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:18.495263 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 20 22:33:18.497756 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:33:18.497727 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1ce6875_cf18_4745_9c97_0712ef017e22.slice/crio-3c57134fc60e8fd844c2fc6e79d3fc6cfff326febce9ea14bdac1563daa9d2f6 WatchSource:0}: Error finding container 3c57134fc60e8fd844c2fc6e79d3fc6cfff326febce9ea14bdac1563daa9d2f6: Status 404 returned error can't find the container with id 3c57134fc60e8fd844c2fc6e79d3fc6cfff326febce9ea14bdac1563daa9d2f6 Apr 20 22:33:18.925627 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:18.925593 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"d1ce6875-cf18-4745-9c97-0712ef017e22","Type":"ContainerStarted","Data":"3c57134fc60e8fd844c2fc6e79d3fc6cfff326febce9ea14bdac1563daa9d2f6"} Apr 20 22:33:22.944760 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:22.944726 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"d1ce6875-cf18-4745-9c97-0712ef017e22","Type":"ContainerStarted","Data":"6b4f5351737260f376ebcf700f097a1ff4c711686d69e50a760fb8c84666ec25"} Apr 20 22:33:22.962146 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:22.962095 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=0.59857064 podStartE2EDuration="4.962081728s" podCreationTimestamp="2026-04-20 22:33:18 +0000 UTC" firstStartedPulling="2026-04-20 22:33:18.499117874 +0000 UTC m=+571.452072471" lastFinishedPulling="2026-04-20 22:33:22.862628947 +0000 UTC m=+575.815583559" observedRunningTime="2026-04-20 22:33:22.960740117 +0000 UTC m=+575.913694737" watchObservedRunningTime="2026-04-20 22:33:22.962081728 +0000 UTC m=+575.915036346" Apr 20 22:33:23.373233 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:23.373181 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0" Apr 20 22:33:23.374752 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:23.374715 2573 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d1ce6875-cf18-4745-9c97-0712ef017e22" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.44:9000/health/started\": dial tcp 10.132.0.44:9000: connect: connection refused" Apr 20 22:33:24.374135 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:24.374085 2573 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d1ce6875-cf18-4745-9c97-0712ef017e22" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.44:9000/health/started\": dial tcp 10.132.0.44:9000: connect: connection refused" Apr 20 22:33:25.374194 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:25.374144 2573 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d1ce6875-cf18-4745-9c97-0712ef017e22" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.44:9000/health/started\": dial tcp 10.132.0.44:9000: connect: connection refused" Apr 20 22:33:26.373505 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:26.373449 2573 prober.go:120] "Probe failed" probeType="Startup" 
Apr 20 22:33:23.373233 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:23.373181 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0"
Apr 20 22:33:23.374752 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:23.374715 2573 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d1ce6875-cf18-4745-9c97-0712ef017e22" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.44:9000/health/started\": dial tcp 10.132.0.44:9000: connect: connection refused"
Apr 20 22:33:24.374135 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:24.374085 2573 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d1ce6875-cf18-4745-9c97-0712ef017e22" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.44:9000/health/started\": dial tcp 10.132.0.44:9000: connect: connection refused"
Apr 20 22:33:25.374194 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:25.374144 2573 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d1ce6875-cf18-4745-9c97-0712ef017e22" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.44:9000/health/started\": dial tcp 10.132.0.44:9000: connect: connection refused"
Apr 20 22:33:26.373505 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:26.373449 2573 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d1ce6875-cf18-4745-9c97-0712ef017e22" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.44:9000/health/started\": dial tcp 10.132.0.44:9000: connect: connection refused"
Apr 20 22:33:27.373298 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:27.373203 2573 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d1ce6875-cf18-4745-9c97-0712ef017e22" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.44:9000/health/started\": dial tcp 10.132.0.44:9000: connect: connection refused"
Apr 20 22:33:28.373452 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:28.373418 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0"
Apr 20 22:33:28.374498 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:28.373471 2573 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d1ce6875-cf18-4745-9c97-0712ef017e22" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.44:9000/health/started\": dial tcp 10.132.0.44:9000: connect: connection refused"
Apr 20 22:33:29.373386 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:29.373343 2573 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d1ce6875-cf18-4745-9c97-0712ef017e22" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.44:9000/health/started\": dial tcp 10.132.0.44:9000: connect: connection refused"
Apr 20 22:33:30.374286 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:30.374231 2573 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d1ce6875-cf18-4745-9c97-0712ef017e22" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.44:9000/health/started\": dial tcp 10.132.0.44:9000: connect: connection refused"
Apr 20 22:33:31.373633 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:31.373576 2573 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d1ce6875-cf18-4745-9c97-0712ef017e22" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.44:9000/health/started\": dial tcp 10.132.0.44:9000: connect: connection refused"
Apr 20 22:33:32.373428 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:32.373377 2573 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d1ce6875-cf18-4745-9c97-0712ef017e22" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.44:9000/health/started\": dial tcp 10.132.0.44:9000: connect: connection refused"
Apr 20 22:33:33.373519 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:33.373462 2573 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d1ce6875-cf18-4745-9c97-0712ef017e22" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.44:9000/health/started\": dial tcp 10.132.0.44:9000: connect: connection refused"
Apr 20 22:33:34.373246 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:34.373190 2573 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d1ce6875-cf18-4745-9c97-0712ef017e22" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.44:9000/health/started\": dial tcp 10.132.0.44:9000: connect: connection refused"
Apr 20 22:33:35.373704 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:35.373642 2573 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d1ce6875-cf18-4745-9c97-0712ef017e22" containerName="keycloak" probeResult="failure" output="Get \"http://10.132.0.44:9000/health/started\": dial tcp 10.132.0.44:9000: connect: connection refused"
Apr 20 22:33:36.504697 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:36.504645 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0"
Apr 20 22:33:36.523931 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:36.523888 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="d1ce6875-cf18-4745-9c97-0712ef017e22" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 22:33:46.511420 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:46.511383 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0"
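This run of failures is a startup probe working as intended: the kubelet polls http://10.132.0.44:9000/health/started once per second, each attempt refused until Keycloak's management port comes up at 22:33:36, at which point the readiness probe takes over (one 503, then ready at 22:33:46, suggesting a ~10 s readiness period). A minimal Go sketch of probe settings consistent with this pattern; only the startup URL and the 1 s period are visible in the log, so the failure threshold and the readiness path are assumptions:

    package main

    import (
    	"fmt"

    	corev1 "k8s.io/api/core/v1"
    	"k8s.io/apimachinery/pkg/util/intstr"
    )

    func main() {
    	// Startup probe: poll the health endpoint every second, tolerating a
    	// long warm-up. The URL and 1s period match the log; the threshold is
    	// an assumption (the log only needed ~18 failures before success).
    	startup := &corev1.Probe{
    		ProbeHandler: corev1.ProbeHandler{
    			HTTPGet: &corev1.HTTPGetAction{Path: "/health/started", Port: intstr.FromInt32(9000)},
    		},
    		PeriodSeconds:    1,
    		FailureThreshold: 600,
    	}
    	// Readiness probe: does not run until the startup probe succeeds,
    	// which is why the 503 appears immediately after "startup ... started".
    	readiness := &corev1.Probe{
    		ProbeHandler: corev1.ProbeHandler{
    			HTTPGet: &corev1.HTTPGetAction{Path: "/health/ready", Port: intstr.FromInt32(9000)}, // path assumed
    		},
    		PeriodSeconds: 10, // inferred from the 36.5s -> 46.5s gap
    	}
    	fmt.Println(startup.HTTPGet.Path, readiness.HTTPGet.Path)
    }
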
Apr 20 22:33:47.573427 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:47.573396 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xnj54_4c475df6-d751-4f10-81c7-a1e56dec9176/console-operator/1.log"
Apr 20 22:33:47.574085 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:47.574064 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xnj54_4c475df6-d751-4f10-81c7-a1e56dec9176/console-operator/1.log"
Apr 20 22:33:50.077172 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:50.077138 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-dpt58"]
Apr 20 22:33:50.077707 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:50.077338 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-dpt58" podUID="6f32d626-913d-40d3-96a6-6ce4958c5ec5" containerName="authorino" containerID="cri-o://a7b07f2d2b6f9ffd7f835152c80eda4bc4a0ea6f2bb0c3c9a0f49b415899f80a" gracePeriod=30
Apr 20 22:33:50.322680 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:50.322657 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-dpt58"
Apr 20 22:33:50.465947 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:50.465848 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8db2\" (UniqueName: \"kubernetes.io/projected/6f32d626-913d-40d3-96a6-6ce4958c5ec5-kube-api-access-l8db2\") pod \"6f32d626-913d-40d3-96a6-6ce4958c5ec5\" (UID: \"6f32d626-913d-40d3-96a6-6ce4958c5ec5\") "
Apr 20 22:33:50.467895 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:50.467848 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f32d626-913d-40d3-96a6-6ce4958c5ec5-kube-api-access-l8db2" (OuterVolumeSpecName: "kube-api-access-l8db2") pod "6f32d626-913d-40d3-96a6-6ce4958c5ec5" (UID: "6f32d626-913d-40d3-96a6-6ce4958c5ec5"). InnerVolumeSpecName "kube-api-access-l8db2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 22:33:50.567472 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:50.567445 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l8db2\" (UniqueName: \"kubernetes.io/projected/6f32d626-913d-40d3-96a6-6ce4958c5ec5-kube-api-access-l8db2\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\""
Apr 20 22:33:50.854017 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:50.853986 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-2gtgz"]
Apr 20 22:33:50.854420 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:50.854405 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f32d626-913d-40d3-96a6-6ce4958c5ec5" containerName="authorino"
Apr 20 22:33:50.854477 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:50.854422 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f32d626-913d-40d3-96a6-6ce4958c5ec5" containerName="authorino"
Apr 20 22:33:50.854514 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:50.854504 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f32d626-913d-40d3-96a6-6ce4958c5ec5" containerName="authorino"
Apr 20 22:33:50.857736 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:50.857714 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-2gtgz"
Apr 20 22:33:50.861055 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:50.861034 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-82ldk\""
Apr 20 22:33:50.871916 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:50.871895 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-2gtgz"]
Apr 20 22:33:50.970791 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:50.970761 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-822mf\" (UniqueName: \"kubernetes.io/projected/653f7faf-039e-42de-a447-daeda6684e27-kube-api-access-822mf\") pod \"maas-controller-6d4c8f55f9-2gtgz\" (UID: \"653f7faf-039e-42de-a447-daeda6684e27\") " pod="opendatahub/maas-controller-6d4c8f55f9-2gtgz"
Apr 20 22:33:51.014634 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.014598 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6b6455cf76-wdcb9"]
Apr 20 22:33:51.018341 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.018320 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6b6455cf76-wdcb9"
Apr 20 22:33:51.025684 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.025649 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6b6455cf76-wdcb9"]
Apr 20 22:33:51.071257 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.071227 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-822mf\" (UniqueName: \"kubernetes.io/projected/653f7faf-039e-42de-a447-daeda6684e27-kube-api-access-822mf\") pod \"maas-controller-6d4c8f55f9-2gtgz\" (UID: \"653f7faf-039e-42de-a447-daeda6684e27\") " pod="opendatahub/maas-controller-6d4c8f55f9-2gtgz"
Apr 20 22:33:51.071970 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.071941 2573 generic.go:358] "Generic (PLEG): container finished" podID="6f32d626-913d-40d3-96a6-6ce4958c5ec5" containerID="a7b07f2d2b6f9ffd7f835152c80eda4bc4a0ea6f2bb0c3c9a0f49b415899f80a" exitCode=0
Apr 20 22:33:51.072071 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.071991 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-dpt58"
Apr 20 22:33:51.072071 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.072028 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-dpt58" event={"ID":"6f32d626-913d-40d3-96a6-6ce4958c5ec5","Type":"ContainerDied","Data":"a7b07f2d2b6f9ffd7f835152c80eda4bc4a0ea6f2bb0c3c9a0f49b415899f80a"}
Apr 20 22:33:51.072071 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.072066 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-dpt58" event={"ID":"6f32d626-913d-40d3-96a6-6ce4958c5ec5","Type":"ContainerDied","Data":"dafc0e6240417ef95255bf9c2d89945fec51b237c95778b20f79c4d2c0f02c71"}
Apr 20 22:33:51.072177 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.072082 2573 scope.go:117] "RemoveContainer" containerID="a7b07f2d2b6f9ffd7f835152c80eda4bc4a0ea6f2bb0c3c9a0f49b415899f80a"
Apr 20 22:33:51.079208 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.079160 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-822mf\" (UniqueName: \"kubernetes.io/projected/653f7faf-039e-42de-a447-daeda6684e27-kube-api-access-822mf\") pod \"maas-controller-6d4c8f55f9-2gtgz\" (UID: \"653f7faf-039e-42de-a447-daeda6684e27\") " pod="opendatahub/maas-controller-6d4c8f55f9-2gtgz"
Apr 20 22:33:51.080786 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.080660 2573 scope.go:117] "RemoveContainer" containerID="a7b07f2d2b6f9ffd7f835152c80eda4bc4a0ea6f2bb0c3c9a0f49b415899f80a"
Apr 20 22:33:51.080984 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:33:51.080965 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7b07f2d2b6f9ffd7f835152c80eda4bc4a0ea6f2bb0c3c9a0f49b415899f80a\": container with ID starting with a7b07f2d2b6f9ffd7f835152c80eda4bc4a0ea6f2bb0c3c9a0f49b415899f80a not found: ID does not exist" containerID="a7b07f2d2b6f9ffd7f835152c80eda4bc4a0ea6f2bb0c3c9a0f49b415899f80a"
Apr 20 22:33:51.081045 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.080994 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7b07f2d2b6f9ffd7f835152c80eda4bc4a0ea6f2bb0c3c9a0f49b415899f80a"} err="failed to get container status \"a7b07f2d2b6f9ffd7f835152c80eda4bc4a0ea6f2bb0c3c9a0f49b415899f80a\": rpc error: code = NotFound desc = could not find container \"a7b07f2d2b6f9ffd7f835152c80eda4bc4a0ea6f2bb0c3c9a0f49b415899f80a\": container with ID starting with a7b07f2d2b6f9ffd7f835152c80eda4bc4a0ea6f2bb0c3c9a0f49b415899f80a not found: ID does not exist"
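The "ContainerStatus from runtime service failed" / "DeleteContainer returned error" pair here is a benign race rather than a real failure: CRI-O had already removed the authorino container while tearing the pod down, so the kubelet's follow-up RemoveContainer finds nothing left to delete. A sketch of the usual handling pattern (illustrative, not the kubelet's actual code), where a NotFound from the runtime is treated as "already removed":

    package main

    import (
    	"fmt"

    	"google.golang.org/grpc/codes"
    	"google.golang.org/grpc/status"
    )

    // removeIgnoringNotFound treats a NotFound from the CRI runtime as
    // success: if the container is already gone, there is nothing to delete.
    func removeIgnoringNotFound(remove func(id string) error, id string) error {
    	if err := remove(id); err != nil && status.Code(err) != codes.NotFound {
    		return err
    	}
    	return nil
    }

    func main() {
    	alreadyGone := func(id string) error {
    		return status.Errorf(codes.NotFound, "could not find container %q", id)
    	}
    	// Prints <nil>: the NotFound is swallowed, mirroring why the log
    	// lines above are informational rather than fatal.
    	fmt.Println(removeIgnoringNotFound(alreadyGone, "a7b07f2d"))
    }
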
Apr 20 22:33:51.100127 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.100100 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-dpt58"]
Apr 20 22:33:51.103977 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.103960 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-dpt58"]
Apr 20 22:33:51.139585 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.139521 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6b6455cf76-wdcb9"]
Apr 20 22:33:51.139775 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:33:51.139758 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-5w6lk], unattached volumes=[], failed to process volumes=[]: context canceled" pod="opendatahub/maas-controller-6b6455cf76-wdcb9" podUID="753cccc6-b50a-43a8-875a-02f549564308"
Apr 20 22:33:51.167699 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.167674 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-2gtgz"
Apr 20 22:33:51.171942 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.171921 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w6lk\" (UniqueName: \"kubernetes.io/projected/753cccc6-b50a-43a8-875a-02f549564308-kube-api-access-5w6lk\") pod \"maas-controller-6b6455cf76-wdcb9\" (UID: \"753cccc6-b50a-43a8-875a-02f549564308\") " pod="opendatahub/maas-controller-6b6455cf76-wdcb9"
Apr 20 22:33:51.175121 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.175095 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6bf6f495b6-4m7cd"]
Apr 20 22:33:51.180066 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.180048 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6bf6f495b6-4m7cd"
Apr 20 22:33:51.186345 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.186325 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6bf6f495b6-4m7cd"]
Apr 20 22:33:51.272505 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.272472 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5w6lk\" (UniqueName: \"kubernetes.io/projected/753cccc6-b50a-43a8-875a-02f549564308-kube-api-access-5w6lk\") pod \"maas-controller-6b6455cf76-wdcb9\" (UID: \"753cccc6-b50a-43a8-875a-02f549564308\") " pod="opendatahub/maas-controller-6b6455cf76-wdcb9"
Apr 20 22:33:51.285234 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.285205 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w6lk\" (UniqueName: \"kubernetes.io/projected/753cccc6-b50a-43a8-875a-02f549564308-kube-api-access-5w6lk\") pod \"maas-controller-6b6455cf76-wdcb9\" (UID: \"753cccc6-b50a-43a8-875a-02f549564308\") " pod="opendatahub/maas-controller-6b6455cf76-wdcb9"
Apr 20 22:33:51.297772 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.297674 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-2gtgz"]
Apr 20 22:33:51.300400 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:33:51.300372 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod653f7faf_039e_42de_a447_daeda6684e27.slice/crio-7f07440f58a2e08ee663d3d66b4be5877fa6a17a12d5e70cb8f0e09ce7144b85 WatchSource:0}: Error finding container 7f07440f58a2e08ee663d3d66b4be5877fa6a17a12d5e70cb8f0e09ce7144b85: Status 404 returned error can't find the container with id 7f07440f58a2e08ee663d3d66b4be5877fa6a17a12d5e70cb8f0e09ce7144b85
Apr 20 22:33:51.372908 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.372881 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qcg6\" (UniqueName: \"kubernetes.io/projected/25788b1d-1682-4b2f-865f-b0a7f5652cde-kube-api-access-4qcg6\") pod \"maas-controller-6bf6f495b6-4m7cd\" (UID: \"25788b1d-1682-4b2f-865f-b0a7f5652cde\") " pod="opendatahub/maas-controller-6bf6f495b6-4m7cd"
Apr 20 22:33:51.474340 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.474265 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qcg6\" (UniqueName: \"kubernetes.io/projected/25788b1d-1682-4b2f-865f-b0a7f5652cde-kube-api-access-4qcg6\") pod \"maas-controller-6bf6f495b6-4m7cd\" (UID: \"25788b1d-1682-4b2f-865f-b0a7f5652cde\") " pod="opendatahub/maas-controller-6bf6f495b6-4m7cd"
Apr 20 22:33:51.482041 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.482014 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qcg6\" (UniqueName: \"kubernetes.io/projected/25788b1d-1682-4b2f-865f-b0a7f5652cde-kube-api-access-4qcg6\") pod \"maas-controller-6bf6f495b6-4m7cd\" (UID: \"25788b1d-1682-4b2f-865f-b0a7f5652cde\") " pod="opendatahub/maas-controller-6bf6f495b6-4m7cd"
Apr 20 22:33:51.501795 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.501769 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6bf6f495b6-4m7cd"
Apr 20 22:33:51.603253 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.603221 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f32d626-913d-40d3-96a6-6ce4958c5ec5" path="/var/lib/kubelet/pods/6f32d626-913d-40d3-96a6-6ce4958c5ec5/volumes"
Apr 20 22:33:51.625112 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:51.625086 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6bf6f495b6-4m7cd"]
Apr 20 22:33:51.626720 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:33:51.626691 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25788b1d_1682_4b2f_865f_b0a7f5652cde.slice/crio-570f246a411d6010af57b88f8ea269089793267103a26a8627856262ccd1b3c1 WatchSource:0}: Error finding container 570f246a411d6010af57b88f8ea269089793267103a26a8627856262ccd1b3c1: Status 404 returned error can't find the container with id 570f246a411d6010af57b88f8ea269089793267103a26a8627856262ccd1b3c1
Apr 20 22:33:52.078017 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:52.077980 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6bf6f495b6-4m7cd" event={"ID":"25788b1d-1682-4b2f-865f-b0a7f5652cde","Type":"ContainerStarted","Data":"570f246a411d6010af57b88f8ea269089793267103a26a8627856262ccd1b3c1"}
Apr 20 22:33:52.085183 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:52.085084 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-2gtgz" event={"ID":"653f7faf-039e-42de-a447-daeda6684e27","Type":"ContainerStarted","Data":"7f07440f58a2e08ee663d3d66b4be5877fa6a17a12d5e70cb8f0e09ce7144b85"}
Apr 20 22:33:52.087879 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:52.087412 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6b6455cf76-wdcb9"
Apr 20 22:33:52.094069 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:52.094049 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6b6455cf76-wdcb9"
Apr 20 22:33:52.190095 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:52.188765 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w6lk\" (UniqueName: \"kubernetes.io/projected/753cccc6-b50a-43a8-875a-02f549564308-kube-api-access-5w6lk\") pod \"753cccc6-b50a-43a8-875a-02f549564308\" (UID: \"753cccc6-b50a-43a8-875a-02f549564308\") "
Apr 20 22:33:52.192166 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:52.192119 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/753cccc6-b50a-43a8-875a-02f549564308-kube-api-access-5w6lk" (OuterVolumeSpecName: "kube-api-access-5w6lk") pod "753cccc6-b50a-43a8-875a-02f549564308" (UID: "753cccc6-b50a-43a8-875a-02f549564308"). InnerVolumeSpecName "kube-api-access-5w6lk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 22:33:52.289942 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:52.289892 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5w6lk\" (UniqueName: \"kubernetes.io/projected/753cccc6-b50a-43a8-875a-02f549564308-kube-api-access-5w6lk\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\""
Apr 20 22:33:53.092381 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:53.092348 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6b6455cf76-wdcb9"
Apr 20 22:33:53.126683 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:53.126648 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6b6455cf76-wdcb9"]
Apr 20 22:33:53.130653 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:53.130626 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6b6455cf76-wdcb9"]
Apr 20 22:33:53.602109 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:53.602068 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="753cccc6-b50a-43a8-875a-02f549564308" path="/var/lib/kubelet/pods/753cccc6-b50a-43a8-875a-02f549564308/volumes"
Apr 20 22:33:55.101272 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:55.101237 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6bf6f495b6-4m7cd" event={"ID":"25788b1d-1682-4b2f-865f-b0a7f5652cde","Type":"ContainerStarted","Data":"e4fdb91fdfd699c8a5aaeae6e45ec2880a0b800bb795272c73c083474bed1790"}
Apr 20 22:33:55.101729 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:55.101311 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6bf6f495b6-4m7cd"
Apr 20 22:33:55.102569 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:55.102547 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-2gtgz" event={"ID":"653f7faf-039e-42de-a447-daeda6684e27","Type":"ContainerStarted","Data":"49d406a0e002e6bd91dfb1dfc79931cf70c7913feaee3ca626abd41b4b28876b"}
Apr 20 22:33:55.102726 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:55.102708 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-2gtgz"
Apr 20 22:33:55.122732 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:55.122692 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6bf6f495b6-4m7cd" podStartSLOduration=1.40745464 podStartE2EDuration="4.122680863s" podCreationTimestamp="2026-04-20 22:33:51 +0000 UTC" firstStartedPulling="2026-04-20 22:33:51.628022262 +0000 UTC m=+604.580976858" lastFinishedPulling="2026-04-20 22:33:54.343248481 +0000 UTC m=+607.296203081" observedRunningTime="2026-04-20 22:33:55.12111335 +0000 UTC m=+608.074067971" watchObservedRunningTime="2026-04-20 22:33:55.122680863 +0000 UTC m=+608.075635483"
Apr 20 22:33:55.138664 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:55.138619 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-2gtgz" podStartSLOduration=2.100325753 podStartE2EDuration="5.138607194s" podCreationTimestamp="2026-04-20 22:33:50 +0000 UTC" firstStartedPulling="2026-04-20 22:33:51.301719541 +0000 UTC m=+604.254674138" lastFinishedPulling="2026-04-20 22:33:54.340000974 +0000 UTC m=+607.292955579" observedRunningTime="2026-04-20 22:33:55.136380455 +0000 UTC m=+608.089335077" watchObservedRunningTime="2026-04-20 22:33:55.138607194 +0000 UTC m=+608.091561812"
Apr 20 22:33:56.501060 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:56.501024 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-598d5cf9b8-52sdn"]
Apr 20 22:33:56.504636 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:56.504619 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-598d5cf9b8-52sdn"
Apr 20 22:33:56.507311 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:56.507285 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 20 22:33:56.507438 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:56.507285 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 20 22:33:56.507438 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:56.507332 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-47dzf\""
Apr 20 22:33:56.515466 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:56.515446 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-598d5cf9b8-52sdn"]
Apr 20 22:33:56.528538 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:56.528497 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbrjl\" (UniqueName: \"kubernetes.io/projected/2aabe1ae-a61a-41c9-9e17-01343380dc94-kube-api-access-tbrjl\") pod \"maas-api-598d5cf9b8-52sdn\" (UID: \"2aabe1ae-a61a-41c9-9e17-01343380dc94\") " pod="opendatahub/maas-api-598d5cf9b8-52sdn"
Apr 20 22:33:56.528661 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:56.528594 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/2aabe1ae-a61a-41c9-9e17-01343380dc94-maas-api-tls\") pod \"maas-api-598d5cf9b8-52sdn\" (UID: \"2aabe1ae-a61a-41c9-9e17-01343380dc94\") " pod="opendatahub/maas-api-598d5cf9b8-52sdn"
Apr 20 22:33:56.630018 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:56.629984 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/2aabe1ae-a61a-41c9-9e17-01343380dc94-maas-api-tls\") pod \"maas-api-598d5cf9b8-52sdn\" (UID: \"2aabe1ae-a61a-41c9-9e17-01343380dc94\") " pod="opendatahub/maas-api-598d5cf9b8-52sdn"
Apr 20 22:33:56.630242 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:56.630223 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbrjl\" (UniqueName: \"kubernetes.io/projected/2aabe1ae-a61a-41c9-9e17-01343380dc94-kube-api-access-tbrjl\") pod \"maas-api-598d5cf9b8-52sdn\" (UID: \"2aabe1ae-a61a-41c9-9e17-01343380dc94\") " pod="opendatahub/maas-api-598d5cf9b8-52sdn"
Apr 20 22:33:56.630337 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:33:56.630321 2573 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found
Apr 20 22:33:56.630403 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:33:56.630396 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2aabe1ae-a61a-41c9-9e17-01343380dc94-maas-api-tls podName:2aabe1ae-a61a-41c9-9e17-01343380dc94 nodeName:}" failed. No retries permitted until 2026-04-20 22:33:57.130375536 +0000 UTC m=+610.083330136 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/2aabe1ae-a61a-41c9-9e17-01343380dc94-maas-api-tls") pod "maas-api-598d5cf9b8-52sdn" (UID: "2aabe1ae-a61a-41c9-9e17-01343380dc94") : secret "maas-api-serving-cert" not found
Apr 20 22:33:56.641252 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:56.641223 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbrjl\" (UniqueName: \"kubernetes.io/projected/2aabe1ae-a61a-41c9-9e17-01343380dc94-kube-api-access-tbrjl\") pod \"maas-api-598d5cf9b8-52sdn\" (UID: \"2aabe1ae-a61a-41c9-9e17-01343380dc94\") " pod="opendatahub/maas-api-598d5cf9b8-52sdn"
Apr 20 22:33:57.135486 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:57.135439 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/2aabe1ae-a61a-41c9-9e17-01343380dc94-maas-api-tls\") pod \"maas-api-598d5cf9b8-52sdn\" (UID: \"2aabe1ae-a61a-41c9-9e17-01343380dc94\") " pod="opendatahub/maas-api-598d5cf9b8-52sdn"
Apr 20 22:33:57.137881 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:57.137842 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/2aabe1ae-a61a-41c9-9e17-01343380dc94-maas-api-tls\") pod \"maas-api-598d5cf9b8-52sdn\" (UID: \"2aabe1ae-a61a-41c9-9e17-01343380dc94\") " pod="opendatahub/maas-api-598d5cf9b8-52sdn"
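The mount failure above resolves itself: the pod was scheduled before its maas-api-serving-cert secret existed, so the first MountVolume.SetUp attempt fails and the volume manager schedules a retry ("durationBeforeRetry 500ms"); by the second attempt at 22:33:57.135 the secret has been created and the mount succeeds. A sketch of that retry shape using the apimachinery wait helpers (the factor and step count here are assumptions; only the 500 ms initial delay is visible in the log):

    package main

    import (
    	"fmt"
    	"time"

    	"k8s.io/apimachinery/pkg/util/wait"
    )

    func main() {
    	attempts := 0
    	secretExists := func() bool { return attempts >= 2 } // stand-in for the secret appearing

    	// Exponential backoff between SetUp attempts, starting at 500 ms.
    	backoff := wait.Backoff{Duration: 500 * time.Millisecond, Factor: 2.0, Steps: 5}
    	err := wait.ExponentialBackoff(backoff, func() (bool, error) {
    		attempts++
    		return secretExists(), nil
    	})
    	fmt.Println(attempts, err) // 2 <nil>: the second attempt succeeds
    }
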
Apr 20 22:33:57.417462 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:57.417373 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-598d5cf9b8-52sdn"
Apr 20 22:33:57.544347 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:57.544296 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-598d5cf9b8-52sdn"]
Apr 20 22:33:57.547348 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:33:57.547320 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aabe1ae_a61a_41c9_9e17_01343380dc94.slice/crio-e82745b54ff19ba21858cf07461c0d430d0836e83cb6fee7f860033530fb50f4 WatchSource:0}: Error finding container e82745b54ff19ba21858cf07461c0d430d0836e83cb6fee7f860033530fb50f4: Status 404 returned error can't find the container with id e82745b54ff19ba21858cf07461c0d430d0836e83cb6fee7f860033530fb50f4
Apr 20 22:33:58.114920 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:58.114876 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-598d5cf9b8-52sdn" event={"ID":"2aabe1ae-a61a-41c9-9e17-01343380dc94","Type":"ContainerStarted","Data":"e82745b54ff19ba21858cf07461c0d430d0836e83cb6fee7f860033530fb50f4"}
Apr 20 22:33:59.120705 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:59.120671 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-598d5cf9b8-52sdn" event={"ID":"2aabe1ae-a61a-41c9-9e17-01343380dc94","Type":"ContainerStarted","Data":"f37ee5b0cd98bced2544569f5ac53a30bb65ce25542944a1b15fa8faefe75f8c"}
Apr 20 22:33:59.121194 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:59.120767 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-598d5cf9b8-52sdn"
Apr 20 22:33:59.139650 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:33:59.139597 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-598d5cf9b8-52sdn" podStartSLOduration=2.095913626 podStartE2EDuration="3.139583562s" podCreationTimestamp="2026-04-20 22:33:56 +0000 UTC" firstStartedPulling="2026-04-20 22:33:57.54870572 +0000 UTC m=+610.501660317" lastFinishedPulling="2026-04-20 22:33:58.592375652 +0000 UTC m=+611.545330253" observedRunningTime="2026-04-20 22:33:59.137728346 +0000 UTC m=+612.090682979" watchObservedRunningTime="2026-04-20 22:33:59.139583562 +0000 UTC m=+612.092538220"
Apr 20 22:34:05.129234 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:05.129206 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-598d5cf9b8-52sdn"
Apr 20 22:34:06.113695 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:06.113659 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6d4c8f55f9-2gtgz"
Apr 20 22:34:06.119242 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:06.119214 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6bf6f495b6-4m7cd"
Apr 20 22:34:06.170339 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:06.170307 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-2gtgz"]
Apr 20 22:34:06.170734 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:06.170492 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-2gtgz" podUID="653f7faf-039e-42de-a447-daeda6684e27" containerName="manager" containerID="cri-o://49d406a0e002e6bd91dfb1dfc79931cf70c7913feaee3ca626abd41b4b28876b" gracePeriod=10
Apr 20 22:34:06.407086 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:06.407064 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-2gtgz"
Apr 20 22:34:06.468922 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:06.468884 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7f4779679f-mtglr"]
Apr 20 22:34:06.469648 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:06.469625 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="653f7faf-039e-42de-a447-daeda6684e27" containerName="manager"
Apr 20 22:34:06.469648 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:06.469647 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="653f7faf-039e-42de-a447-daeda6684e27" containerName="manager"
Apr 20 22:34:06.469793 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:06.469731 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="653f7faf-039e-42de-a447-daeda6684e27" containerName="manager"
Apr 20 22:34:06.472974 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:06.472958 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7f4779679f-mtglr"
Apr 20 22:34:06.479896 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:06.479874 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7f4779679f-mtglr"]
Apr 20 22:34:06.519694 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:06.519653 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-822mf\" (UniqueName: \"kubernetes.io/projected/653f7faf-039e-42de-a447-daeda6684e27-kube-api-access-822mf\") pod \"653f7faf-039e-42de-a447-daeda6684e27\" (UID: \"653f7faf-039e-42de-a447-daeda6684e27\") "
Apr 20 22:34:06.519971 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:06.519949 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfftx\" (UniqueName: \"kubernetes.io/projected/f83d457f-aa83-49f2-9a34-7b1c5f3ec787-kube-api-access-lfftx\") pod \"maas-controller-7f4779679f-mtglr\" (UID: \"f83d457f-aa83-49f2-9a34-7b1c5f3ec787\") " pod="opendatahub/maas-controller-7f4779679f-mtglr"
Apr 20 22:34:06.521891 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:06.521847 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/653f7faf-039e-42de-a447-daeda6684e27-kube-api-access-822mf" (OuterVolumeSpecName: "kube-api-access-822mf") pod "653f7faf-039e-42de-a447-daeda6684e27" (UID: "653f7faf-039e-42de-a447-daeda6684e27"). InnerVolumeSpecName "kube-api-access-822mf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 22:34:06.621309 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:06.621258 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfftx\" (UniqueName: \"kubernetes.io/projected/f83d457f-aa83-49f2-9a34-7b1c5f3ec787-kube-api-access-lfftx\") pod \"maas-controller-7f4779679f-mtglr\" (UID: \"f83d457f-aa83-49f2-9a34-7b1c5f3ec787\") " pod="opendatahub/maas-controller-7f4779679f-mtglr"
Apr 20 22:34:06.621482 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:06.621347 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-822mf\" (UniqueName: \"kubernetes.io/projected/653f7faf-039e-42de-a447-daeda6684e27-kube-api-access-822mf\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\""
Apr 20 22:34:06.630154 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:06.630128 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfftx\" (UniqueName: \"kubernetes.io/projected/f83d457f-aa83-49f2-9a34-7b1c5f3ec787-kube-api-access-lfftx\") pod \"maas-controller-7f4779679f-mtglr\" (UID: \"f83d457f-aa83-49f2-9a34-7b1c5f3ec787\") " pod="opendatahub/maas-controller-7f4779679f-mtglr"
Apr 20 22:34:06.784593 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:06.784557 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7f4779679f-mtglr"
Apr 20 22:34:06.907564 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:06.907539 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7f4779679f-mtglr"]
Apr 20 22:34:06.909454 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:34:06.909425 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf83d457f_aa83_49f2_9a34_7b1c5f3ec787.slice/crio-8f10e62b0313f627b8717a8b80bde9a929144cb4badb0a2da2b37a4bb41cc99e WatchSource:0}: Error finding container 8f10e62b0313f627b8717a8b80bde9a929144cb4badb0a2da2b37a4bb41cc99e: Status 404 returned error can't find the container with id 8f10e62b0313f627b8717a8b80bde9a929144cb4badb0a2da2b37a4bb41cc99e
Apr 20 22:34:07.154988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:07.154900 2573 generic.go:358] "Generic (PLEG): container finished" podID="653f7faf-039e-42de-a447-daeda6684e27" containerID="49d406a0e002e6bd91dfb1dfc79931cf70c7913feaee3ca626abd41b4b28876b" exitCode=0
Apr 20 22:34:07.154988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:07.154955 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-2gtgz"
Apr 20 22:34:07.154988 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:07.154971 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-2gtgz" event={"ID":"653f7faf-039e-42de-a447-daeda6684e27","Type":"ContainerDied","Data":"49d406a0e002e6bd91dfb1dfc79931cf70c7913feaee3ca626abd41b4b28876b"}
Apr 20 22:34:07.155289 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:07.155009 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-2gtgz" event={"ID":"653f7faf-039e-42de-a447-daeda6684e27","Type":"ContainerDied","Data":"7f07440f58a2e08ee663d3d66b4be5877fa6a17a12d5e70cb8f0e09ce7144b85"}
Apr 20 22:34:07.155289 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:07.155027 2573 scope.go:117] "RemoveContainer" containerID="49d406a0e002e6bd91dfb1dfc79931cf70c7913feaee3ca626abd41b4b28876b"
Apr 20 22:34:07.156467 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:07.156426 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f4779679f-mtglr" event={"ID":"f83d457f-aa83-49f2-9a34-7b1c5f3ec787","Type":"ContainerStarted","Data":"8f10e62b0313f627b8717a8b80bde9a929144cb4badb0a2da2b37a4bb41cc99e"}
Apr 20 22:34:07.164552 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:07.164508 2573 scope.go:117] "RemoveContainer" containerID="49d406a0e002e6bd91dfb1dfc79931cf70c7913feaee3ca626abd41b4b28876b"
Apr 20 22:34:07.164817 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:34:07.164789 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49d406a0e002e6bd91dfb1dfc79931cf70c7913feaee3ca626abd41b4b28876b\": container with ID starting with 49d406a0e002e6bd91dfb1dfc79931cf70c7913feaee3ca626abd41b4b28876b not found: ID does not exist" containerID="49d406a0e002e6bd91dfb1dfc79931cf70c7913feaee3ca626abd41b4b28876b"
Apr 20 22:34:07.164949 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:07.164824 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49d406a0e002e6bd91dfb1dfc79931cf70c7913feaee3ca626abd41b4b28876b"} err="failed to get container status \"49d406a0e002e6bd91dfb1dfc79931cf70c7913feaee3ca626abd41b4b28876b\": rpc error: code = NotFound desc = could not find container \"49d406a0e002e6bd91dfb1dfc79931cf70c7913feaee3ca626abd41b4b28876b\": container with ID starting with 49d406a0e002e6bd91dfb1dfc79931cf70c7913feaee3ca626abd41b4b28876b not found: ID does not exist"
Apr 20 22:34:07.177781 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:07.177760 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-2gtgz"]
Apr 20 22:34:07.183030 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:07.183010 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-2gtgz"]
Apr 20 22:34:07.601807 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:07.601775 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="653f7faf-039e-42de-a447-daeda6684e27" path="/var/lib/kubelet/pods/653f7faf-039e-42de-a447-daeda6684e27/volumes"
Apr 20 22:34:08.162092 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:08.162058 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7f4779679f-mtglr" event={"ID":"f83d457f-aa83-49f2-9a34-7b1c5f3ec787","Type":"ContainerStarted","Data":"f3f7fe348838da179025c676349a3f0bd1abac95b53ef4414c3e519e3a2f0d37"}
Apr 20 22:34:08.162259 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:08.162240 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7f4779679f-mtglr"
Apr 20 22:34:08.178632 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:08.178578 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7f4779679f-mtglr" podStartSLOduration=1.839210212 podStartE2EDuration="2.178567499s" podCreationTimestamp="2026-04-20 22:34:06 +0000 UTC" firstStartedPulling="2026-04-20 22:34:06.910807175 +0000 UTC m=+619.863761775" lastFinishedPulling="2026-04-20 22:34:07.250164466 +0000 UTC m=+620.203119062" observedRunningTime="2026-04-20 22:34:08.177310926 +0000 UTC m=+621.130265546" watchObservedRunningTime="2026-04-20 22:34:08.178567499 +0000 UTC m=+621.131522118"
Apr 20 22:34:19.170809 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:19.170778 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7f4779679f-mtglr"
Apr 20 22:34:19.213123 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:19.213088 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6bf6f495b6-4m7cd"]
Apr 20 22:34:19.213419 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:19.213393 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6bf6f495b6-4m7cd" podUID="25788b1d-1682-4b2f-865f-b0a7f5652cde" containerName="manager" containerID="cri-o://e4fdb91fdfd699c8a5aaeae6e45ec2880a0b800bb795272c73c083474bed1790" gracePeriod=10
Apr 20 22:34:19.457392 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:19.457369 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6bf6f495b6-4m7cd"
Apr 20 22:34:19.539500 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:19.539470 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qcg6\" (UniqueName: \"kubernetes.io/projected/25788b1d-1682-4b2f-865f-b0a7f5652cde-kube-api-access-4qcg6\") pod \"25788b1d-1682-4b2f-865f-b0a7f5652cde\" (UID: \"25788b1d-1682-4b2f-865f-b0a7f5652cde\") "
Apr 20 22:34:19.541540 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:19.541507 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25788b1d-1682-4b2f-865f-b0a7f5652cde-kube-api-access-4qcg6" (OuterVolumeSpecName: "kube-api-access-4qcg6") pod "25788b1d-1682-4b2f-865f-b0a7f5652cde" (UID: "25788b1d-1682-4b2f-865f-b0a7f5652cde"). InnerVolumeSpecName "kube-api-access-4qcg6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 22:34:19.641003 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:19.640972 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4qcg6\" (UniqueName: \"kubernetes.io/projected/25788b1d-1682-4b2f-865f-b0a7f5652cde-kube-api-access-4qcg6\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\""
Apr 20 22:34:20.207990 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:20.207953 2573 generic.go:358] "Generic (PLEG): container finished" podID="25788b1d-1682-4b2f-865f-b0a7f5652cde" containerID="e4fdb91fdfd699c8a5aaeae6e45ec2880a0b800bb795272c73c083474bed1790" exitCode=0
Apr 20 22:34:20.208389 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:20.208050 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6bf6f495b6-4m7cd"
Apr 20 22:34:20.208389 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:20.208051 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6bf6f495b6-4m7cd" event={"ID":"25788b1d-1682-4b2f-865f-b0a7f5652cde","Type":"ContainerDied","Data":"e4fdb91fdfd699c8a5aaeae6e45ec2880a0b800bb795272c73c083474bed1790"}
Apr 20 22:34:20.208389 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:20.208104 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6bf6f495b6-4m7cd" event={"ID":"25788b1d-1682-4b2f-865f-b0a7f5652cde","Type":"ContainerDied","Data":"570f246a411d6010af57b88f8ea269089793267103a26a8627856262ccd1b3c1"}
Apr 20 22:34:20.208389 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:20.208130 2573 scope.go:117] "RemoveContainer" containerID="e4fdb91fdfd699c8a5aaeae6e45ec2880a0b800bb795272c73c083474bed1790"
Apr 20 22:34:20.217420 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:20.217206 2573 scope.go:117] "RemoveContainer" containerID="e4fdb91fdfd699c8a5aaeae6e45ec2880a0b800bb795272c73c083474bed1790"
Apr 20 22:34:20.217510 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:34:20.217485 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4fdb91fdfd699c8a5aaeae6e45ec2880a0b800bb795272c73c083474bed1790\": container with ID starting with e4fdb91fdfd699c8a5aaeae6e45ec2880a0b800bb795272c73c083474bed1790 not found: ID does not exist" containerID="e4fdb91fdfd699c8a5aaeae6e45ec2880a0b800bb795272c73c083474bed1790"
Apr 20 22:34:20.217554 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:20.217527 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4fdb91fdfd699c8a5aaeae6e45ec2880a0b800bb795272c73c083474bed1790"} err="failed to get container status \"e4fdb91fdfd699c8a5aaeae6e45ec2880a0b800bb795272c73c083474bed1790\": rpc error: code = NotFound desc = could not find container \"e4fdb91fdfd699c8a5aaeae6e45ec2880a0b800bb795272c73c083474bed1790\": container with ID starting with e4fdb91fdfd699c8a5aaeae6e45ec2880a0b800bb795272c73c083474bed1790 not found: ID does not exist"
Apr 20 22:34:20.225725 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:20.225703 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6bf6f495b6-4m7cd"]
Apr 20 22:34:20.230431 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:20.230409 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6bf6f495b6-4m7cd"]
Apr 20 22:34:21.601474 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:21.601441 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25788b1d-1682-4b2f-865f-b0a7f5652cde" path="/var/lib/kubelet/pods/25788b1d-1682-4b2f-865f-b0a7f5652cde/volumes"
Apr 20 22:34:23.183327 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:23.183293 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-zjrpw_9e8fe991-f30a-4442-b41c-1e04e82e2fd8/manager/0.log"
Apr 20 22:34:23.292680 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:23.292649 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-598d5cf9b8-52sdn_2aabe1ae-a61a-41c9-9e17-01343380dc94/maas-api/0.log"
Apr 20 22:34:23.402350 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:23.402320 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-7f4779679f-mtglr_f83d457f-aa83-49f2-9a34-7b1c5f3ec787/manager/0.log"
Apr 20 22:34:23.896242 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:23.896213 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5d8d569d47-w48jf_965a673a-9e44-490c-8dfa-b522b1bebe78/manager/0.log"
Apr 20 22:34:25.398845 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:25.398811 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-nb7xc_b43b525d-634f-421c-9fc2-19aed38025cd/manager/0.log"
Apr 20 22:34:25.508637 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:25.508610 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-zbj5h_97af97ae-77b9-4101-83c3-59bc855f235e/manager/0.log"
Apr 20 22:34:25.732774 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:25.732684 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-xjm2q_f9614d14-fcc6-4518-8ae9-a96316e3b111/registry-server/0.log"
Apr 20 22:34:25.852344 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:25.852313 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-t9fwn_65057b6c-ab07-4e47-bddc-77b868f3d0bd/manager/0.log"
Apr 20 22:34:26.084506 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:26.084476 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-j5dk5_f246becd-26e3-46c7-92a3-af2a74f5c1be/manager/0.log"
Apr 20 22:34:26.449035 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:26.448955 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4_2b080c9d-5122-42a5-bb9a-83082cadae1b/istio-proxy/0.log"
Apr 20 22:34:26.791971 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:26.791940 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7755c94fdf-r68sk_23c619f3-1c6c-439d-8c08-21fb43ee960e/kube-auth-proxy/0.log"
Apr 20 22:34:27.021024 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:27.020996 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-84455b6c98-44svx_7c99a639-1f48-429a-a14e-800ce227becb/router/0.log"
Apr 20 22:34:34.088983 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:34.088951 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-5m4dx_29a8ab11-5eff-49a0-b910-dbb219ffd462/global-pull-secret-syncer/0.log"
Apr 20 22:34:34.212126 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:34.212095 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-n2bhm_42a32769-748f-43a7-95c5-8aea7b36621e/konnectivity-agent/0.log"
Apr 20 22:34:34.286953 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:34.286922 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-91.ec2.internal_0d7985ee616f47b485ff61a5cec01dca/haproxy/0.log"
Apr 20 22:34:38.613915 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:38.613878 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-nb7xc_b43b525d-634f-421c-9fc2-19aed38025cd/manager/0.log"
Apr 20 22:34:38.637264 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:38.637236 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-zbj5h_97af97ae-77b9-4101-83c3-59bc855f235e/manager/0.log"
Apr 20 22:34:38.689316 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:38.689286 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-xjm2q_f9614d14-fcc6-4518-8ae9-a96316e3b111/registry-server/0.log"
Apr 20 22:34:38.720974 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:38.720940 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-t9fwn_65057b6c-ab07-4e47-bddc-77b868f3d0bd/manager/0.log"
Apr 20 22:34:38.776957 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:38.776916 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-j5dk5_f246becd-26e3-46c7-92a3-af2a74f5c1be/manager/0.log"
Apr 20 22:34:39.680771 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:39.680737 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-5867446944-kcmnt"]
Apr 20 22:34:39.681345 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:39.681245 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25788b1d-1682-4b2f-865f-b0a7f5652cde" containerName="manager"
Apr 20 22:34:39.681345 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:39.681266 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="25788b1d-1682-4b2f-865f-b0a7f5652cde" containerName="manager"
Apr 20 22:34:39.681467 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:39.681397 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="25788b1d-1682-4b2f-865f-b0a7f5652cde" containerName="manager"
Apr 20 22:34:39.684599 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:39.684579 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-5867446944-kcmnt"
Apr 20 22:34:39.691791 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:39.691766 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5867446944-kcmnt"]
Apr 20 22:34:39.818054 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:39.818017 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/bd6f976f-d1a4-46df-8d30-6537bd6c2d8f-maas-api-tls\") pod \"maas-api-5867446944-kcmnt\" (UID: \"bd6f976f-d1a4-46df-8d30-6537bd6c2d8f\") " pod="opendatahub/maas-api-5867446944-kcmnt"
Apr 20 22:34:39.818054 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:39.818065 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsg85\" (UniqueName: \"kubernetes.io/projected/bd6f976f-d1a4-46df-8d30-6537bd6c2d8f-kube-api-access-vsg85\") pod \"maas-api-5867446944-kcmnt\" (UID: \"bd6f976f-d1a4-46df-8d30-6537bd6c2d8f\") " pod="opendatahub/maas-api-5867446944-kcmnt"
Apr 20 22:34:39.918947 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:39.918905 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/bd6f976f-d1a4-46df-8d30-6537bd6c2d8f-maas-api-tls\") pod \"maas-api-5867446944-kcmnt\" (UID: \"bd6f976f-d1a4-46df-8d30-6537bd6c2d8f\") " pod="opendatahub/maas-api-5867446944-kcmnt"
Apr 20 22:34:39.918947 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:39.918953 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsg85\" (UniqueName: \"kubernetes.io/projected/bd6f976f-d1a4-46df-8d30-6537bd6c2d8f-kube-api-access-vsg85\") pod \"maas-api-5867446944-kcmnt\" (UID: \"bd6f976f-d1a4-46df-8d30-6537bd6c2d8f\") " pod="opendatahub/maas-api-5867446944-kcmnt"
Apr 20 22:34:39.921295 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:39.921276 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/bd6f976f-d1a4-46df-8d30-6537bd6c2d8f-maas-api-tls\") pod \"maas-api-5867446944-kcmnt\" (UID: \"bd6f976f-d1a4-46df-8d30-6537bd6c2d8f\") " pod="opendatahub/maas-api-5867446944-kcmnt"
Apr 20 22:34:39.927242 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:39.927215 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsg85\" (UniqueName: \"kubernetes.io/projected/bd6f976f-d1a4-46df-8d30-6537bd6c2d8f-kube-api-access-vsg85\") pod \"maas-api-5867446944-kcmnt\" (UID: \"bd6f976f-d1a4-46df-8d30-6537bd6c2d8f\") " pod="opendatahub/maas-api-5867446944-kcmnt"
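Every kube-api-access-* volume in this log is the same kubelet-managed projected volume: a bound service-account token plus the cluster CA, injected automatically into each pod. A sketch of what such a volume carries when written out (the 3607 s expiry follows the upstream admission default; the values are illustrative, not read from this node):

    package main

    import (
    	"fmt"

    	corev1 "k8s.io/api/core/v1"
    )

    func main() {
    	expiry := int64(3607) // upstream default; assumed here
    	vol := corev1.Volume{
    		Name: "kube-api-access-vsg85",
    		VolumeSource: corev1.VolumeSource{
    			Projected: &corev1.ProjectedVolumeSource{
    				Sources: []corev1.VolumeProjection{
    					// Short-lived, auto-rotated service-account token.
    					{ServiceAccountToken: &corev1.ServiceAccountTokenProjection{Path: "token", ExpirationSeconds: &expiry}},
    					// Cluster CA bundle, from the kube-root-ca.crt ConfigMap
    					// whose cache population also appears in this log.
    					{ConfigMap: &corev1.ConfigMapProjection{
    						LocalObjectReference: corev1.LocalObjectReference{Name: "kube-root-ca.crt"},
    						Items:                []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
    					}},
    				},
    			},
    		},
    	}
    	fmt.Println(vol.Name)
    }
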
Need to start a new one" pod="opendatahub/maas-api-5867446944-kcmnt" Apr 20 22:34:40.135424 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:40.135400 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5867446944-kcmnt"] Apr 20 22:34:40.137592 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:34:40.137559 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd6f976f_d1a4_46df_8d30_6537bd6c2d8f.slice/crio-5edd74f6940845aa0ebcf7c2bf8bf4e141677ba0c3765716246d4f0fdb83b896 WatchSource:0}: Error finding container 5edd74f6940845aa0ebcf7c2bf8bf4e141677ba0c3765716246d4f0fdb83b896: Status 404 returned error can't find the container with id 5edd74f6940845aa0ebcf7c2bf8bf4e141677ba0c3765716246d4f0fdb83b896 Apr 20 22:34:40.138938 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:40.138919 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 22:34:40.291308 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:40.291276 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5867446944-kcmnt" event={"ID":"bd6f976f-d1a4-46df-8d30-6537bd6c2d8f","Type":"ContainerStarted","Data":"5edd74f6940845aa0ebcf7c2bf8bf4e141677ba0c3765716246d4f0fdb83b896"} Apr 20 22:34:40.365043 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:40.365010 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-lxvk8_c06378d7-946b-49c3-ac21-44605e27cdd5/cluster-monitoring-operator/0.log" Apr 20 22:34:40.617096 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:40.617015 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lrmr7_607764aa-7c69-4da7-94af-dd7a16161e14/node-exporter/0.log" Apr 20 22:34:40.639287 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:40.639264 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lrmr7_607764aa-7c69-4da7-94af-dd7a16161e14/kube-rbac-proxy/0.log" Apr 20 22:34:40.664453 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:40.664431 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lrmr7_607764aa-7c69-4da7-94af-dd7a16161e14/init-textfile/0.log" Apr 20 22:34:41.016280 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:41.016246 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-djhcv_711c7d6d-c6dc-41fb-bd61-56110cca941e/prometheus-operator/0.log" Apr 20 22:34:41.035685 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:41.035646 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-djhcv_711c7d6d-c6dc-41fb-bd61-56110cca941e/kube-rbac-proxy/0.log" Apr 20 22:34:41.059641 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:41.059606 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-vqj8t_790e03ca-74f0-4b4c-8111-f962f1503d6f/prometheus-operator-admission-webhook/0.log" Apr 20 22:34:42.296078 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.296042 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-4fd8s_51e93c8f-5d21-4964-8b01-3ddb5f2e5c86/networking-console-plugin/0.log" Apr 20 22:34:42.300422 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.300389 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5867446944-kcmnt" event={"ID":"bd6f976f-d1a4-46df-8d30-6537bd6c2d8f","Type":"ContainerStarted","Data":"c59423d8d617fa37c86c28f2b2d9ff3a2e7dd54967b3bf84f6c89c6e41b39151"} Apr 20 22:34:42.300565 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.300442 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-5867446944-kcmnt" Apr 20 22:34:42.316086 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.316047 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-5867446944-kcmnt" podStartSLOduration=1.869354608 podStartE2EDuration="3.316033461s" podCreationTimestamp="2026-04-20 22:34:39 +0000 UTC" firstStartedPulling="2026-04-20 22:34:40.139095611 +0000 UTC m=+653.092050209" lastFinishedPulling="2026-04-20 22:34:41.585774466 +0000 UTC m=+654.538729062" observedRunningTime="2026-04-20 22:34:42.31494611 +0000 UTC m=+655.267900731" watchObservedRunningTime="2026-04-20 22:34:42.316033461 +0000 UTC m=+655.268988080" Apr 20 22:34:42.773422 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.773388 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p"] Apr 20 22:34:42.781011 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.779354 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" Apr 20 22:34:42.782524 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.782483 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5wqp7\"/\"default-dockercfg-4qwxg\"" Apr 20 22:34:42.782686 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.782539 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5wqp7\"/\"kube-root-ca.crt\"" Apr 20 22:34:42.782686 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.782655 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5wqp7\"/\"openshift-service-ca.crt\"" Apr 20 22:34:42.784593 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.784566 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p"] Apr 20 22:34:42.840621 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.840588 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjs27\" (UniqueName: \"kubernetes.io/projected/69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7-kube-api-access-qjs27\") pod \"perf-node-gather-daemonset-rwr2p\" (UID: \"69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" Apr 20 22:34:42.840802 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.840632 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7-podres\") pod \"perf-node-gather-daemonset-rwr2p\" (UID: \"69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" Apr 20 22:34:42.840802 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.840700 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7-sys\") pod 
\"perf-node-gather-daemonset-rwr2p\" (UID: \"69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" Apr 20 22:34:42.840802 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.840753 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7-lib-modules\") pod \"perf-node-gather-daemonset-rwr2p\" (UID: \"69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" Apr 20 22:34:42.840960 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.840804 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7-proc\") pod \"perf-node-gather-daemonset-rwr2p\" (UID: \"69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" Apr 20 22:34:42.863726 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.863695 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xnj54_4c475df6-d751-4f10-81c7-a1e56dec9176/console-operator/1.log" Apr 20 22:34:42.871910 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.871886 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-xnj54_4c475df6-d751-4f10-81c7-a1e56dec9176/console-operator/2.log" Apr 20 22:34:42.941259 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.941223 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7-podres\") pod \"perf-node-gather-daemonset-rwr2p\" (UID: \"69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" Apr 20 22:34:42.941507 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.941284 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7-sys\") pod \"perf-node-gather-daemonset-rwr2p\" (UID: \"69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" Apr 20 22:34:42.941507 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.941316 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7-lib-modules\") pod \"perf-node-gather-daemonset-rwr2p\" (UID: \"69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" Apr 20 22:34:42.941507 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.941352 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7-proc\") pod \"perf-node-gather-daemonset-rwr2p\" (UID: \"69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" Apr 20 22:34:42.941507 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.941374 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjs27\" (UniqueName: \"kubernetes.io/projected/69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7-kube-api-access-qjs27\") pod 
\"perf-node-gather-daemonset-rwr2p\" (UID: \"69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" Apr 20 22:34:42.941507 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.941406 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7-podres\") pod \"perf-node-gather-daemonset-rwr2p\" (UID: \"69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" Apr 20 22:34:42.941507 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.941412 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7-sys\") pod \"perf-node-gather-daemonset-rwr2p\" (UID: \"69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" Apr 20 22:34:42.941507 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.941455 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7-proc\") pod \"perf-node-gather-daemonset-rwr2p\" (UID: \"69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" Apr 20 22:34:42.941507 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.941484 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7-lib-modules\") pod \"perf-node-gather-daemonset-rwr2p\" (UID: \"69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" Apr 20 22:34:42.950046 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:42.950020 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjs27\" (UniqueName: \"kubernetes.io/projected/69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7-kube-api-access-qjs27\") pod \"perf-node-gather-daemonset-rwr2p\" (UID: \"69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7\") " pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" Apr 20 22:34:43.095746 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:43.095657 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" Apr 20 22:34:43.221961 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:43.221933 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p"] Apr 20 22:34:43.224019 ip-10-0-130-91 kubenswrapper[2573]: W0420 22:34:43.223990 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod69d3e6da_f7c6_4cd0_bc71_aa4ddaa28be7.slice/crio-6a824e686a0c9cfec5834ae1a28a1fa3f0a5653dfbf4163db5da1cf351ea4b1b WatchSource:0}: Error finding container 6a824e686a0c9cfec5834ae1a28a1fa3f0a5653dfbf4163db5da1cf351ea4b1b: Status 404 returned error can't find the container with id 6a824e686a0c9cfec5834ae1a28a1fa3f0a5653dfbf4163db5da1cf351ea4b1b Apr 20 22:34:43.304540 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:43.304499 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" event={"ID":"69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7","Type":"ContainerStarted","Data":"6a824e686a0c9cfec5834ae1a28a1fa3f0a5653dfbf4163db5da1cf351ea4b1b"} Apr 20 22:34:43.396479 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:43.396414 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-h8q5v_27356a39-9ad0-4501-9cca-fbecf0cc9aa9/download-server/0.log" Apr 20 22:34:43.887204 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:43.887168 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-rfgkr_7c73411b-82b9-42c4-bbc3-edc35e00606d/volume-data-source-validator/0.log" Apr 20 22:34:44.309346 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:44.309310 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" event={"ID":"69d3e6da-f7c6-4cd0-bc71-aa4ddaa28be7","Type":"ContainerStarted","Data":"fd62e1538322a7a9e7fdd9a3d548c5a2a9499ec1ad476a2ecd3bc8da38735878"} Apr 20 22:34:44.309811 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:44.309370 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" Apr 20 22:34:44.327432 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:44.327390 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" podStartSLOduration=2.327375623 podStartE2EDuration="2.327375623s" podCreationTimestamp="2026-04-20 22:34:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 22:34:44.325035765 +0000 UTC m=+657.277990384" watchObservedRunningTime="2026-04-20 22:34:44.327375623 +0000 UTC m=+657.280330243" Apr 20 22:34:44.809470 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:44.809441 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fj4gp_e4ed632e-0c77-4b80-b076-66bdfd17da84/dns/0.log" Apr 20 22:34:44.829072 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:44.829049 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fj4gp_e4ed632e-0c77-4b80-b076-66bdfd17da84/kube-rbac-proxy/0.log" Apr 20 22:34:44.849990 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:44.849964 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-56xnh_3533f633-4984-403c-9826-8812fe861cca/dns-node-resolver/0.log" Apr 20 22:34:45.470419 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:45.470385 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-m4npw_1d341b24-8cdf-4d59-a97c-54cecc195860/node-ca/0.log" Apr 20 22:34:46.310508 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:46.310477 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfp8kq4_2b080c9d-5122-42a5-bb9a-83082cadae1b/istio-proxy/0.log" Apr 20 22:34:46.412442 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:46.412416 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-7755c94fdf-r68sk_23c619f3-1c6c-439d-8c08-21fb43ee960e/kube-auth-proxy/0.log" Apr 20 22:34:46.462502 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:46.462465 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-84455b6c98-44svx_7c99a639-1f48-429a-a14e-800ce227becb/router/0.log" Apr 20 22:34:47.005304 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:47.005270 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-d5s8x_82357e1f-f9a8-4cf7-b3dd-fe77912c49a1/serve-healthcheck-canary/0.log" Apr 20 22:34:47.464545 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:47.464507 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-7n7hh_bb95d71c-3b6d-407a-9ff3-a70562af1b93/insights-operator/0.log" Apr 20 22:34:47.467564 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:47.467540 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-7n7hh_bb95d71c-3b6d-407a-9ff3-a70562af1b93/insights-operator/1.log" Apr 20 22:34:47.488152 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:47.488127 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8bv96_ce2b0f82-932f-457f-bd81-3a5c0a321390/kube-rbac-proxy/0.log" Apr 20 22:34:47.507331 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:47.507305 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8bv96_ce2b0f82-932f-457f-bd81-3a5c0a321390/exporter/0.log" Apr 20 22:34:47.528674 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:47.528646 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8bv96_ce2b0f82-932f-457f-bd81-3a5c0a321390/extractor/0.log" Apr 20 22:34:48.310167 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:48.310139 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-5867446944-kcmnt" Apr 20 22:34:48.356105 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:48.356075 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-598d5cf9b8-52sdn"] Apr 20 22:34:48.356340 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:48.356311 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-598d5cf9b8-52sdn" podUID="2aabe1ae-a61a-41c9-9e17-01343380dc94" containerName="maas-api" containerID="cri-o://f37ee5b0cd98bced2544569f5ac53a30bb65ce25542944a1b15fa8faefe75f8c" gracePeriod=30 Apr 20 22:34:48.613000 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:48.612982 2573 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="opendatahub/maas-api-598d5cf9b8-52sdn" Apr 20 22:34:48.694902 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:48.694830 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/2aabe1ae-a61a-41c9-9e17-01343380dc94-maas-api-tls\") pod \"2aabe1ae-a61a-41c9-9e17-01343380dc94\" (UID: \"2aabe1ae-a61a-41c9-9e17-01343380dc94\") " Apr 20 22:34:48.695069 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:48.694930 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbrjl\" (UniqueName: \"kubernetes.io/projected/2aabe1ae-a61a-41c9-9e17-01343380dc94-kube-api-access-tbrjl\") pod \"2aabe1ae-a61a-41c9-9e17-01343380dc94\" (UID: \"2aabe1ae-a61a-41c9-9e17-01343380dc94\") " Apr 20 22:34:48.697003 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:48.696978 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aabe1ae-a61a-41c9-9e17-01343380dc94-kube-api-access-tbrjl" (OuterVolumeSpecName: "kube-api-access-tbrjl") pod "2aabe1ae-a61a-41c9-9e17-01343380dc94" (UID: "2aabe1ae-a61a-41c9-9e17-01343380dc94"). InnerVolumeSpecName "kube-api-access-tbrjl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 22:34:48.697108 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:48.697053 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aabe1ae-a61a-41c9-9e17-01343380dc94-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "2aabe1ae-a61a-41c9-9e17-01343380dc94" (UID: "2aabe1ae-a61a-41c9-9e17-01343380dc94"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 22:34:48.796107 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:48.796069 2573 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/2aabe1ae-a61a-41c9-9e17-01343380dc94-maas-api-tls\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\"" Apr 20 22:34:48.796107 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:48.796099 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tbrjl\" (UniqueName: \"kubernetes.io/projected/2aabe1ae-a61a-41c9-9e17-01343380dc94-kube-api-access-tbrjl\") on node \"ip-10-0-130-91.ec2.internal\" DevicePath \"\"" Apr 20 22:34:49.328792 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:49.328756 2573 generic.go:358] "Generic (PLEG): container finished" podID="2aabe1ae-a61a-41c9-9e17-01343380dc94" containerID="f37ee5b0cd98bced2544569f5ac53a30bb65ce25542944a1b15fa8faefe75f8c" exitCode=0 Apr 20 22:34:49.329214 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:49.328821 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-598d5cf9b8-52sdn" Apr 20 22:34:49.329214 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:49.328833 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-598d5cf9b8-52sdn" event={"ID":"2aabe1ae-a61a-41c9-9e17-01343380dc94","Type":"ContainerDied","Data":"f37ee5b0cd98bced2544569f5ac53a30bb65ce25542944a1b15fa8faefe75f8c"} Apr 20 22:34:49.329214 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:49.328885 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-598d5cf9b8-52sdn" event={"ID":"2aabe1ae-a61a-41c9-9e17-01343380dc94","Type":"ContainerDied","Data":"e82745b54ff19ba21858cf07461c0d430d0836e83cb6fee7f860033530fb50f4"} Apr 20 22:34:49.329214 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:49.328901 2573 scope.go:117] "RemoveContainer" containerID="f37ee5b0cd98bced2544569f5ac53a30bb65ce25542944a1b15fa8faefe75f8c" Apr 20 22:34:49.337985 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:49.337920 2573 scope.go:117] "RemoveContainer" containerID="f37ee5b0cd98bced2544569f5ac53a30bb65ce25542944a1b15fa8faefe75f8c" Apr 20 22:34:49.338190 ip-10-0-130-91 kubenswrapper[2573]: E0420 22:34:49.338168 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f37ee5b0cd98bced2544569f5ac53a30bb65ce25542944a1b15fa8faefe75f8c\": container with ID starting with f37ee5b0cd98bced2544569f5ac53a30bb65ce25542944a1b15fa8faefe75f8c not found: ID does not exist" containerID="f37ee5b0cd98bced2544569f5ac53a30bb65ce25542944a1b15fa8faefe75f8c" Apr 20 22:34:49.338237 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:49.338198 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f37ee5b0cd98bced2544569f5ac53a30bb65ce25542944a1b15fa8faefe75f8c"} err="failed to get container status \"f37ee5b0cd98bced2544569f5ac53a30bb65ce25542944a1b15fa8faefe75f8c\": rpc error: code = NotFound desc = could not find container \"f37ee5b0cd98bced2544569f5ac53a30bb65ce25542944a1b15fa8faefe75f8c\": container with ID starting with f37ee5b0cd98bced2544569f5ac53a30bb65ce25542944a1b15fa8faefe75f8c not found: ID does not exist" Apr 20 22:34:49.352500 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:49.352475 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-598d5cf9b8-52sdn"] Apr 20 22:34:49.355701 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:49.355680 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-598d5cf9b8-52sdn"] Apr 20 22:34:49.497220 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:49.497196 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-zjrpw_9e8fe991-f30a-4442-b41c-1e04e82e2fd8/manager/0.log" Apr 20 22:34:49.518165 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:49.518137 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-5867446944-kcmnt_bd6f976f-d1a4-46df-8d30-6537bd6c2d8f/maas-api/0.log" Apr 20 22:34:49.540038 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:49.540018 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-7f4779679f-mtglr_f83d457f-aa83-49f2-9a34-7b1c5f3ec787/manager/0.log" Apr 20 22:34:49.601718 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:49.601623 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aabe1ae-a61a-41c9-9e17-01343380dc94" 
path="/var/lib/kubelet/pods/2aabe1ae-a61a-41c9-9e17-01343380dc94/volumes" Apr 20 22:34:49.683749 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:49.683716 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5d8d569d47-w48jf_965a673a-9e44-490c-8dfa-b522b1bebe78/manager/0.log" Apr 20 22:34:50.323420 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:50.323393 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5wqp7/perf-node-gather-daemonset-rwr2p" Apr 20 22:34:55.482116 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:55.482076 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-9qgmc_e37007b0-9ddd-4f6c-90e7-b3a1dd501568/migrator/0.log" Apr 20 22:34:55.500778 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:55.500757 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-9qgmc_e37007b0-9ddd-4f6c-90e7-b3a1dd501568/graceful-termination/0.log" Apr 20 22:34:55.869438 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:55.869388 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-fzjvb_ea02661d-e4a4-469a-9451-7f11a7db90d2/kube-storage-version-migrator-operator/1.log" Apr 20 22:34:55.870556 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:55.870527 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-fzjvb_ea02661d-e4a4-469a-9451-7f11a7db90d2/kube-storage-version-migrator-operator/0.log" Apr 20 22:34:57.205539 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:57.205509 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qvjgp_d7ecb730-4be8-4cc2-86d1-47a71c9e25e7/kube-multus-additional-cni-plugins/0.log" Apr 20 22:34:57.225338 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:57.225305 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qvjgp_d7ecb730-4be8-4cc2-86d1-47a71c9e25e7/egress-router-binary-copy/0.log" Apr 20 22:34:57.246269 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:57.246242 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qvjgp_d7ecb730-4be8-4cc2-86d1-47a71c9e25e7/cni-plugins/0.log" Apr 20 22:34:57.265100 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:57.265071 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qvjgp_d7ecb730-4be8-4cc2-86d1-47a71c9e25e7/bond-cni-plugin/0.log" Apr 20 22:34:57.285170 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:57.285139 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qvjgp_d7ecb730-4be8-4cc2-86d1-47a71c9e25e7/routeoverride-cni/0.log" Apr 20 22:34:57.305973 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:57.305952 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qvjgp_d7ecb730-4be8-4cc2-86d1-47a71c9e25e7/whereabouts-cni-bincopy/0.log" Apr 20 22:34:57.325133 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:57.325108 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qvjgp_d7ecb730-4be8-4cc2-86d1-47a71c9e25e7/whereabouts-cni/0.log" Apr 20 22:34:57.365315 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:57.365286 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-q9f2s_5fe8d086-787c-4e3e-ac72-53b9ac48d390/kube-multus/0.log" Apr 20 22:34:57.510595 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:57.510503 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rl87j_75df7794-7926-4023-a9fe-c8bb08e18219/network-metrics-daemon/0.log" Apr 20 22:34:57.528238 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:57.528214 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-rl87j_75df7794-7926-4023-a9fe-c8bb08e18219/kube-rbac-proxy/0.log" Apr 20 22:34:58.376468 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:58.376437 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2trzq_bffb7e6c-ecd8-45cd-a238-8bbc21a4553b/ovn-controller/0.log" Apr 20 22:34:58.400126 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:58.400103 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2trzq_bffb7e6c-ecd8-45cd-a238-8bbc21a4553b/ovn-acl-logging/0.log" Apr 20 22:34:58.420980 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:58.420952 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2trzq_bffb7e6c-ecd8-45cd-a238-8bbc21a4553b/kube-rbac-proxy-node/0.log" Apr 20 22:34:58.449022 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:58.448999 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2trzq_bffb7e6c-ecd8-45cd-a238-8bbc21a4553b/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 22:34:58.465036 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:58.465012 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2trzq_bffb7e6c-ecd8-45cd-a238-8bbc21a4553b/northd/0.log" Apr 20 22:34:58.482798 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:58.482778 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2trzq_bffb7e6c-ecd8-45cd-a238-8bbc21a4553b/nbdb/0.log" Apr 20 22:34:58.501527 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:58.501510 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2trzq_bffb7e6c-ecd8-45cd-a238-8bbc21a4553b/sbdb/0.log" Apr 20 22:34:58.664242 ip-10-0-130-91 kubenswrapper[2573]: I0420 22:34:58.664162 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2trzq_bffb7e6c-ecd8-45cd-a238-8bbc21a4553b/ovnkube-controller/0.log"