Apr 16 22:04:55.627574 ip-10-0-130-26 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 22:04:56.026140 ip-10-0-130-26 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:04:56.026140 ip-10-0-130-26 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 22:04:56.026140 ip-10-0-130-26 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:04:56.026140 ip-10-0-130-26 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 22:04:56.026140 ip-10-0-130-26 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 22:04:56.028829 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.028746 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 22:04:56.032958 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.032938 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:04:56.032958 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.032955 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:04:56.032958 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.032959 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:04:56.032958 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.032962 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:04:56.032958 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.032965 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.032968 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.032971 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.032974 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.032977 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.032979 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.032982 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.032984 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.032987 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.032990 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.032995 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.032999 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033002 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033005 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033008 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033011 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033014 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033017 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033019 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033022 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:04:56.033143 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033025 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033027 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033031 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033039 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033042 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033045 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033047 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033050 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033052 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033055 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033058 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033060 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033063 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033065 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033068 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033071 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033074 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033077 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033079 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033082 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:04:56.033644 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033085 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033087 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033090 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033093 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033097 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033099 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033102 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033105 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033107 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033110 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033112 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033115 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033118 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033120 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033124 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033128 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033131 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033133 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033136 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033138 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:04:56.034135 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033141 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:04:56.034634 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033144 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:04:56.034634 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033146 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:04:56.034634 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033151 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:04:56.034634 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033155 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:04:56.034634 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033158 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:04:56.034634 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033161 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:04:56.034634 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033164 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:04:56.034634 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033166 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:04:56.034634 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033169 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:04:56.034634 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033171 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:04:56.034634 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033174 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:04:56.034634 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033177 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:04:56.034634 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033179 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:04:56.034634 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033182 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:04:56.034634 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033185 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:04:56.034634 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033188 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:04:56.034634 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033190 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:04:56.034634 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033193 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:04:56.034634 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033195 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:04:56.035085 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033198 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:04:56.035085 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.033200 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:04:56.035085 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034259 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 22:04:56.035085 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034266 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 22:04:56.035085 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034269 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 22:04:56.035085 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034272 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 22:04:56.035085 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034275 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 22:04:56.035085 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034278 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 22:04:56.035085 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034282 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 22:04:56.035085 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034286 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 22:04:56.035085 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034289 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 22:04:56.035085 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034292 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 22:04:56.035085 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034294 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 22:04:56.035085 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034297 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 22:04:56.035085 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034300 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 22:04:56.035085 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034318 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 22:04:56.035085 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034320 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 22:04:56.035085 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034323 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 22:04:56.035085 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034326 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034329 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034332 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034335 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034337 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034340 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034343 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034346 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034350 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034353 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034355 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034358 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034361 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034364 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034366 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034369 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034371 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034374 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034376 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034379 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 22:04:56.035568 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034381 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034384 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034386 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034391 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034394 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034397 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034400 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034403 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034406 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034408 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034411 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034414 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034417 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034419 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034422 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034424 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034427 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034430 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034432 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034434 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 22:04:56.036072 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034437 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034440 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034443 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034445 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034447 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034450 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034452 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034455 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034457 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034460 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034462 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034465 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034468 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034471 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034474 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034477 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034479 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034482 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034485 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034487 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 22:04:56.036581 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034491 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034493 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034496 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034498 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034502 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034504 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034507 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034509 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034512 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.034514 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035772 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035781 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035788 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035793 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035798 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035802 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035806 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035810 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035814 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035817 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035820 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 22:04:56.037093 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035823 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035826 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035830 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035833 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035836 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035839 2576 flags.go:64] FLAG: --cloud-config=""
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035842 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035844 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035848 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035851 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035854 2576 flags.go:64] FLAG: --config-dir=""
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035857 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035861 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035864 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035868 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035871 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035874 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035877 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035881 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035884 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035887 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035890 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035894 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035898 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035900 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 22:04:56.037620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035903 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035907 2576 flags.go:64] FLAG: --enable-server="true"
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035910 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035914 2576 flags.go:64] FLAG: --event-burst="100"
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035917 2576 flags.go:64] FLAG: --event-qps="50"
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035920 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035923 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035926 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035929 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035932 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035938 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035941 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035944 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035947 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035950 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035953 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035956 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035959 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035962 2576 flags.go:64] FLAG: --feature-gates=""
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035966 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035969 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035972 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035975 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035978 2576 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035982 2576 flags.go:64] FLAG: --help="false"
Apr 16 22:04:56.038205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035985 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-130-26.ec2.internal"
Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035988 2576 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: 
I0416 22:04:56.035991 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035994 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.035998 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036001 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036004 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036007 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036010 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036013 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036016 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036019 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036022 2576 flags.go:64] FLAG: --kube-reserved="" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036025 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036028 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036031 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 22:04:56.038853 
ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036034 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036038 2576 flags.go:64] FLAG: --lock-file="" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036041 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036044 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036047 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036052 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036055 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036058 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 22:04:56.038853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036061 2576 flags.go:64] FLAG: --logging-format="text" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036064 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036067 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036070 2576 flags.go:64] FLAG: --manifest-url="" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036073 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036078 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036082 2576 flags.go:64] FLAG: 
--max-open-files="1000000" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036086 2576 flags.go:64] FLAG: --max-pods="110" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036089 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036092 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036095 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036098 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036101 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036104 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036107 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036114 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036117 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036120 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036124 2576 flags.go:64] FLAG: --pod-cidr="" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036127 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: 
I0416 22:04:56.036133 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036136 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036139 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036141 2576 flags.go:64] FLAG: --port="10250" Apr 16 22:04:56.039437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036145 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036148 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f3e59d2fc70b5b59" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036151 2576 flags.go:64] FLAG: --qos-reserved="" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036154 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036157 2576 flags.go:64] FLAG: --register-node="true" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036160 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036163 2576 flags.go:64] FLAG: --register-with-taints="" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036166 2576 flags.go:64] FLAG: --registry-burst="10" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036169 2576 flags.go:64] FLAG: --registry-qps="5" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036172 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036174 2576 flags.go:64] FLAG: --reserved-memory="" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036178 2576 flags.go:64] FLAG: 
--resolv-conf="/etc/resolv.conf" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036181 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036184 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036188 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036190 2576 flags.go:64] FLAG: --runonce="false" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036193 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036196 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036199 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036202 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036205 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036208 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036211 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036214 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036217 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 22:04:56.040053 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036219 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 22:04:56.040053 ip-10-0-130-26 
kubenswrapper[2576]: I0416 22:04:56.036222 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036225 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036229 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036232 2576 flags.go:64] FLAG: --system-cgroups="" Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036234 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036239 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036242 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036245 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036249 2576 flags.go:64] FLAG: --tls-min-version="" Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036252 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036254 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036257 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036260 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036263 2576 flags.go:64] FLAG: --v="2" Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036267 2576 flags.go:64] FLAG: --version="false" Apr 16 22:04:56.040696 
ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036272 2576 flags.go:64] FLAG: --vmodule="" Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036276 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036279 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036376 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036380 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036383 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036386 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036389 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:04:56.040696 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036392 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036395 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036398 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036400 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036403 2576 feature_gate.go:328] unrecognized 
feature gate: AzureClusterHostedDNSInstall Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036406 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036409 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036411 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036414 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036416 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036419 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036422 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036425 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036427 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036430 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036433 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036435 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036438 2576 
feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036440 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036443 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:04:56.041245 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036445 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036448 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036451 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036453 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036463 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036467 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036469 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036472 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036475 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036477 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 
22:04:56.036480 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036482 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036485 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036487 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036490 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036492 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036495 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036497 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036500 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036502 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:04:56.041795 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036505 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:04:56.042295 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036508 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:04:56.042295 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036510 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:04:56.042295 
ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036513 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:04:56.042295 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036516 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:04:56.042295 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036518 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:04:56.042295 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036521 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:04:56.042295 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036523 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:04:56.042295 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036526 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:04:56.042295 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036529 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:04:56.042295 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036531 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:04:56.042295 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036534 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:04:56.042295 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036536 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:04:56.042295 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036539 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:04:56.042295 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036541 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:04:56.042295 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036545 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Apr 16 22:04:56.042295 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036549 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:04:56.042295 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036552 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:04:56.042295 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036554 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:04:56.042295 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036557 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:04:56.042295 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036559 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:04:56.042816 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036562 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:04:56.042816 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036564 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:04:56.042816 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036567 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:04:56.042816 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036569 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:04:56.042816 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036572 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:04:56.042816 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036576 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 22:04:56.042816 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036579 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:04:56.042816 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036584 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:04:56.042816 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036588 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:04:56.042816 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036591 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:04:56.042816 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036594 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:04:56.042816 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036597 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:04:56.042816 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036600 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:04:56.042816 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036603 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:04:56.042816 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036605 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:04:56.042816 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036608 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:04:56.042816 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036611 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:04:56.042816 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036614 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:04:56.042816 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036617 2576 feature_gate.go:328] 
unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:04:56.043284 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.036619 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:04:56.043284 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.036629 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:04:56.044600 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.044581 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 22:04:56.044643 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.044601 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 22:04:56.044674 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044665 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:04:56.044674 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044671 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:04:56.044674 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044674 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:04:56.044749 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044678 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:04:56.044749 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044681 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:04:56.044749 ip-10-0-130-26 kubenswrapper[2576]: 
W0416 22:04:56.044684 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:04:56.044749 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044688 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:04:56.044749 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044690 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:04:56.044749 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044693 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:04:56.044749 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044695 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:04:56.044749 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044698 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:04:56.044749 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044701 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:04:56.044749 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044703 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:04:56.044749 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044706 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:04:56.044749 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044709 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:04:56.044749 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044711 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:04:56.044749 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044714 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:04:56.044749 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044717 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 
16 22:04:56.044749 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044721 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 22:04:56.044749 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044725 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:04:56.044749 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044728 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:04:56.044749 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044731 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044734 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044737 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044739 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044742 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044745 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044748 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044750 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044753 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044756 2576 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044760 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044763 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044766 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044768 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044770 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044773 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044775 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044778 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044780 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044783 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:04:56.045206 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044785 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:04:56.045709 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044788 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:04:56.045709 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044791 
2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:04:56.045709 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044793 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:04:56.045709 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044796 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:04:56.045709 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044798 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:04:56.045709 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044801 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:04:56.045709 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044803 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:04:56.045709 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044806 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:04:56.045709 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044808 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:04:56.045709 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044811 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 22:04:56.045709 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044813 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:04:56.045709 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044815 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:04:56.045709 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044818 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 22:04:56.045709 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044821 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:04:56.045709 
ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044824 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:04:56.045709 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044826 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 22:04:56.045709 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044829 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:04:56.045709 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044832 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:04:56.045709 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044834 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:04:56.045709 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044837 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044840 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044842 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044851 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044854 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044857 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044859 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044862 2576 feature_gate.go:328] unrecognized 
feature gate: EtcdBackendQuota Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044865 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044867 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044870 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044872 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044875 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044878 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044880 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044884 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044888 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044891 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044894 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044897 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:04:56.046188 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044899 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 22:04:56.046693 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044902 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:04:56.046693 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044904 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:04:56.046693 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044907 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:04:56.046693 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.044910 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:04:56.046693 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.044915 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} 
Apr 16 22:04:56.046693 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045029 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 22:04:56.046693 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045035 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 22:04:56.046693 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045038 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 22:04:56.046693 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045041 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 22:04:56.046693 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045044 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 22:04:56.046693 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045047 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 22:04:56.046693 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045049 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 22:04:56.046693 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045052 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 22:04:56.046693 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045054 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 22:04:56.046693 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045057 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 22:04:56.046693 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045060 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045063 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045066 2576 feature_gate.go:328] 
unrecognized feature gate: InsightsConfig Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045069 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045071 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045074 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045077 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045079 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045082 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045084 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045087 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045089 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045091 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045094 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045096 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045099 2576 feature_gate.go:328] unrecognized feature gate: 
ClusterVersionOperatorConfiguration Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045101 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045104 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045107 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045109 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 22:04:56.047083 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045111 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045114 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045117 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045120 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045122 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045125 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045128 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045130 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 
22:04:56.045133 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045136 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045138 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045141 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045143 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045146 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045148 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045151 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045153 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045156 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045158 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045161 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 22:04:56.047583 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045164 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts 
Apr 16 22:04:56.048187 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045166 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 22:04:56.048187 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045168 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 22:04:56.048187 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045171 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 22:04:56.048187 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045175 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 22:04:56.048187 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045178 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 22:04:56.048187 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045181 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 22:04:56.048187 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045184 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 22:04:56.048187 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045186 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 22:04:56.048187 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045188 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 22:04:56.048187 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045191 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 22:04:56.048187 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045193 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 22:04:56.048187 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045196 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 22:04:56.048187 ip-10-0-130-26 kubenswrapper[2576]: W0416 
22:04:56.045198 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 22:04:56.048187 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045201 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 22:04:56.048187 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045204 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 22:04:56.048187 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045206 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 22:04:56.048187 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045209 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 22:04:56.048187 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045212 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 22:04:56.048187 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045214 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 22:04:56.048734 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045217 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 22:04:56.048734 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045220 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 22:04:56.048734 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045222 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 22:04:56.048734 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045225 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 22:04:56.048734 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045227 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 22:04:56.048734 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045230 2576 feature_gate.go:328] unrecognized feature gate: 
AWSServiceLBNetworkSecurityGroup Apr 16 22:04:56.048734 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045232 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 22:04:56.048734 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045235 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 22:04:56.048734 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045237 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 22:04:56.048734 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045240 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 22:04:56.048734 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045244 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 22:04:56.048734 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045247 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 22:04:56.048734 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045250 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 22:04:56.048734 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045253 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 22:04:56.048734 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045256 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 22:04:56.048734 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:56.045259 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 22:04:56.049137 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.045263 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true 
RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 22:04:56.049137 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.046048 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 22:04:56.049137 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.048934 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 22:04:56.049805 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.049790 2576 server.go:1019] "Starting client certificate rotation" Apr 16 22:04:56.049914 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.049897 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 22:04:56.049954 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.049945 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 22:04:56.070974 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.070957 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 22:04:56.076385 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.076368 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 22:04:56.093080 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.093062 2576 log.go:25] "Validated CRI v1 runtime API" Apr 16 22:04:56.098637 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.098621 2576 log.go:25] "Validated CRI v1 image API" Apr 16 22:04:56.099329 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.099297 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" 
type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 22:04:56.100019 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.099999 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 22:04:56.104957 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.104938 2576 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 e3e0c37a-2bea-419c-818c-33fb6ac6360b:/dev/nvme0n1p3 f8a5b1a0-a15f-4d46-8043-577598c1e596:/dev/nvme0n1p4] Apr 16 22:04:56.105032 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.104956 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 22:04:56.110248 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.110146 2576 manager.go:217] Machine: {Timestamp:2026-04-16 22:04:56.108497008 +0000 UTC m=+0.369223124 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100258 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2bd445cc76cd76b73cee70853c78cb SystemUUID:ec2bd445-cc76-cd76-b73c-ee70853c78cb BootID:745102a4-a922-4851-8031-014feb16db99 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs 
Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:da:b0:4b:15:51 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:da:b0:4b:15:51 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ce:8d:ec:b8:2f:3a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 22:04:56.110248 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.110243 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 22:04:56.110367 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.110325 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 22:04:56.111298 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.111276 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 22:04:56.111446 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.111300 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-26.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 22:04:56.111491 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.111453 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 22:04:56.111491 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.111461 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 22:04:56.111491 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.111474 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 22:04:56.112331 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.112321 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 22:04:56.113995 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.113985 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 22:04:56.114101 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.114092 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 22:04:56.116236 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.116227 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 22:04:56.116840 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.116830 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 22:04:56.116872 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.116853 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 22:04:56.116872 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.116863 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 16 22:04:56.116872 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.116872 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 22:04:56.117773 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.117763 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 22:04:56.117810 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.117780 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 22:04:56.118742 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.118721 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pjrtb"
Apr 16 22:04:56.120734 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.120717 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 22:04:56.122844 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.122831 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 22:04:56.124132 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.124118 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 22:04:56.124171 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.124143 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 22:04:56.124171 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.124152 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 22:04:56.124171 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.124161 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 22:04:56.124171 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.124169 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 22:04:56.124273 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.124180 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 22:04:56.124273 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.124204 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 22:04:56.124273 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.124213 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 22:04:56.124273 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.124223 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 22:04:56.124273 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.124233 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 22:04:56.124273 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.124266 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 22:04:56.124466 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.124279 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 22:04:56.124972 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.124960 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 22:04:56.125001 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.124974 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 22:04:56.125544 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.125531 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-pjrtb"
Apr 16 22:04:56.128657 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.128640 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 22:04:56.128730 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.128676 2576 server.go:1295] "Started kubelet"
Apr 16 22:04:56.128782 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.128737 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 22:04:56.128852 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.128808 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 22:04:56.128892 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.128881 2576 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 22:04:56.129504 ip-10-0-130-26 systemd[1]: Started Kubernetes Kubelet.
Apr 16 22:04:56.133244 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.133225 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 22:04:56.136905 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.136885 2576 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 22:04:56.136905 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.136901 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:04:56.138176 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.138156 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:04:56.139521 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.139506 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-26.ec2.internal" not found
Apr 16 22:04:56.140964 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:56.140942 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 22:04:56.141066 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.140990 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 22:04:56.141529 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.141518 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 22:04:56.142090 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.142072 2576 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 22:04:56.142181 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.142095 2576 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 22:04:56.142181 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.142072 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 22:04:56.142289 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.142186 2576 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 22:04:56.142289 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.142195 2576 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 22:04:56.142289 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.142242 2576 factory.go:55] Registering systemd factory
Apr 16 22:04:56.142289 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.142259 2576 factory.go:223] Registration of the systemd container factory successfully
Apr 16 22:04:56.142469 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:56.142377 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-26.ec2.internal\" not found"
Apr 16 22:04:56.142526 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.142494 2576 factory.go:153] Registering CRI-O factory
Apr 16 22:04:56.142526 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.142507 2576 factory.go:223] Registration of the crio container factory successfully
Apr 16 22:04:56.142625 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.142557 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 22:04:56.142625 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.142596 2576 factory.go:103] Registering Raw factory
Apr 16 22:04:56.142625 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.142610 2576 manager.go:1196] Started watching for new ooms in manager
Apr 16 22:04:56.143027 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.143014 2576 manager.go:319] Starting recovery of all containers
Apr 16 22:04:56.143580 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.143559 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:04:56.146214 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:56.146194 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-26.ec2.internal\" not found" node="ip-10-0-130-26.ec2.internal"
Apr 16 22:04:56.152509 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.152484 2576 manager.go:324] Recovery completed
Apr 16 22:04:56.155289 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.155273 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-26.ec2.internal" not found
Apr 16 22:04:56.156163 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.156152 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:04:56.157823 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.157808 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-26.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:04:56.157883 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.157835 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-26.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:04:56.157883 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.157847 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-26.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:04:56.158230 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.158214 2576 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 22:04:56.158230 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.158228 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 22:04:56.158345 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.158247 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 22:04:56.160971 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.160960 2576 policy_none.go:49] "None policy: Start"
Apr 16 22:04:56.161022 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.160975 2576 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 22:04:56.161022 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.160984 2576 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 22:04:56.210911 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.210898 2576 manager.go:341] "Starting Device Plugin manager"
Apr 16 22:04:56.211769 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:56.210972 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 22:04:56.211769 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.210986 2576 server.go:85] "Starting device plugin registration server"
Apr 16 22:04:56.211769 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.211209 2576 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 22:04:56.211769 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.211219 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 22:04:56.211769 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.211327 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 22:04:56.211769 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.211407 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 22:04:56.211769 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.211416 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 22:04:56.212029 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:56.211840 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 22:04:56.212029 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:56.211881 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-26.ec2.internal\" not found"
Apr 16 22:04:56.212445 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.212431 2576 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-26.ec2.internal" not found
Apr 16 22:04:56.275840 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.275804 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 22:04:56.277062 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.277011 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 22:04:56.277062 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.277036 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 22:04:56.277062 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.277057 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 22:04:56.277199 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.277065 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 22:04:56.277199 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:56.277102 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 22:04:56.278882 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.278865 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 22:04:56.311734 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.311714 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 22:04:56.312761 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.312746 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-26.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 22:04:56.312843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.312775 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-26.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 22:04:56.312843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.312785 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-26.ec2.internal" event="NodeHasSufficientPID"
Apr 16 22:04:56.312843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.312809 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-26.ec2.internal"
Apr 16 22:04:56.321195 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.321176 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-26.ec2.internal"
Apr 16 22:04:56.377639 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.377596 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-26.ec2.internal"]
Apr 16 22:04:56.379810 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.379794 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal"
Apr 16 22:04:56.379810 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.379805 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-26.ec2.internal"
Apr 16 22:04:56.410826 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.410807 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal"
Apr 16 22:04:56.414113 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.414099 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-26.ec2.internal"
Apr 16 22:04:56.422122 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.422108 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 22:04:56.422198 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.422109 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 22:04:56.443485 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.443464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ceb21ad86624033bad3b80db33531fac-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal\" (UID: \"ceb21ad86624033bad3b80db33531fac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal"
Apr 16 22:04:56.443565 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.443489 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ceb21ad86624033bad3b80db33531fac-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal\" (UID: \"ceb21ad86624033bad3b80db33531fac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal"
Apr 16 22:04:56.443565 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.443508 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/be5bbbaa384543c3c03d64000cc8b573-config\") pod \"kube-apiserver-proxy-ip-10-0-130-26.ec2.internal\" (UID: \"be5bbbaa384543c3c03d64000cc8b573\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-26.ec2.internal"
Apr 16 22:04:56.544402 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.544354 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ceb21ad86624033bad3b80db33531fac-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal\" (UID: \"ceb21ad86624033bad3b80db33531fac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal"
Apr 16 22:04:56.544402 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.544377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ceb21ad86624033bad3b80db33531fac-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal\" (UID: \"ceb21ad86624033bad3b80db33531fac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal"
Apr 16 22:04:56.544402 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.544393 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/be5bbbaa384543c3c03d64000cc8b573-config\") pod \"kube-apiserver-proxy-ip-10-0-130-26.ec2.internal\" (UID: \"be5bbbaa384543c3c03d64000cc8b573\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-26.ec2.internal"
Apr 16 22:04:56.544556 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.544433 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/be5bbbaa384543c3c03d64000cc8b573-config\") pod \"kube-apiserver-proxy-ip-10-0-130-26.ec2.internal\" (UID: \"be5bbbaa384543c3c03d64000cc8b573\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-26.ec2.internal"
Apr 16 22:04:56.544556 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.544453 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ceb21ad86624033bad3b80db33531fac-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal\" (UID: \"ceb21ad86624033bad3b80db33531fac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal"
Apr 16 22:04:56.544556 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.544469 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ceb21ad86624033bad3b80db33531fac-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal\" (UID: \"ceb21ad86624033bad3b80db33531fac\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal"
Apr 16 22:04:56.724992 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.724961 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal"
Apr 16 22:04:56.726052 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:56.726038 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-26.ec2.internal"
Apr 16 22:04:57.049405 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.049385 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 22:04:57.050115 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.049498 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:04:57.050115 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.049533 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:04:57.050115 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.049542 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 22:04:57.117866 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.117840 2576 apiserver.go:52] "Watching apiserver"
Apr 16 22:04:57.122567 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.122545 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 22:04:57.122935 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.122915 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-130-26.ec2.internal","openshift-dns/node-resolver-7pgrh","openshift-image-registry/node-ca-4nfgz","openshift-multus/network-metrics-daemon-nrljs","openshift-network-diagnostics/network-check-target-jngsx","openshift-ovn-kubernetes/ovnkube-node-fw6md","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h","openshift-cluster-node-tuning-operator/tuned-5fq7v","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal","openshift-multus/multus-additional-cni-plugins-mz6rs","openshift-multus/multus-ltxhs","openshift-network-operator/iptables-alerter-2tscq","kube-system/konnectivity-agent-7qvcm"]
Apr 16 22:04:57.125428 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.125410 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7pgrh"
Apr 16 22:04:57.126465 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.126446 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4nfgz"
Apr 16 22:04:57.127439 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.127419 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-d7w2z\""
Apr 16 22:04:57.127439 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.127418 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 21:59:56 +0000 UTC" deadline="2027-09-16 03:43:02.90882676 +0000 UTC"
Apr 16 22:04:57.127634 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.127440 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 22:04:57.127634 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.127490 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs"
Apr 16 22:04:57.127634 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.127446 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12413h38m5.781383308s"
Apr 16 22:04:57.127844 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:57.127816 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nrljs" podUID="2f7c8d95-90cb-497a-8866-d2c45b825b72"
Apr 16 22:04:57.127899 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.127436 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 22:04:57.128290 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.128269 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jngsx"
Apr 16 22:04:57.128421 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:57.128382 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jngsx" podUID="2d82db40-2daf-4082-bfaa-d74e9b453817"
Apr 16 22:04:57.128496 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.128453 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 22:04:57.128496 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.128508 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 22:04:57.128706 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.128514 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 22:04:57.128706 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.128685 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-bx2nq\""
Apr 16 22:04:57.129501 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.129481 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.130962 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.130946 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h"
Apr 16 22:04:57.131550 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.131531 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 22:04:57.131824 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.131806 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 22:04:57.131885 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.131819 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 22:04:57.131885 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.131875 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kgszc\""
Apr 16 22:04:57.132024 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.131882 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 22:04:57.132024 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.131810 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 22:04:57.132147 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.132131 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 22:04:57.132199 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.132160 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5fq7v"
Apr 16 22:04:57.132641 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.132625 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-bmw9x\""
Apr 16 22:04:57.132641 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.132637 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 22:04:57.132819 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.132807 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 22:04:57.132897 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.132831 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 22:04:57.133484 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.133470 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mz6rs"
Apr 16 22:04:57.133928 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.133911 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:04:57.134011 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.133936 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zt5gz\""
Apr 16 22:04:57.134191 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.134162 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 22:04:57.134750 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.134734 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.135017 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.134996 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-9vld4\""
Apr 16 22:04:57.135205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.135191 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 22:04:57.135272 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.135233 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 22:04:57.135272 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.135245 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 22:04:57.135396 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.135369 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 22:04:57.135450 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.135429 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 22:04:57.135980 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.135962 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2tscq"
Apr 16 22:04:57.136498 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.136482 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 22:04:57.136570 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.136529 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-ldz7x\""
Apr 16 22:04:57.137200 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.137187 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-7qvcm" Apr 16 22:04:57.137756 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.137737 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 22:04:57.137847 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.137832 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:04:57.137897 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.137832 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 22:04:57.137971 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.137951 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ckwwq\"" Apr 16 22:04:57.138892 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.138875 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 22:04:57.139564 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.139546 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zjszl\"" Apr 16 22:04:57.139634 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.139582 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 22:04:57.141070 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.141052 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 22:04:57.143319 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.143294 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 22:04:57.147573 ip-10-0-130-26 kubenswrapper[2576]: I0416 
22:04:57.147555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-cni-binary-copy\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs" Apr 16 22:04:57.147660 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.147580 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0982ea7f-a131-4caf-8792-d0c2c1bf4089-konnectivity-ca\") pod \"konnectivity-agent-7qvcm\" (UID: \"0982ea7f-a131-4caf-8792-d0c2c1bf4089\") " pod="kube-system/konnectivity-agent-7qvcm" Apr 16 22:04:57.147660 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.147595 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmt9c\" (UniqueName: \"kubernetes.io/projected/4591d12d-774b-43ce-a862-67018bf47f0c-kube-api-access-wmt9c\") pod \"node-ca-4nfgz\" (UID: \"4591d12d-774b-43ce-a862-67018bf47f0c\") " pod="openshift-image-registry/node-ca-4nfgz" Apr 16 22:04:57.147660 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.147614 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-run-ovn\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.147660 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.147632 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-sys\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " 
pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.147864 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.147704 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/84ed2aab-4f39-4883-a3d8-e59507d5aae3-host-slash\") pod \"iptables-alerter-2tscq\" (UID: \"84ed2aab-4f39-4883-a3d8-e59507d5aae3\") " pod="openshift-network-operator/iptables-alerter-2tscq" Apr 16 22:04:57.147864 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.147740 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-slash\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.147864 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.147776 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c889066-e69a-44d5-b456-b37d09282234-env-overrides\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.147864 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.147800 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6b04177-21b9-47c3-b9d0-f924bd2178f4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h" Apr 16 22:04:57.147864 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.147823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/f6b04177-21b9-47c3-b9d0-f924bd2178f4-device-dir\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h" Apr 16 22:04:57.147864 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.147846 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-host-run-netns\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.148152 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.147869 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a333a392-a24f-4b6c-85d5-2cc457992bf5-hosts-file\") pod \"node-resolver-7pgrh\" (UID: \"a333a392-a24f-4b6c-85d5-2cc457992bf5\") " pod="openshift-dns/node-resolver-7pgrh" Apr 16 22:04:57.148152 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.147904 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c889066-e69a-44d5-b456-b37d09282234-ovn-node-metrics-cert\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.148152 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.147929 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6b04177-21b9-47c3-b9d0-f924bd2178f4-socket-dir\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h" Apr 16 22:04:57.148152 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.147953 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a333a392-a24f-4b6c-85d5-2cc457992bf5-tmp-dir\") pod \"node-resolver-7pgrh\" (UID: \"a333a392-a24f-4b6c-85d5-2cc457992bf5\") " pod="openshift-dns/node-resolver-7pgrh" Apr 16 22:04:57.148152 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.147976 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-run-netns\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.148152 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148000 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-run-openvswitch\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.148152 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148022 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-node-log\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.148152 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148045 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-etc-modprobe-d\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 
22:04:57.148152 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-etc-sysconfig\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.148152 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148096 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-etc-sysctl-d\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.148152 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-etc-systemd\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.148152 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148144 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4vx7\" (UniqueName: \"kubernetes.io/projected/f6b04177-21b9-47c3-b9d0-f924bd2178f4-kube-api-access-h4vx7\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h" Apr 16 22:04:57.148745 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148170 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbk7g\" (UniqueName: 
\"kubernetes.io/projected/2f7c8d95-90cb-497a-8866-d2c45b825b72-kube-api-access-bbk7g\") pod \"network-metrics-daemon-nrljs\" (UID: \"2f7c8d95-90cb-497a-8866-d2c45b825b72\") " pod="openshift-multus/network-metrics-daemon-nrljs" Apr 16 22:04:57.148745 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148195 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-var-lib-kubelet\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.148745 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148225 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd98x\" (UniqueName: \"kubernetes.io/projected/34de8035-f298-43ae-864d-cfeeb05c7620-kube-api-access-cd98x\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.148745 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148248 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs" Apr 16 22:04:57.148745 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148274 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdtw2\" (UniqueName: \"kubernetes.io/projected/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-kube-api-access-pdtw2\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs" Apr 
16 22:04:57.148745 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148299 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-system-cni-dir\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.148745 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148341 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-os-release\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.148745 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148375 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/84ed2aab-4f39-4883-a3d8-e59507d5aae3-iptables-alerter-script\") pod \"iptables-alerter-2tscq\" (UID: \"84ed2aab-4f39-4883-a3d8-e59507d5aae3\") " pod="openshift-network-operator/iptables-alerter-2tscq" Apr 16 22:04:57.148745 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148406 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-etc-kubernetes\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.148745 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148428 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-etc-sysctl-conf\") pod \"tuned-5fq7v\" (UID: 
\"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.148745 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6b04177-21b9-47c3-b9d0-f924bd2178f4-registration-dir\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h" Apr 16 22:04:57.148745 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148488 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-748wp\" (UniqueName: \"kubernetes.io/projected/0dc08e01-6796-4c69-9ed5-214b13ad71cd-kube-api-access-748wp\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.148745 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148512 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-kubelet\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.148745 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148534 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-cni-bin\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.148745 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148566 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c889066-e69a-44d5-b456-b37d09282234-ovnkube-script-lib\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.148745 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148591 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-etc-openvswitch\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.149502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148613 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-cnibin\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs" Apr 16 22:04:57.149502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148636 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-cnibin\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.149502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148660 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-multus-socket-dir-parent\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.149502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148685 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs\") pod \"network-metrics-daemon-nrljs\" (UID: \"2f7c8d95-90cb-497a-8866-d2c45b825b72\") " pod="openshift-multus/network-metrics-daemon-nrljs" Apr 16 22:04:57.149502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148707 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-run-systemd\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.149502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148730 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-host-run-k8s-cni-cncf-io\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.149502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148755 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-host-var-lib-cni-multus\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.149502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148778 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-host-var-lib-kubelet\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " 
pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.149502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148809 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0982ea7f-a131-4caf-8792-d0c2c1bf4089-agent-certs\") pod \"konnectivity-agent-7qvcm\" (UID: \"0982ea7f-a131-4caf-8792-d0c2c1bf4089\") " pod="kube-system/konnectivity-agent-7qvcm" Apr 16 22:04:57.149502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-systemd-units\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.149502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148874 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn9wf\" (UniqueName: \"kubernetes.io/projected/5c889066-e69a-44d5-b456-b37d09282234-kube-api-access-vn9wf\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.149502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148896 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0dc08e01-6796-4c69-9ed5-214b13ad71cd-multus-daemon-config\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.149502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148919 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-cni-netd\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.149502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148941 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-run\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.149502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-host\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.149502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.148985 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/34de8035-f298-43ae-864d-cfeeb05c7620-etc-tuned\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.149502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149010 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f6b04177-21b9-47c3-b9d0-f924bd2178f4-etc-selinux\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h" Apr 16 22:04:57.150041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149034 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs"
Apr 16 22:04:57.150041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149059 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-host-var-lib-cni-bin\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.150041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149101 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-hostroot\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.150041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149124 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-multus-conf-dir\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.150041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149147 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-var-lib-openvswitch\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.150041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149169 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-run-ovn-kubernetes\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.150041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149202 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fq9x\" (UniqueName: \"kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x\") pod \"network-check-target-jngsx\" (UID: \"2d82db40-2daf-4082-bfaa-d74e9b453817\") " pod="openshift-network-diagnostics/network-check-target-jngsx"
Apr 16 22:04:57.150041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149228 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-multus-cni-dir\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.150041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149252 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0dc08e01-6796-4c69-9ed5-214b13ad71cd-cni-binary-copy\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.150041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149274 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-etc-kubernetes\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.150041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149297 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4591d12d-774b-43ce-a862-67018bf47f0c-host\") pod \"node-ca-4nfgz\" (UID: \"4591d12d-774b-43ce-a862-67018bf47f0c\") " pod="openshift-image-registry/node-ca-4nfgz"
Apr 16 22:04:57.150041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149343 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.150041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149368 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-os-release\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs"
Apr 16 22:04:57.150041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149392 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs"
Apr 16 22:04:57.150041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149416 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq5z8\" (UniqueName: \"kubernetes.io/projected/84ed2aab-4f39-4883-a3d8-e59507d5aae3-kube-api-access-vq5z8\") pod \"iptables-alerter-2tscq\" (UID: \"84ed2aab-4f39-4883-a3d8-e59507d5aae3\") " pod="openshift-network-operator/iptables-alerter-2tscq"
Apr 16 22:04:57.150041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149441 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7b8x\" (UniqueName: \"kubernetes.io/projected/a333a392-a24f-4b6c-85d5-2cc457992bf5-kube-api-access-f7b8x\") pod \"node-resolver-7pgrh\" (UID: \"a333a392-a24f-4b6c-85d5-2cc457992bf5\") " pod="openshift-dns/node-resolver-7pgrh"
Apr 16 22:04:57.150528 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4591d12d-774b-43ce-a862-67018bf47f0c-serviceca\") pod \"node-ca-4nfgz\" (UID: \"4591d12d-774b-43ce-a862-67018bf47f0c\") " pod="openshift-image-registry/node-ca-4nfgz"
Apr 16 22:04:57.150528 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149487 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c889066-e69a-44d5-b456-b37d09282234-ovnkube-config\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.150528 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149508 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34de8035-f298-43ae-864d-cfeeb05c7620-tmp\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v"
Apr 16 22:04:57.150528 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149531 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-lib-modules\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v"
Apr 16 22:04:57.150528 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f6b04177-21b9-47c3-b9d0-f924bd2178f4-sys-fs\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h"
Apr 16 22:04:57.150528 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149578 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-system-cni-dir\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs"
Apr 16 22:04:57.150528 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149602 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-host-run-multus-certs\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.150528 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.149627 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-log-socket\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.155920 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.155895 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 22:04:57.176981 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.176963 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gprk2"
Apr 16 22:04:57.185661 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.185636 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gprk2"
Apr 16 22:04:57.249816 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.249793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f6b04177-21b9-47c3-b9d0-f924bd2178f4-device-dir\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h"
Apr 16 22:04:57.249932 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.249820 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-host-run-netns\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.249932 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.249837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a333a392-a24f-4b6c-85d5-2cc457992bf5-hosts-file\") pod \"node-resolver-7pgrh\" (UID: \"a333a392-a24f-4b6c-85d5-2cc457992bf5\") " pod="openshift-dns/node-resolver-7pgrh"
Apr 16 22:04:57.249932 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.249854 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c889066-e69a-44d5-b456-b37d09282234-ovn-node-metrics-cert\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.249932 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.249878 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6b04177-21b9-47c3-b9d0-f924bd2178f4-socket-dir\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h"
Apr 16 22:04:57.249932 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.249900 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a333a392-a24f-4b6c-85d5-2cc457992bf5-tmp-dir\") pod \"node-resolver-7pgrh\" (UID: \"a333a392-a24f-4b6c-85d5-2cc457992bf5\") " pod="openshift-dns/node-resolver-7pgrh"
Apr 16 22:04:57.249932 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.249906 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-host-run-netns\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.249932 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.249924 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-run-netns\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.249932 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.249922 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f6b04177-21b9-47c3-b9d0-f924bd2178f4-device-dir\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h"
Apr 16 22:04:57.250321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.249936 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a333a392-a24f-4b6c-85d5-2cc457992bf5-hosts-file\") pod \"node-resolver-7pgrh\" (UID: \"a333a392-a24f-4b6c-85d5-2cc457992bf5\") " pod="openshift-dns/node-resolver-7pgrh"
Apr 16 22:04:57.250321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.249967 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-run-openvswitch\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.250321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.249991 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-run-netns\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.250321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.249994 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-node-log\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.250321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250020 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6b04177-21b9-47c3-b9d0-f924bd2178f4-socket-dir\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h"
Apr 16 22:04:57.250321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250030 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-node-log\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.250321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250032 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-etc-modprobe-d\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v"
Apr 16 22:04:57.250321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250065 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-etc-sysconfig\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v"
Apr 16 22:04:57.250321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250080 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-run-openvswitch\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.250321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250090 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-etc-sysctl-d\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v"
Apr 16 22:04:57.250321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250129 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-etc-systemd\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v"
Apr 16 22:04:57.250321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250139 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-etc-modprobe-d\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v"
Apr 16 22:04:57.250321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250158 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4vx7\" (UniqueName: \"kubernetes.io/projected/f6b04177-21b9-47c3-b9d0-f924bd2178f4-kube-api-access-h4vx7\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h"
Apr 16 22:04:57.250321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250169 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-etc-sysconfig\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v"
Apr 16 22:04:57.250321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250184 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbk7g\" (UniqueName: \"kubernetes.io/projected/2f7c8d95-90cb-497a-8866-d2c45b825b72-kube-api-access-bbk7g\") pod \"network-metrics-daemon-nrljs\" (UID: \"2f7c8d95-90cb-497a-8866-d2c45b825b72\") " pod="openshift-multus/network-metrics-daemon-nrljs"
Apr 16 22:04:57.250321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250188 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-etc-systemd\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v"
Apr 16 22:04:57.250321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250195 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-etc-sysctl-d\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v"
Apr 16 22:04:57.251154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-var-lib-kubelet\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v"
Apr 16 22:04:57.251154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250234 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cd98x\" (UniqueName: \"kubernetes.io/projected/34de8035-f298-43ae-864d-cfeeb05c7620-kube-api-access-cd98x\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v"
Apr 16 22:04:57.251154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250247 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 22:04:57.251154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250257 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs"
Apr 16 22:04:57.251154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250285 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdtw2\" (UniqueName: \"kubernetes.io/projected/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-kube-api-access-pdtw2\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs"
Apr 16 22:04:57.251154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250336 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-system-cni-dir\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.251154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250361 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-os-release\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.251154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/84ed2aab-4f39-4883-a3d8-e59507d5aae3-iptables-alerter-script\") pod \"iptables-alerter-2tscq\" (UID: \"84ed2aab-4f39-4883-a3d8-e59507d5aae3\") " pod="openshift-network-operator/iptables-alerter-2tscq"
Apr 16 22:04:57.251154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-etc-kubernetes\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v"
Apr 16 22:04:57.251154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-etc-sysctl-conf\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v"
Apr 16 22:04:57.251154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250459 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6b04177-21b9-47c3-b9d0-f924bd2178f4-registration-dir\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h"
Apr 16 22:04:57.251154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250490 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-748wp\" (UniqueName: \"kubernetes.io/projected/0dc08e01-6796-4c69-9ed5-214b13ad71cd-kube-api-access-748wp\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.251154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250531 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-kubelet\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.251154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250533 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-etc-kubernetes\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v"
Apr 16 22:04:57.251154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250571 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-tuning-conf-dir\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs"
Apr 16 22:04:57.251154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-cni-bin\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.251154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250458 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-var-lib-kubelet\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v"
Apr 16 22:04:57.251154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250592 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-os-release\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.251975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250615 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-cni-bin\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.251975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250616 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c889066-e69a-44d5-b456-b37d09282234-ovnkube-script-lib\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.251975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250657 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6b04177-21b9-47c3-b9d0-f924bd2178f4-registration-dir\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h"
Apr 16 22:04:57.251975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-kubelet\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.251975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250701 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-system-cni-dir\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.251975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-etc-sysctl-conf\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v"
Apr 16 22:04:57.251975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.250912 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a333a392-a24f-4b6c-85d5-2cc457992bf5-tmp-dir\") pod \"node-resolver-7pgrh\" (UID: \"a333a392-a24f-4b6c-85d5-2cc457992bf5\") " pod="openshift-dns/node-resolver-7pgrh"
Apr 16 22:04:57.251975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-etc-openvswitch\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.251975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-cnibin\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs"
Apr 16 22:04:57.251975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251077 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-cnibin\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.251975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251098 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-multus-socket-dir-parent\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.251975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251104 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-etc-openvswitch\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.251975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs\") pod \"network-metrics-daemon-nrljs\" (UID: \"2f7c8d95-90cb-497a-8866-d2c45b825b72\") " pod="openshift-multus/network-metrics-daemon-nrljs"
Apr 16 22:04:57.251975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251168 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-cnibin\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.251975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251172 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/84ed2aab-4f39-4883-a3d8-e59507d5aae3-iptables-alerter-script\") pod \"iptables-alerter-2tscq\" (UID: \"84ed2aab-4f39-4883-a3d8-e59507d5aae3\") " pod="openshift-network-operator/iptables-alerter-2tscq"
Apr 16 22:04:57.251975 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:57.251197 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:04:57.251975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251205 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-run-systemd\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.251975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251236 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-host-run-k8s-cni-cncf-io\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.252856 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:57.251255 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs podName:2f7c8d95-90cb-497a-8866-d2c45b825b72 nodeName:}" failed. No retries permitted until 2026-04-16 22:04:57.75123671 +0000 UTC m=+2.011962813 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs") pod "network-metrics-daemon-nrljs" (UID: "2f7c8d95-90cb-497a-8866-d2c45b825b72") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:04:57.252856 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251271 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-host-run-k8s-cni-cncf-io\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.252856 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-host-var-lib-cni-multus\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.252856 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251323 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-cnibin\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs"
Apr 16 22:04:57.252856 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251271 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-multus-socket-dir-parent\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.252856 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251279 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5c889066-e69a-44d5-b456-b37d09282234-ovnkube-script-lib\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.252856 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251345 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-run-systemd\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.252856 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-host-var-lib-kubelet\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs"
Apr 16 22:04:57.252856 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251393 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0982ea7f-a131-4caf-8792-d0c2c1bf4089-agent-certs\") pod \"konnectivity-agent-7qvcm\" (UID: \"0982ea7f-a131-4caf-8792-d0c2c1bf4089\") " pod="kube-system/konnectivity-agent-7qvcm"
Apr 16 22:04:57.252856 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-systemd-units\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:04:57.252856
ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vn9wf\" (UniqueName: \"kubernetes.io/projected/5c889066-e69a-44d5-b456-b37d09282234-kube-api-access-vn9wf\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.252856 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251471 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-systemd-units\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.252856 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0dc08e01-6796-4c69-9ed5-214b13ad71cd-multus-daemon-config\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.252856 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251512 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-host-var-lib-kubelet\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.252856 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251547 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-host-var-lib-cni-multus\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.252856 ip-10-0-130-26 
kubenswrapper[2576]: I0416 22:04:57.251650 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-cni-netd\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.252856 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251677 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-run\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.253680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251703 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-host\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.253680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/34de8035-f298-43ae-864d-cfeeb05c7620-etc-tuned\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.253680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251749 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-cni-netd\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.253680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251754 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f6b04177-21b9-47c3-b9d0-f924bd2178f4-etc-selinux\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h" Apr 16 22:04:57.253680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251773 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-run\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.253680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251797 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f6b04177-21b9-47c3-b9d0-f924bd2178f4-etc-selinux\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h" Apr 16 22:04:57.253680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251798 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs" Apr 16 22:04:57.253680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251827 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-host\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.253680 ip-10-0-130-26 
kubenswrapper[2576]: I0416 22:04:57.251830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-host-var-lib-cni-bin\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.253680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-hostroot\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.253680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251890 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-multus-conf-dir\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.253680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251895 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-hostroot\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.253680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251865 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-host-var-lib-cni-bin\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.253680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251916 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-var-lib-openvswitch\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.253680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251940 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-multus-conf-dir\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.253680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251942 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-run-ovn-kubernetes\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.253680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251958 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0dc08e01-6796-4c69-9ed5-214b13ad71cd-multus-daemon-config\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.253680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fq9x\" (UniqueName: \"kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x\") pod \"network-check-target-jngsx\" (UID: \"2d82db40-2daf-4082-bfaa-d74e9b453817\") " pod="openshift-network-diagnostics/network-check-target-jngsx" Apr 16 22:04:57.254286 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.251973 
2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-run-ovn-kubernetes\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.254286 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-multus-cni-dir\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.254286 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252047 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0dc08e01-6796-4c69-9ed5-214b13ad71cd-cni-binary-copy\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.254286 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-etc-kubernetes\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.254286 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252079 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4591d12d-774b-43ce-a862-67018bf47f0c-host\") pod \"node-ca-4nfgz\" (UID: \"4591d12d-774b-43ce-a862-67018bf47f0c\") " pod="openshift-image-registry/node-ca-4nfgz" Apr 16 22:04:57.254286 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252107 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-var-lib-openvswitch\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.254286 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.254286 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252187 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-multus-cni-dir\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.254286 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-os-release\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs" Apr 16 22:04:57.254286 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252204 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-etc-kubernetes\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.254286 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252220 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs" Apr 16 22:04:57.254286 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-os-release\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs" Apr 16 22:04:57.254286 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vq5z8\" (UniqueName: \"kubernetes.io/projected/84ed2aab-4f39-4883-a3d8-e59507d5aae3-kube-api-access-vq5z8\") pod \"iptables-alerter-2tscq\" (UID: \"84ed2aab-4f39-4883-a3d8-e59507d5aae3\") " pod="openshift-network-operator/iptables-alerter-2tscq" Apr 16 22:04:57.254286 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252274 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7b8x\" (UniqueName: \"kubernetes.io/projected/a333a392-a24f-4b6c-85d5-2cc457992bf5-kube-api-access-f7b8x\") pod \"node-resolver-7pgrh\" (UID: \"a333a392-a24f-4b6c-85d5-2cc457992bf5\") " pod="openshift-dns/node-resolver-7pgrh" Apr 16 22:04:57.254286 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252280 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4591d12d-774b-43ce-a862-67018bf47f0c-host\") pod \"node-ca-4nfgz\" (UID: \"4591d12d-774b-43ce-a862-67018bf47f0c\") " pod="openshift-image-registry/node-ca-4nfgz" Apr 16 22:04:57.254286 ip-10-0-130-26 kubenswrapper[2576]: 
I0416 22:04:57.252289 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs" Apr 16 22:04:57.254286 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4591d12d-774b-43ce-a862-67018bf47f0c-serviceca\") pod \"node-ca-4nfgz\" (UID: \"4591d12d-774b-43ce-a862-67018bf47f0c\") " pod="openshift-image-registry/node-ca-4nfgz" Apr 16 22:04:57.254843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252360 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c889066-e69a-44d5-b456-b37d09282234-ovnkube-config\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.254843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252383 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34de8035-f298-43ae-864d-cfeeb05c7620-tmp\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.254843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-lib-modules\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.254843 ip-10-0-130-26 
kubenswrapper[2576]: I0416 22:04:57.252431 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f6b04177-21b9-47c3-b9d0-f924bd2178f4-sys-fs\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h" Apr 16 22:04:57.254843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252462 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-system-cni-dir\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs" Apr 16 22:04:57.254843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252488 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-host-run-multus-certs\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.254843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252514 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-log-socket\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.254843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252554 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-log-socket\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 
22:04:57.254843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252639 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-lib-modules\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.254843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252685 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f6b04177-21b9-47c3-b9d0-f924bd2178f4-sys-fs\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h" Apr 16 22:04:57.254843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252724 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-system-cni-dir\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs" Apr 16 22:04:57.254843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252746 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0dc08e01-6796-4c69-9ed5-214b13ad71cd-cni-binary-copy\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.254843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252760 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0dc08e01-6796-4c69-9ed5-214b13ad71cd-host-run-multus-certs\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.254843 
ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252139 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.254843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.252794 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs" Apr 16 22:04:57.254843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-cni-binary-copy\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs" Apr 16 22:04:57.254843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253086 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0982ea7f-a131-4caf-8792-d0c2c1bf4089-konnectivity-ca\") pod \"konnectivity-agent-7qvcm\" (UID: \"0982ea7f-a131-4caf-8792-d0c2c1bf4089\") " pod="kube-system/konnectivity-agent-7qvcm" Apr 16 22:04:57.255338 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253111 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmt9c\" (UniqueName: \"kubernetes.io/projected/4591d12d-774b-43ce-a862-67018bf47f0c-kube-api-access-wmt9c\") pod \"node-ca-4nfgz\" 
(UID: \"4591d12d-774b-43ce-a862-67018bf47f0c\") " pod="openshift-image-registry/node-ca-4nfgz" Apr 16 22:04:57.255338 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253134 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-run-ovn\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.255338 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253156 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-sys\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.255338 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253181 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/84ed2aab-4f39-4883-a3d8-e59507d5aae3-host-slash\") pod \"iptables-alerter-2tscq\" (UID: \"84ed2aab-4f39-4883-a3d8-e59507d5aae3\") " pod="openshift-network-operator/iptables-alerter-2tscq" Apr 16 22:04:57.255338 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253186 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4591d12d-774b-43ce-a862-67018bf47f0c-serviceca\") pod \"node-ca-4nfgz\" (UID: \"4591d12d-774b-43ce-a862-67018bf47f0c\") " pod="openshift-image-registry/node-ca-4nfgz" Apr 16 22:04:57.255338 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253203 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-slash\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.255338 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253248 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-run-ovn\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.255338 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253258 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/34de8035-f298-43ae-864d-cfeeb05c7620-sys\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.255338 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253284 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c889066-e69a-44d5-b456-b37d09282234-env-overrides\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.255338 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253286 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5c889066-e69a-44d5-b456-b37d09282234-ovnkube-config\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.255338 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253319 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/84ed2aab-4f39-4883-a3d8-e59507d5aae3-host-slash\") pod \"iptables-alerter-2tscq\" (UID: \"84ed2aab-4f39-4883-a3d8-e59507d5aae3\") " pod="openshift-network-operator/iptables-alerter-2tscq" Apr 16 
22:04:57.255338 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6b04177-21b9-47c3-b9d0-f924bd2178f4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h" Apr 16 22:04:57.255338 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253367 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5c889066-e69a-44d5-b456-b37d09282234-host-slash\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.255338 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253411 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6b04177-21b9-47c3-b9d0-f924bd2178f4-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h" Apr 16 22:04:57.255338 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253485 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5c889066-e69a-44d5-b456-b37d09282234-ovn-node-metrics-cert\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.255338 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-cni-binary-copy\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " 
pod="openshift-multus/multus-additional-cni-plugins-mz6rs" Apr 16 22:04:57.255338 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253927 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5c889066-e69a-44d5-b456-b37d09282234-env-overrides\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.255338 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.253963 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0982ea7f-a131-4caf-8792-d0c2c1bf4089-konnectivity-ca\") pod \"konnectivity-agent-7qvcm\" (UID: \"0982ea7f-a131-4caf-8792-d0c2c1bf4089\") " pod="kube-system/konnectivity-agent-7qvcm" Apr 16 22:04:57.256020 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.254096 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/34de8035-f298-43ae-864d-cfeeb05c7620-etc-tuned\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.256020 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.254507 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0982ea7f-a131-4caf-8792-d0c2c1bf4089-agent-certs\") pod \"konnectivity-agent-7qvcm\" (UID: \"0982ea7f-a131-4caf-8792-d0c2c1bf4089\") " pod="kube-system/konnectivity-agent-7qvcm" Apr 16 22:04:57.256020 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.254582 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/34de8035-f298-43ae-864d-cfeeb05c7620-tmp\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 
22:04:57.257653 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.257626 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdtw2\" (UniqueName: \"kubernetes.io/projected/87bd69c1-f44b-42b9-ba15-a3ed7fdad078-kube-api-access-pdtw2\") pod \"multus-additional-cni-plugins-mz6rs\" (UID: \"87bd69c1-f44b-42b9-ba15-a3ed7fdad078\") " pod="openshift-multus/multus-additional-cni-plugins-mz6rs" Apr 16 22:04:57.257755 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.257735 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd98x\" (UniqueName: \"kubernetes.io/projected/34de8035-f298-43ae-864d-cfeeb05c7620-kube-api-access-cd98x\") pod \"tuned-5fq7v\" (UID: \"34de8035-f298-43ae-864d-cfeeb05c7620\") " pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.257827 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.257810 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbk7g\" (UniqueName: \"kubernetes.io/projected/2f7c8d95-90cb-497a-8866-d2c45b825b72-kube-api-access-bbk7g\") pod \"network-metrics-daemon-nrljs\" (UID: \"2f7c8d95-90cb-497a-8866-d2c45b825b72\") " pod="openshift-multus/network-metrics-daemon-nrljs" Apr 16 22:04:57.258292 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.258268 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4vx7\" (UniqueName: \"kubernetes.io/projected/f6b04177-21b9-47c3-b9d0-f924bd2178f4-kube-api-access-h4vx7\") pod \"aws-ebs-csi-driver-node-lsg6h\" (UID: \"f6b04177-21b9-47c3-b9d0-f924bd2178f4\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h" Apr 16 22:04:57.264209 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:57.264187 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:04:57.264359 ip-10-0-130-26 kubenswrapper[2576]: 
E0416 22:04:57.264346 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:04:57.264472 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:57.264452 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7fq9x for pod openshift-network-diagnostics/network-check-target-jngsx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:57.264625 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:57.264613 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x podName:2d82db40-2daf-4082-bfaa-d74e9b453817 nodeName:}" failed. No retries permitted until 2026-04-16 22:04:57.764595812 +0000 UTC m=+2.025321912 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-7fq9x" (UniqueName: "kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x") pod "network-check-target-jngsx" (UID: "2d82db40-2daf-4082-bfaa-d74e9b453817") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:57.266472 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.266418 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn9wf\" (UniqueName: \"kubernetes.io/projected/5c889066-e69a-44d5-b456-b37d09282234-kube-api-access-vn9wf\") pod \"ovnkube-node-fw6md\" (UID: \"5c889066-e69a-44d5-b456-b37d09282234\") " pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.266818 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.266793 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7b8x\" 
(UniqueName: \"kubernetes.io/projected/a333a392-a24f-4b6c-85d5-2cc457992bf5-kube-api-access-f7b8x\") pod \"node-resolver-7pgrh\" (UID: \"a333a392-a24f-4b6c-85d5-2cc457992bf5\") " pod="openshift-dns/node-resolver-7pgrh" Apr 16 22:04:57.266920 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.266793 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmt9c\" (UniqueName: \"kubernetes.io/projected/4591d12d-774b-43ce-a862-67018bf47f0c-kube-api-access-wmt9c\") pod \"node-ca-4nfgz\" (UID: \"4591d12d-774b-43ce-a862-67018bf47f0c\") " pod="openshift-image-registry/node-ca-4nfgz" Apr 16 22:04:57.267086 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.267068 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq5z8\" (UniqueName: \"kubernetes.io/projected/84ed2aab-4f39-4883-a3d8-e59507d5aae3-kube-api-access-vq5z8\") pod \"iptables-alerter-2tscq\" (UID: \"84ed2aab-4f39-4883-a3d8-e59507d5aae3\") " pod="openshift-network-operator/iptables-alerter-2tscq" Apr 16 22:04:57.267186 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.267167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-748wp\" (UniqueName: \"kubernetes.io/projected/0dc08e01-6796-4c69-9ed5-214b13ad71cd-kube-api-access-748wp\") pod \"multus-ltxhs\" (UID: \"0dc08e01-6796-4c69-9ed5-214b13ad71cd\") " pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.270402 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:57.270377 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceb21ad86624033bad3b80db33531fac.slice/crio-652beaae2eb64d769e0ae9fff3fe2cad34b2c6d1cd51b17d983204f2a10291da WatchSource:0}: Error finding container 652beaae2eb64d769e0ae9fff3fe2cad34b2c6d1cd51b17d983204f2a10291da: Status 404 returned error can't find the container with id 652beaae2eb64d769e0ae9fff3fe2cad34b2c6d1cd51b17d983204f2a10291da Apr 16 22:04:57.270742 
ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:57.270725 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe5bbbaa384543c3c03d64000cc8b573.slice/crio-6c5fd819db2f2362f066e5b612eb66556cbc095c0e21e39bd7874798cd73dd0a WatchSource:0}: Error finding container 6c5fd819db2f2362f066e5b612eb66556cbc095c0e21e39bd7874798cd73dd0a: Status 404 returned error can't find the container with id 6c5fd819db2f2362f066e5b612eb66556cbc095c0e21e39bd7874798cd73dd0a Apr 16 22:04:57.276298 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.276285 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:04:57.280693 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.280624 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal" event={"ID":"ceb21ad86624033bad3b80db33531fac","Type":"ContainerStarted","Data":"652beaae2eb64d769e0ae9fff3fe2cad34b2c6d1cd51b17d983204f2a10291da"} Apr 16 22:04:57.281973 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.281951 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-26.ec2.internal" event={"ID":"be5bbbaa384543c3c03d64000cc8b573","Type":"ContainerStarted","Data":"6c5fd819db2f2362f066e5b612eb66556cbc095c0e21e39bd7874798cd73dd0a"} Apr 16 22:04:57.282687 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.282674 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-7qvcm" Apr 16 22:04:57.287905 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:57.287883 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0982ea7f_a131_4caf_8792_d0c2c1bf4089.slice/crio-10435495cbbd09adfe72ef2a250df06a8405cef5622e7f9f8f5c42aad523dbb5 WatchSource:0}: Error finding container 10435495cbbd09adfe72ef2a250df06a8405cef5622e7f9f8f5c42aad523dbb5: Status 404 returned error can't find the container with id 10435495cbbd09adfe72ef2a250df06a8405cef5622e7f9f8f5c42aad523dbb5 Apr 16 22:04:57.457327 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.457250 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7pgrh" Apr 16 22:04:57.463342 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:57.463296 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda333a392_a24f_4b6c_85d5_2cc457992bf5.slice/crio-c36735615dc2ea2dcec5ea69f19720f6c891c6a4742d65554ac09d39ee8ba46c WatchSource:0}: Error finding container c36735615dc2ea2dcec5ea69f19720f6c891c6a4742d65554ac09d39ee8ba46c: Status 404 returned error can't find the container with id c36735615dc2ea2dcec5ea69f19720f6c891c6a4742d65554ac09d39ee8ba46c Apr 16 22:04:57.488861 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.488841 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4nfgz" Apr 16 22:04:57.494425 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.494249 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:04:57.494513 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:57.494448 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4591d12d_774b_43ce_a862_67018bf47f0c.slice/crio-2ce7f2a8597a395fc078ea36009de1db7c325e2e0e17ef759f9d065f21b4c000 WatchSource:0}: Error finding container 2ce7f2a8597a395fc078ea36009de1db7c325e2e0e17ef759f9d065f21b4c000: Status 404 returned error can't find the container with id 2ce7f2a8597a395fc078ea36009de1db7c325e2e0e17ef759f9d065f21b4c000 Apr 16 22:04:57.500403 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:57.500383 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c889066_e69a_44d5_b456_b37d09282234.slice/crio-7ad06a6ae153dae3788e85200763d4011f3fd56bb50e34049d821efb8a9ed91d WatchSource:0}: Error finding container 7ad06a6ae153dae3788e85200763d4011f3fd56bb50e34049d821efb8a9ed91d: Status 404 returned error can't find the container with id 7ad06a6ae153dae3788e85200763d4011f3fd56bb50e34049d821efb8a9ed91d Apr 16 22:04:57.503544 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.503530 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h" Apr 16 22:04:57.508812 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:57.508794 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6b04177_21b9_47c3_b9d0_f924bd2178f4.slice/crio-dfb86a7c7598b2df3455c74cb4e981399e417c42aa5dcc04659a4a660ae6e52e WatchSource:0}: Error finding container dfb86a7c7598b2df3455c74cb4e981399e417c42aa5dcc04659a4a660ae6e52e: Status 404 returned error can't find the container with id dfb86a7c7598b2df3455c74cb4e981399e417c42aa5dcc04659a4a660ae6e52e Apr 16 22:04:57.515033 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.515017 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" Apr 16 22:04:57.520715 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:57.520683 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34de8035_f298_43ae_864d_cfeeb05c7620.slice/crio-7cdd1579b283bf42b8579a66c047619ddcb35cdefda656b60f833ca4da6d1da5 WatchSource:0}: Error finding container 7cdd1579b283bf42b8579a66c047619ddcb35cdefda656b60f833ca4da6d1da5: Status 404 returned error can't find the container with id 7cdd1579b283bf42b8579a66c047619ddcb35cdefda656b60f833ca4da6d1da5 Apr 16 22:04:57.528875 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.528858 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-mz6rs" Apr 16 22:04:57.534427 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.534405 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-ltxhs" Apr 16 22:04:57.535216 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:57.535190 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87bd69c1_f44b_42b9_ba15_a3ed7fdad078.slice/crio-dd69a3653b3bd5e6b3d8a6d7f26e2426d2cef1e149403ae8c21497ba4c207093 WatchSource:0}: Error finding container dd69a3653b3bd5e6b3d8a6d7f26e2426d2cef1e149403ae8c21497ba4c207093: Status 404 returned error can't find the container with id dd69a3653b3bd5e6b3d8a6d7f26e2426d2cef1e149403ae8c21497ba4c207093 Apr 16 22:04:57.540134 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:57.540116 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dc08e01_6796_4c69_9ed5_214b13ad71cd.slice/crio-54b3e9d6a5875a5a3924d3ff7bdfb32b556d14f06bd46ac547ace5a4826f80e3 WatchSource:0}: Error finding container 54b3e9d6a5875a5a3924d3ff7bdfb32b556d14f06bd46ac547ace5a4826f80e3: Status 404 returned error can't find the container with id 54b3e9d6a5875a5a3924d3ff7bdfb32b556d14f06bd46ac547ace5a4826f80e3 Apr 16 22:04:57.548643 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.548626 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-2tscq" Apr 16 22:04:57.553846 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:04:57.553829 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84ed2aab_4f39_4883_a3d8_e59507d5aae3.slice/crio-f4181f31641297ff8a85ce7e7624705ecaef95731e94fba9904a7ed60d620f3a WatchSource:0}: Error finding container f4181f31641297ff8a85ce7e7624705ecaef95731e94fba9904a7ed60d620f3a: Status 404 returned error can't find the container with id f4181f31641297ff8a85ce7e7624705ecaef95731e94fba9904a7ed60d620f3a Apr 16 22:04:57.756700 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.756596 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs\") pod \"network-metrics-daemon-nrljs\" (UID: \"2f7c8d95-90cb-497a-8866-d2c45b825b72\") " pod="openshift-multus/network-metrics-daemon-nrljs" Apr 16 22:04:57.756853 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:57.756752 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:04:57.756853 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:57.756808 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs podName:2f7c8d95-90cb-497a-8866-d2c45b825b72 nodeName:}" failed. No retries permitted until 2026-04-16 22:04:58.756791017 +0000 UTC m=+3.017517115 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs") pod "network-metrics-daemon-nrljs" (UID: "2f7c8d95-90cb-497a-8866-d2c45b825b72") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:04:57.857190 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.857155 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fq9x\" (UniqueName: \"kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x\") pod \"network-check-target-jngsx\" (UID: \"2d82db40-2daf-4082-bfaa-d74e9b453817\") " pod="openshift-network-diagnostics/network-check-target-jngsx" Apr 16 22:04:57.857415 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:57.857320 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:04:57.857415 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:57.857340 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:04:57.857415 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:57.857353 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7fq9x for pod openshift-network-diagnostics/network-check-target-jngsx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:57.857415 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:57.857415 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x podName:2d82db40-2daf-4082-bfaa-d74e9b453817 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:04:58.857395718 +0000 UTC m=+3.118121804 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fq9x" (UniqueName: "kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x") pod "network-check-target-jngsx" (UID: "2d82db40-2daf-4082-bfaa-d74e9b453817") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:57.981426 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:57.981211 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:04:58.112554 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:58.112280 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:04:58.187464 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:58.187428 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 21:59:57 +0000 UTC" deadline="2028-01-11 22:36:03.840457546 +0000 UTC" Apr 16 22:04:58.187464 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:58.187462 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15240h31m5.652999242s" Apr 16 22:04:58.320131 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:58.320073 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2tscq" event={"ID":"84ed2aab-4f39-4883-a3d8-e59507d5aae3","Type":"ContainerStarted","Data":"f4181f31641297ff8a85ce7e7624705ecaef95731e94fba9904a7ed60d620f3a"} Apr 16 22:04:58.325456 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:58.324492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ltxhs" 
event={"ID":"0dc08e01-6796-4c69-9ed5-214b13ad71cd","Type":"ContainerStarted","Data":"54b3e9d6a5875a5a3924d3ff7bdfb32b556d14f06bd46ac547ace5a4826f80e3"} Apr 16 22:04:58.337989 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:58.337954 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" event={"ID":"34de8035-f298-43ae-864d-cfeeb05c7620","Type":"ContainerStarted","Data":"7cdd1579b283bf42b8579a66c047619ddcb35cdefda656b60f833ca4da6d1da5"} Apr 16 22:04:58.349491 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:58.349459 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" event={"ID":"5c889066-e69a-44d5-b456-b37d09282234","Type":"ContainerStarted","Data":"7ad06a6ae153dae3788e85200763d4011f3fd56bb50e34049d821efb8a9ed91d"} Apr 16 22:04:58.371579 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:58.371518 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7qvcm" event={"ID":"0982ea7f-a131-4caf-8792-d0c2c1bf4089","Type":"ContainerStarted","Data":"10435495cbbd09adfe72ef2a250df06a8405cef5622e7f9f8f5c42aad523dbb5"} Apr 16 22:04:58.386200 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:58.386162 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mz6rs" event={"ID":"87bd69c1-f44b-42b9-ba15-a3ed7fdad078","Type":"ContainerStarted","Data":"dd69a3653b3bd5e6b3d8a6d7f26e2426d2cef1e149403ae8c21497ba4c207093"} Apr 16 22:04:58.396708 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:58.396681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h" event={"ID":"f6b04177-21b9-47c3-b9d0-f924bd2178f4","Type":"ContainerStarted","Data":"dfb86a7c7598b2df3455c74cb4e981399e417c42aa5dcc04659a4a660ae6e52e"} Apr 16 22:04:58.401889 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:58.401865 2576 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-image-registry/node-ca-4nfgz" event={"ID":"4591d12d-774b-43ce-a862-67018bf47f0c","Type":"ContainerStarted","Data":"2ce7f2a8597a395fc078ea36009de1db7c325e2e0e17ef759f9d065f21b4c000"} Apr 16 22:04:58.416788 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:58.416758 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7pgrh" event={"ID":"a333a392-a24f-4b6c-85d5-2cc457992bf5","Type":"ContainerStarted","Data":"c36735615dc2ea2dcec5ea69f19720f6c891c6a4742d65554ac09d39ee8ba46c"} Apr 16 22:04:58.457993 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:58.457936 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 22:04:58.763920 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:58.763814 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs\") pod \"network-metrics-daemon-nrljs\" (UID: \"2f7c8d95-90cb-497a-8866-d2c45b825b72\") " pod="openshift-multus/network-metrics-daemon-nrljs" Apr 16 22:04:58.764081 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:58.764022 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:04:58.764149 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:58.764086 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs podName:2f7c8d95-90cb-497a-8866-d2c45b825b72 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:00.764066555 +0000 UTC m=+5.024792652 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs") pod "network-metrics-daemon-nrljs" (UID: "2f7c8d95-90cb-497a-8866-d2c45b825b72") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:04:58.864671 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:58.864635 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fq9x\" (UniqueName: \"kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x\") pod \"network-check-target-jngsx\" (UID: \"2d82db40-2daf-4082-bfaa-d74e9b453817\") " pod="openshift-network-diagnostics/network-check-target-jngsx" Apr 16 22:04:58.864892 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:58.864873 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:04:58.864980 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:58.864898 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:04:58.864980 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:58.864912 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7fq9x for pod openshift-network-diagnostics/network-check-target-jngsx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:58.864980 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:58.864968 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x podName:2d82db40-2daf-4082-bfaa-d74e9b453817 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:05:00.864950035 +0000 UTC m=+5.125676138 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fq9x" (UniqueName: "kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x") pod "network-check-target-jngsx" (UID: "2d82db40-2daf-4082-bfaa-d74e9b453817") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:04:59.188676 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:59.188585 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 21:59:57 +0000 UTC" deadline="2027-10-07 09:18:56.707904854 +0000 UTC" Apr 16 22:04:59.188676 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:59.188625 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12923h13m57.519283227s" Apr 16 22:04:59.278324 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:59.278283 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs" Apr 16 22:04:59.278487 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:59.278429 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nrljs" podUID="2f7c8d95-90cb-497a-8866-d2c45b825b72" Apr 16 22:04:59.278878 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:04:59.278858 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jngsx" Apr 16 22:04:59.278963 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:04:59.278947 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jngsx" podUID="2d82db40-2daf-4082-bfaa-d74e9b453817" Apr 16 22:05:00.780930 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:00.780880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs\") pod \"network-metrics-daemon-nrljs\" (UID: \"2f7c8d95-90cb-497a-8866-d2c45b825b72\") " pod="openshift-multus/network-metrics-daemon-nrljs" Apr 16 22:05:00.781428 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:00.781054 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:05:00.781428 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:00.781128 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs podName:2f7c8d95-90cb-497a-8866-d2c45b825b72 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:04.781108833 +0000 UTC m=+9.041834927 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs") pod "network-metrics-daemon-nrljs" (UID: "2f7c8d95-90cb-497a-8866-d2c45b825b72") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:05:00.881504 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:00.881466 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fq9x\" (UniqueName: \"kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x\") pod \"network-check-target-jngsx\" (UID: \"2d82db40-2daf-4082-bfaa-d74e9b453817\") " pod="openshift-network-diagnostics/network-check-target-jngsx"
Apr 16 22:05:00.881694 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:00.881673 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:05:00.881763 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:00.881702 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:05:00.881763 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:00.881716 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7fq9x for pod openshift-network-diagnostics/network-check-target-jngsx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:05:00.881855 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:00.881779 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x podName:2d82db40-2daf-4082-bfaa-d74e9b453817 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:04.881757712 +0000 UTC m=+9.142483799 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fq9x" (UniqueName: "kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x") pod "network-check-target-jngsx" (UID: "2d82db40-2daf-4082-bfaa-d74e9b453817") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:05:01.277737 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:01.277657 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jngsx"
Apr 16 22:05:01.277890 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:01.277665 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs"
Apr 16 22:05:01.277890 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:01.277816 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jngsx" podUID="2d82db40-2daf-4082-bfaa-d74e9b453817"
Apr 16 22:05:01.277890 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:01.277878 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nrljs" podUID="2f7c8d95-90cb-497a-8866-d2c45b825b72"
Apr 16 22:05:03.278166 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:03.278131 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs"
Apr 16 22:05:03.278649 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:03.278268 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nrljs" podUID="2f7c8d95-90cb-497a-8866-d2c45b825b72"
Apr 16 22:05:03.278649 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:03.278134 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jngsx"
Apr 16 22:05:03.278649 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:03.278640 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jngsx" podUID="2d82db40-2daf-4082-bfaa-d74e9b453817"
Apr 16 22:05:04.816957 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:04.816872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs\") pod \"network-metrics-daemon-nrljs\" (UID: \"2f7c8d95-90cb-497a-8866-d2c45b825b72\") " pod="openshift-multus/network-metrics-daemon-nrljs"
Apr 16 22:05:04.817430 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:04.817003 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:05:04.817430 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:04.817124 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs podName:2f7c8d95-90cb-497a-8866-d2c45b825b72 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:12.817101565 +0000 UTC m=+17.077827653 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs") pod "network-metrics-daemon-nrljs" (UID: "2f7c8d95-90cb-497a-8866-d2c45b825b72") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:05:04.917648 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:04.917617 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fq9x\" (UniqueName: \"kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x\") pod \"network-check-target-jngsx\" (UID: \"2d82db40-2daf-4082-bfaa-d74e9b453817\") " pod="openshift-network-diagnostics/network-check-target-jngsx"
Apr 16 22:05:04.917811 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:04.917775 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:05:04.917811 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:04.917797 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:05:04.917811 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:04.917812 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7fq9x for pod openshift-network-diagnostics/network-check-target-jngsx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:05:04.917979 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:04.917872 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x podName:2d82db40-2daf-4082-bfaa-d74e9b453817 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:12.917854322 +0000 UTC m=+17.178580410 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fq9x" (UniqueName: "kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x") pod "network-check-target-jngsx" (UID: "2d82db40-2daf-4082-bfaa-d74e9b453817") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:05:05.278363 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:05.278270 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jngsx"
Apr 16 22:05:05.278523 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:05.278282 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs"
Apr 16 22:05:05.278523 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:05.278402 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jngsx" podUID="2d82db40-2daf-4082-bfaa-d74e9b453817"
Apr 16 22:05:05.278523 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:05.278515 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nrljs" podUID="2f7c8d95-90cb-497a-8866-d2c45b825b72"
Apr 16 22:05:07.277647 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:07.277613 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jngsx"
Apr 16 22:05:07.278079 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:07.277752 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jngsx" podUID="2d82db40-2daf-4082-bfaa-d74e9b453817"
Apr 16 22:05:07.278079 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:07.277802 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs"
Apr 16 22:05:07.280252 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:07.278249 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nrljs" podUID="2f7c8d95-90cb-497a-8866-d2c45b825b72"
Apr 16 22:05:09.277879 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:09.277850 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jngsx"
Apr 16 22:05:09.277879 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:09.277879 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs"
Apr 16 22:05:09.278393 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:09.277945 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jngsx" podUID="2d82db40-2daf-4082-bfaa-d74e9b453817"
Apr 16 22:05:09.278393 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:09.278084 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nrljs" podUID="2f7c8d95-90cb-497a-8866-d2c45b825b72"
Apr 16 22:05:11.277528 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:11.277495 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jngsx"
Apr 16 22:05:11.277528 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:11.277495 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs"
Apr 16 22:05:11.278014 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:11.277605 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jngsx" podUID="2d82db40-2daf-4082-bfaa-d74e9b453817"
Apr 16 22:05:11.278014 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:11.277745 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nrljs" podUID="2f7c8d95-90cb-497a-8866-d2c45b825b72"
Apr 16 22:05:12.876436 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:12.876395 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs\") pod \"network-metrics-daemon-nrljs\" (UID: \"2f7c8d95-90cb-497a-8866-d2c45b825b72\") " pod="openshift-multus/network-metrics-daemon-nrljs"
Apr 16 22:05:12.876891 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:12.876530 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:05:12.876891 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:12.876593 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs podName:2f7c8d95-90cb-497a-8866-d2c45b825b72 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:28.876573487 +0000 UTC m=+33.137299573 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs") pod "network-metrics-daemon-nrljs" (UID: "2f7c8d95-90cb-497a-8866-d2c45b825b72") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 22:05:12.977448 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:12.977414 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fq9x\" (UniqueName: \"kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x\") pod \"network-check-target-jngsx\" (UID: \"2d82db40-2daf-4082-bfaa-d74e9b453817\") " pod="openshift-network-diagnostics/network-check-target-jngsx"
Apr 16 22:05:12.977628 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:12.977547 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 22:05:12.977628 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:12.977564 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 22:05:12.977628 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:12.977573 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7fq9x for pod openshift-network-diagnostics/network-check-target-jngsx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:05:12.977628 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:12.977624 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x podName:2d82db40-2daf-4082-bfaa-d74e9b453817 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:28.977607411 +0000 UTC m=+33.238333510 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fq9x" (UniqueName: "kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x") pod "network-check-target-jngsx" (UID: "2d82db40-2daf-4082-bfaa-d74e9b453817") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 22:05:13.277713 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:13.277630 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jngsx"
Apr 16 22:05:13.277864 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:13.277637 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs"
Apr 16 22:05:13.277864 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:13.277751 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jngsx" podUID="2d82db40-2daf-4082-bfaa-d74e9b453817"
Apr 16 22:05:13.277864 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:13.277843 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nrljs" podUID="2f7c8d95-90cb-497a-8866-d2c45b825b72"
Apr 16 22:05:15.278187 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:15.278155 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jngsx"
Apr 16 22:05:15.278650 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:15.278268 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jngsx" podUID="2d82db40-2daf-4082-bfaa-d74e9b453817"
Apr 16 22:05:15.278729 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:15.278704 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs"
Apr 16 22:05:15.278841 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:15.278821 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nrljs" podUID="2f7c8d95-90cb-497a-8866-d2c45b825b72"
Apr 16 22:05:16.449425 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:16.449236 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-26.ec2.internal" event={"ID":"be5bbbaa384543c3c03d64000cc8b573","Type":"ContainerStarted","Data":"8ee9dadeb21e321fb778a1a3fde42886573c60807a6875a7f11922de144372e9"}
Apr 16 22:05:16.451748 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:16.451615 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ltxhs" event={"ID":"0dc08e01-6796-4c69-9ed5-214b13ad71cd","Type":"ContainerStarted","Data":"5616c5fafe84f80a58ee8e604daf490c9096780055c9c03af98d39de3335a497"}
Apr 16 22:05:16.456554 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:16.454419 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" event={"ID":"34de8035-f298-43ae-864d-cfeeb05c7620","Type":"ContainerStarted","Data":"cf0db739598a1c095702b2c23e82c7ec16a9ed6c6a069a361c632da9bee8424a"}
Apr 16 22:05:16.459441 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:16.459201 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" event={"ID":"5c889066-e69a-44d5-b456-b37d09282234","Type":"ContainerStarted","Data":"13fb861d704880348a1d8dacce94d0c6bfd92bb19ae2a5bbb7c640642fc4716c"}
Apr 16 22:05:16.459441 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:16.459231 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" event={"ID":"5c889066-e69a-44d5-b456-b37d09282234","Type":"ContainerStarted","Data":"00eff89046538275849300fcec90692a5229951074877d6910d275b113ee831b"}
Apr 16 22:05:16.459441 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:16.459245 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" event={"ID":"5c889066-e69a-44d5-b456-b37d09282234","Type":"ContainerStarted","Data":"14be27f1dc863c796e771c9075b20fecd187feb682fd1a3d4f462390ef0ade97"}
Apr 16 22:05:16.459441 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:16.459258 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" event={"ID":"5c889066-e69a-44d5-b456-b37d09282234","Type":"ContainerStarted","Data":"1685cdfaa028b786a2495524b4c8fdf4cbc3b5bb83e5200de6a17514a78acb36"}
Apr 16 22:05:16.459441 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:16.459272 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" event={"ID":"5c889066-e69a-44d5-b456-b37d09282234","Type":"ContainerStarted","Data":"0649f3220bb5d918b08a0623cc6d3b06fc8b0999fe1eff9df98bf9e36a2f8915"}
Apr 16 22:05:16.459441 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:16.459283 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" event={"ID":"5c889066-e69a-44d5-b456-b37d09282234","Type":"ContainerStarted","Data":"5e9b155157f8e3a081c55a5f5ab1b94d023d50a5a644ed3989f9bb4a78c1f6f9"}
Apr 16 22:05:16.462522 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:16.462472 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-26.ec2.internal" podStartSLOduration=20.462457763 podStartE2EDuration="20.462457763s" podCreationTimestamp="2026-04-16 22:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:05:16.462266834 +0000 UTC m=+20.722992961" watchObservedRunningTime="2026-04-16 22:05:16.462457763 +0000 UTC m=+20.723183869"
Apr 16 22:05:16.477949 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:16.477436 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-5fq7v" podStartSLOduration=2.6762382049999998 podStartE2EDuration="20.477409633s" podCreationTimestamp="2026-04-16 22:04:56 +0000 UTC" firstStartedPulling="2026-04-16 22:04:57.522000838 +0000 UTC m=+1.782726922" lastFinishedPulling="2026-04-16 22:05:15.323172259 +0000 UTC m=+19.583898350" observedRunningTime="2026-04-16 22:05:16.47702182 +0000 UTC m=+20.737747939" watchObservedRunningTime="2026-04-16 22:05:16.477409633 +0000 UTC m=+20.738135738"
Apr 16 22:05:17.277800 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:17.277773 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jngsx"
Apr 16 22:05:17.277928 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:17.277776 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs"
Apr 16 22:05:17.277928 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:17.277874 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jngsx" podUID="2d82db40-2daf-4082-bfaa-d74e9b453817"
Apr 16 22:05:17.278031 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:17.277974 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nrljs" podUID="2f7c8d95-90cb-497a-8866-d2c45b825b72"
Apr 16 22:05:17.321926 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:17.321905 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 22:05:17.462235 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:17.462150 2576 generic.go:358] "Generic (PLEG): container finished" podID="ceb21ad86624033bad3b80db33531fac" containerID="127e7c9d2bef5261db29a0ac4ed9abbb2bae76361d869f6f85b3b1cc7e7f868a" exitCode=0
Apr 16 22:05:17.462675 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:17.462245 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal" event={"ID":"ceb21ad86624033bad3b80db33531fac","Type":"ContainerDied","Data":"127e7c9d2bef5261db29a0ac4ed9abbb2bae76361d869f6f85b3b1cc7e7f868a"}
Apr 16 22:05:17.463523 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:17.463504 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2tscq" event={"ID":"84ed2aab-4f39-4883-a3d8-e59507d5aae3","Type":"ContainerStarted","Data":"6c2b728cdb9fe424b9e95910db79c8cabd02c57808c331e40a660c1a7588d4aa"}
Apr 16 22:05:17.464817 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:17.464792 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7qvcm" event={"ID":"0982ea7f-a131-4caf-8792-d0c2c1bf4089","Type":"ContainerStarted","Data":"d12375c758f3a1bcbed4236f9003c5a117ea6a916eee7ec8201c3c7fddc8d400"}
Apr 16 22:05:17.466172 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:17.466153 2576 generic.go:358] "Generic (PLEG): container finished" podID="87bd69c1-f44b-42b9-ba15-a3ed7fdad078" containerID="1634ae8e78ee8f902b9de3cc0a3efe7aceb1c8ae350e8827c79e5fc3d0bd0d7c" exitCode=0
Apr 16 22:05:17.466327 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:17.466213 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mz6rs" event={"ID":"87bd69c1-f44b-42b9-ba15-a3ed7fdad078","Type":"ContainerDied","Data":"1634ae8e78ee8f902b9de3cc0a3efe7aceb1c8ae350e8827c79e5fc3d0bd0d7c"}
Apr 16 22:05:17.467760 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:17.467741 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h" event={"ID":"f6b04177-21b9-47c3-b9d0-f924bd2178f4","Type":"ContainerStarted","Data":"d1ecfc2c077453f2a800bf8727d7c0ee7b37df292b7cc355132ed1418dd5e54c"}
Apr 16 22:05:17.467827 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:17.467768 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h" event={"ID":"f6b04177-21b9-47c3-b9d0-f924bd2178f4","Type":"ContainerStarted","Data":"701a311ad8a0982c507f9637c530449ab42384a0aa10c07240ebbd5b212b4bee"}
Apr 16 22:05:17.469107 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:17.469086 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4nfgz" event={"ID":"4591d12d-774b-43ce-a862-67018bf47f0c","Type":"ContainerStarted","Data":"1a6b0e1e6d367b2a2cfa5a7a6d93a78965bc75b0b71f8881b0a1bade2695c76b"}
Apr 16 22:05:17.470202 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:17.470185 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7pgrh" event={"ID":"a333a392-a24f-4b6c-85d5-2cc457992bf5","Type":"ContainerStarted","Data":"7450d99c1c29578eedda53dbf69e27fb782f14ca144e024eccb069c2f35c1a7a"}
Apr 16 22:05:17.474652 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:17.474618 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-ltxhs" podStartSLOduration=3.681763405 podStartE2EDuration="21.474606489s" podCreationTimestamp="2026-04-16 22:04:56 +0000 UTC" firstStartedPulling="2026-04-16 22:04:57.541446139 +0000 UTC m=+1.802172227" lastFinishedPulling="2026-04-16 22:05:15.334289229 +0000 UTC m=+19.595015311" observedRunningTime="2026-04-16 22:05:16.492441631 +0000 UTC m=+20.753167759" watchObservedRunningTime="2026-04-16 22:05:17.474606489 +0000 UTC m=+21.735332639"
Apr 16 22:05:17.486277 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:17.485860 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2tscq" podStartSLOduration=3.719144913 podStartE2EDuration="21.485846439s" podCreationTimestamp="2026-04-16 22:04:56 +0000 UTC" firstStartedPulling="2026-04-16 22:04:57.555571546 +0000 UTC m=+1.816297629" lastFinishedPulling="2026-04-16 22:05:15.322273069 +0000 UTC m=+19.582999155" observedRunningTime="2026-04-16 22:05:17.485405702 +0000 UTC m=+21.746131810" watchObservedRunningTime="2026-04-16 22:05:17.485846439 +0000 UTC m=+21.746572545"
Apr 16 22:05:17.496689 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:17.496644 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7qvcm" podStartSLOduration=3.464848226 podStartE2EDuration="21.496632034s" podCreationTimestamp="2026-04-16 22:04:56 +0000 UTC" firstStartedPulling="2026-04-16 22:04:57.289300305 +0000 UTC m=+1.550026401" lastFinishedPulling="2026-04-16 22:05:15.321084111 +0000 UTC m=+19.581810209" observedRunningTime="2026-04-16 22:05:17.496590535 +0000 UTC m=+21.757316640" watchObservedRunningTime="2026-04-16 22:05:17.496632034 +0000 UTC m=+21.757358139"
Apr 16 22:05:17.509701 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:17.509661 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7pgrh" podStartSLOduration=3.653248531 podStartE2EDuration="21.509649334s" podCreationTimestamp="2026-04-16 22:04:56 +0000 UTC" firstStartedPulling="2026-04-16 22:04:57.46471848 +0000 UTC m=+1.725444562" lastFinishedPulling="2026-04-16 22:05:15.321119265 +0000 UTC m=+19.581845365" observedRunningTime="2026-04-16 22:05:17.508981554 +0000 UTC m=+21.769707670" watchObservedRunningTime="2026-04-16 22:05:17.509649334 +0000 UTC m=+21.770375439"
Apr 16 22:05:17.522140 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:17.522099 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4nfgz" podStartSLOduration=3.696887946 podStartE2EDuration="21.522085935s" podCreationTimestamp="2026-04-16 22:04:56 +0000 UTC" firstStartedPulling="2026-04-16 22:04:57.496081347 +0000 UTC m=+1.756807430" lastFinishedPulling="2026-04-16 22:05:15.32127933 +0000 UTC m=+19.582005419" observedRunningTime="2026-04-16 22:05:17.522041485 +0000 UTC m=+21.782767591" watchObservedRunningTime="2026-04-16 22:05:17.522085935 +0000 UTC m=+21.782812044"
Apr 16 22:05:18.223719 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:18.223379 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T22:05:17.321922031Z","UUID":"5f971806-4fa8-4e69-a313-4a9e3ea75f37","Handler":null,"Name":"","Endpoint":""}
Apr 16 22:05:18.226396 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:18.226000 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 22:05:18.226396 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:18.226033 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 22:05:18.475442 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:18.475399 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" event={"ID":"5c889066-e69a-44d5-b456-b37d09282234","Type":"ContainerStarted","Data":"021954d0587d7cfac6b63de97d527e67f76775d3ef52f0962cda1a6f20e6f59f"}
Apr 16 22:05:18.477576 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:18.477548 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h" event={"ID":"f6b04177-21b9-47c3-b9d0-f924bd2178f4","Type":"ContainerStarted","Data":"3b19c1beaa1637cdc44361483fb9981fe12585dd4cd1861917c5f05c47f50c99"}
Apr 16 22:05:18.479236 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:18.479209 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal" event={"ID":"ceb21ad86624033bad3b80db33531fac","Type":"ContainerStarted","Data":"90a901a2be702fbc95afcd6e31e8f0e5ab25394aeffe3e9ac6e1cf0d5c766fce"}
Apr 16 22:05:18.506580 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:18.506539 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lsg6h" podStartSLOduration=1.865969191 podStartE2EDuration="22.506525525s" podCreationTimestamp="2026-04-16 22:04:56 +0000 UTC" firstStartedPulling="2026-04-16 22:04:57.510300597 +0000 UTC m=+1.771026693" lastFinishedPulling="2026-04-16 22:05:18.15085693 +0000 UTC m=+22.411583027" observedRunningTime="2026-04-16 22:05:18.491525982 +0000 UTC m=+22.752252089" watchObservedRunningTime="2026-04-16 22:05:18.506525525 +0000 UTC m=+22.767251624"
Apr 16 22:05:18.506791 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:18.506770 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-26.ec2.internal" podStartSLOduration=22.506765787 podStartE2EDuration="22.506765787s" podCreationTimestamp="2026-04-16 22:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:05:18.506504179 +0000 UTC m=+22.767230285" watchObservedRunningTime="2026-04-16 22:05:18.506765787 +0000 UTC m=+22.767491891"
Apr 16 22:05:19.277445 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:19.277409 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jngsx"
Apr 16 22:05:19.277613 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:19.277445 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs"
Apr 16 22:05:19.277613 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:19.277535 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jngsx" podUID="2d82db40-2daf-4082-bfaa-d74e9b453817"
Apr 16 22:05:19.277724 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:19.277687 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nrljs" podUID="2f7c8d95-90cb-497a-8866-d2c45b825b72"
Apr 16 22:05:21.277786 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:21.277752 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs"
Apr 16 22:05:21.278630 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:21.277752 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jngsx"
Apr 16 22:05:21.278630 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:21.277885 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nrljs" podUID="2f7c8d95-90cb-497a-8866-d2c45b825b72"
Apr 16 22:05:21.278630 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:21.277921 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jngsx" podUID="2d82db40-2daf-4082-bfaa-d74e9b453817"
Apr 16 22:05:21.488181 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:21.488002 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" event={"ID":"5c889066-e69a-44d5-b456-b37d09282234","Type":"ContainerStarted","Data":"df97e467c53c4153185efc513fdc5f7237a3151ca86f5fb9691c6c0d0d5e0771"}
Apr 16 22:05:21.488329 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:21.488264 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:05:21.488329 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:21.488292 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fw6md"
Apr 16 22:05:21.502263 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:21.502242 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready"
pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:05:21.511160 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:21.511121 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" podStartSLOduration=7.225924173 podStartE2EDuration="25.511109198s" podCreationTimestamp="2026-04-16 22:04:56 +0000 UTC" firstStartedPulling="2026-04-16 22:04:57.50196136 +0000 UTC m=+1.762687446" lastFinishedPulling="2026-04-16 22:05:15.78714637 +0000 UTC m=+20.047872471" observedRunningTime="2026-04-16 22:05:21.509784852 +0000 UTC m=+25.770510957" watchObservedRunningTime="2026-04-16 22:05:21.511109198 +0000 UTC m=+25.771835302" Apr 16 22:05:21.961558 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:21.961528 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7qvcm" Apr 16 22:05:21.962223 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:21.962201 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7qvcm" Apr 16 22:05:22.389987 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:22.389966 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7qvcm" Apr 16 22:05:22.390549 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:22.390531 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7qvcm" Apr 16 22:05:22.491256 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:22.491225 2576 generic.go:358] "Generic (PLEG): container finished" podID="87bd69c1-f44b-42b9-ba15-a3ed7fdad078" containerID="e0f61414c2f75cf540fea1f1e069b8c17dc7058d40f9de05d7fd455eef122123" exitCode=0 Apr 16 22:05:22.491421 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:22.491300 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mz6rs" 
event={"ID":"87bd69c1-f44b-42b9-ba15-a3ed7fdad078","Type":"ContainerDied","Data":"e0f61414c2f75cf540fea1f1e069b8c17dc7058d40f9de05d7fd455eef122123"} Apr 16 22:05:22.492526 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:22.491964 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:05:22.507453 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:22.507430 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:05:23.278361 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:23.278216 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jngsx" Apr 16 22:05:23.278450 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:23.278275 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs" Apr 16 22:05:23.278499 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:23.278475 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jngsx" podUID="2d82db40-2daf-4082-bfaa-d74e9b453817" Apr 16 22:05:23.278535 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:23.278502 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nrljs" podUID="2f7c8d95-90cb-497a-8866-d2c45b825b72" Apr 16 22:05:23.353577 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:23.353552 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jngsx"] Apr 16 22:05:23.356171 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:23.356144 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nrljs"] Apr 16 22:05:23.494886 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:23.494803 2576 generic.go:358] "Generic (PLEG): container finished" podID="87bd69c1-f44b-42b9-ba15-a3ed7fdad078" containerID="8379d31fc2667cd3e8af78b57562cd89c6c0524b8de722d304a5159a92c7ddf7" exitCode=0 Apr 16 22:05:23.494886 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:23.494883 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jngsx" Apr 16 22:05:23.495258 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:23.494911 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mz6rs" event={"ID":"87bd69c1-f44b-42b9-ba15-a3ed7fdad078","Type":"ContainerDied","Data":"8379d31fc2667cd3e8af78b57562cd89c6c0524b8de722d304a5159a92c7ddf7"} Apr 16 22:05:23.495258 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:23.494977 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jngsx" podUID="2d82db40-2daf-4082-bfaa-d74e9b453817" Apr 16 22:05:23.495560 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:23.495545 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs" Apr 16 22:05:23.495641 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:23.495625 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nrljs" podUID="2f7c8d95-90cb-497a-8866-d2c45b825b72" Apr 16 22:05:24.498831 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:24.498796 2576 generic.go:358] "Generic (PLEG): container finished" podID="87bd69c1-f44b-42b9-ba15-a3ed7fdad078" containerID="f8e66f488af7a90ccadbe0ad68a95e0b638d8e7f81bc916524c6b4eaff4633be" exitCode=0 Apr 16 22:05:24.499183 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:24.498887 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mz6rs" event={"ID":"87bd69c1-f44b-42b9-ba15-a3ed7fdad078","Type":"ContainerDied","Data":"f8e66f488af7a90ccadbe0ad68a95e0b638d8e7f81bc916524c6b4eaff4633be"} Apr 16 22:05:25.277649 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:25.277617 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jngsx" Apr 16 22:05:25.277816 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:25.277617 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs" Apr 16 22:05:25.277816 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:25.277734 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jngsx" podUID="2d82db40-2daf-4082-bfaa-d74e9b453817" Apr 16 22:05:25.277816 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:25.277806 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nrljs" podUID="2f7c8d95-90cb-497a-8866-d2c45b825b72" Apr 16 22:05:27.278086 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:27.278049 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jngsx" Apr 16 22:05:27.278736 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:27.278050 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs" Apr 16 22:05:27.278736 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:27.278184 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jngsx" podUID="2d82db40-2daf-4082-bfaa-d74e9b453817" Apr 16 22:05:27.278736 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:27.278263 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nrljs" podUID="2f7c8d95-90cb-497a-8866-d2c45b825b72" Apr 16 22:05:28.573410 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.573381 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-26.ec2.internal" event="NodeReady" Apr 16 22:05:28.573928 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.573512 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 22:05:28.613824 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.613763 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-xdjs7"] Apr 16 22:05:28.637149 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.637117 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6dv5d"] Apr 16 22:05:28.637331 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.637279 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xdjs7" Apr 16 22:05:28.639668 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.639362 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 22:05:28.639668 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.639386 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vts6k\"" Apr 16 22:05:28.639668 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.639438 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 22:05:28.650207 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.650187 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xdjs7"] Apr 16 22:05:28.650207 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.650209 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6dv5d"] Apr 16 
22:05:28.650358 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.650296 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6dv5d" Apr 16 22:05:28.652401 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.652368 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 22:05:28.652401 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.652395 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 22:05:28.652401 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.652400 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-q2drx\"" Apr 16 22:05:28.652635 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.652400 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 22:05:28.789573 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.789531 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/049ab1ab-5b48-4b74-9527-0075f2bb7467-config-volume\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7" Apr 16 22:05:28.789573 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.789577 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7" Apr 16 22:05:28.789805 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.789608 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/049ab1ab-5b48-4b74-9527-0075f2bb7467-tmp-dir\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7" Apr 16 22:05:28.789805 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.789634 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7cwg\" (UniqueName: \"kubernetes.io/projected/049ab1ab-5b48-4b74-9527-0075f2bb7467-kube-api-access-x7cwg\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7" Apr 16 22:05:28.789805 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.789693 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert\") pod \"ingress-canary-6dv5d\" (UID: \"6d61274e-1ceb-496e-8a75-17916b110ed5\") " pod="openshift-ingress-canary/ingress-canary-6dv5d" Apr 16 22:05:28.789805 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.789738 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-648pm\" (UniqueName: \"kubernetes.io/projected/6d61274e-1ceb-496e-8a75-17916b110ed5-kube-api-access-648pm\") pod \"ingress-canary-6dv5d\" (UID: \"6d61274e-1ceb-496e-8a75-17916b110ed5\") " pod="openshift-ingress-canary/ingress-canary-6dv5d" Apr 16 22:05:28.890357 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.890266 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/049ab1ab-5b48-4b74-9527-0075f2bb7467-config-volume\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7" Apr 16 22:05:28.890357 ip-10-0-130-26 kubenswrapper[2576]: I0416 
22:05:28.890317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7" Apr 16 22:05:28.890576 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.890366 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/049ab1ab-5b48-4b74-9527-0075f2bb7467-tmp-dir\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7" Apr 16 22:05:28.890576 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.890390 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7cwg\" (UniqueName: \"kubernetes.io/projected/049ab1ab-5b48-4b74-9527-0075f2bb7467-kube-api-access-x7cwg\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7" Apr 16 22:05:28.890576 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.890406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert\") pod \"ingress-canary-6dv5d\" (UID: \"6d61274e-1ceb-496e-8a75-17916b110ed5\") " pod="openshift-ingress-canary/ingress-canary-6dv5d" Apr 16 22:05:28.890576 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.890438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-648pm\" (UniqueName: \"kubernetes.io/projected/6d61274e-1ceb-496e-8a75-17916b110ed5-kube-api-access-648pm\") pod \"ingress-canary-6dv5d\" (UID: \"6d61274e-1ceb-496e-8a75-17916b110ed5\") " pod="openshift-ingress-canary/ingress-canary-6dv5d" Apr 16 22:05:28.890576 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.890478 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs\") pod \"network-metrics-daemon-nrljs\" (UID: \"2f7c8d95-90cb-497a-8866-d2c45b825b72\") " pod="openshift-multus/network-metrics-daemon-nrljs" Apr 16 22:05:28.890576 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:28.890531 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:05:28.890576 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:28.890555 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:05:28.890900 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:28.890631 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs podName:2f7c8d95-90cb-497a-8866-d2c45b825b72 nodeName:}" failed. No retries permitted until 2026-04-16 22:06:00.890609627 +0000 UTC m=+65.151335729 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs") pod "network-metrics-daemon-nrljs" (UID: "2f7c8d95-90cb-497a-8866-d2c45b825b72") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 22:05:28.890900 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:28.890660 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert podName:6d61274e-1ceb-496e-8a75-17916b110ed5 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:29.390641822 +0000 UTC m=+33.651367909 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert") pod "ingress-canary-6dv5d" (UID: "6d61274e-1ceb-496e-8a75-17916b110ed5") : secret "canary-serving-cert" not found Apr 16 22:05:28.890900 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:28.890719 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:05:28.890900 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:28.890770 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls podName:049ab1ab-5b48-4b74-9527-0075f2bb7467 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:29.39075694 +0000 UTC m=+33.651483037 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls") pod "dns-default-xdjs7" (UID: "049ab1ab-5b48-4b74-9527-0075f2bb7467") : secret "dns-default-metrics-tls" not found Apr 16 22:05:28.890900 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.890785 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/049ab1ab-5b48-4b74-9527-0075f2bb7467-tmp-dir\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7" Apr 16 22:05:28.890900 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.890872 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/049ab1ab-5b48-4b74-9527-0075f2bb7467-config-volume\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7" Apr 16 22:05:28.900359 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.900340 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-648pm\" (UniqueName: \"kubernetes.io/projected/6d61274e-1ceb-496e-8a75-17916b110ed5-kube-api-access-648pm\") pod \"ingress-canary-6dv5d\" (UID: \"6d61274e-1ceb-496e-8a75-17916b110ed5\") " pod="openshift-ingress-canary/ingress-canary-6dv5d" Apr 16 22:05:28.904975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.904950 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7cwg\" (UniqueName: \"kubernetes.io/projected/049ab1ab-5b48-4b74-9527-0075f2bb7467-kube-api-access-x7cwg\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7" Apr 16 22:05:28.991347 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:28.991301 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fq9x\" (UniqueName: \"kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x\") pod \"network-check-target-jngsx\" (UID: \"2d82db40-2daf-4082-bfaa-d74e9b453817\") " pod="openshift-network-diagnostics/network-check-target-jngsx" Apr 16 22:05:28.991508 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:28.991486 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 22:05:28.991573 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:28.991511 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 22:05:28.991573 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:28.991526 2576 projected.go:194] Error preparing data for projected volume kube-api-access-7fq9x for pod openshift-network-diagnostics/network-check-target-jngsx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 
22:05:28.991669 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:28.991592 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x podName:2d82db40-2daf-4082-bfaa-d74e9b453817 nodeName:}" failed. No retries permitted until 2026-04-16 22:06:00.991573734 +0000 UTC m=+65.252299837 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-7fq9x" (UniqueName: "kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x") pod "network-check-target-jngsx" (UID: "2d82db40-2daf-4082-bfaa-d74e9b453817") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 22:05:29.277514 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:29.277439 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs" Apr 16 22:05:29.277701 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:29.277439 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jngsx" Apr 16 22:05:29.280894 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:29.280870 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 22:05:29.281026 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:29.280880 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 22:05:29.281081 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:29.281064 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jm5fk\"" Apr 16 22:05:29.281254 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:29.281155 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vrzhk\"" Apr 16 22:05:29.281394 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:29.281279 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 22:05:29.394394 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:29.394364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7" Apr 16 22:05:29.394575 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:29.394402 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert\") pod \"ingress-canary-6dv5d\" (UID: \"6d61274e-1ceb-496e-8a75-17916b110ed5\") " pod="openshift-ingress-canary/ingress-canary-6dv5d" Apr 16 22:05:29.394575 ip-10-0-130-26 
kubenswrapper[2576]: E0416 22:05:29.394545 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:05:29.394693 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:29.394625 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert podName:6d61274e-1ceb-496e-8a75-17916b110ed5 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:30.394604913 +0000 UTC m=+34.655331000 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert") pod "ingress-canary-6dv5d" (UID: "6d61274e-1ceb-496e-8a75-17916b110ed5") : secret "canary-serving-cert" not found Apr 16 22:05:29.394693 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:29.394545 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:05:29.394693 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:29.394673 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls podName:049ab1ab-5b48-4b74-9527-0075f2bb7467 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:30.39466289 +0000 UTC m=+34.655388973 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls") pod "dns-default-xdjs7" (UID: "049ab1ab-5b48-4b74-9527-0075f2bb7467") : secret "dns-default-metrics-tls" not found Apr 16 22:05:30.404421 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:30.404396 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7" Apr 16 22:05:30.404714 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:30.404429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert\") pod \"ingress-canary-6dv5d\" (UID: \"6d61274e-1ceb-496e-8a75-17916b110ed5\") " pod="openshift-ingress-canary/ingress-canary-6dv5d" Apr 16 22:05:30.404714 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:30.404524 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:05:30.404714 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:30.404525 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:05:30.404714 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:30.404568 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert podName:6d61274e-1ceb-496e-8a75-17916b110ed5 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:32.404553041 +0000 UTC m=+36.665279125 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert") pod "ingress-canary-6dv5d" (UID: "6d61274e-1ceb-496e-8a75-17916b110ed5") : secret "canary-serving-cert" not found Apr 16 22:05:30.404714 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:30.404580 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls podName:049ab1ab-5b48-4b74-9527-0075f2bb7467 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:32.404574635 +0000 UTC m=+36.665300717 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls") pod "dns-default-xdjs7" (UID: "049ab1ab-5b48-4b74-9527-0075f2bb7467") : secret "dns-default-metrics-tls" not found Apr 16 22:05:31.513564 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:31.513387 2576 generic.go:358] "Generic (PLEG): container finished" podID="87bd69c1-f44b-42b9-ba15-a3ed7fdad078" containerID="225594968f2b259cf182657f2601509c0d8be7cd25e25fe55acf118e143bb7ae" exitCode=0 Apr 16 22:05:31.513564 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:31.513456 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mz6rs" event={"ID":"87bd69c1-f44b-42b9-ba15-a3ed7fdad078","Type":"ContainerDied","Data":"225594968f2b259cf182657f2601509c0d8be7cd25e25fe55acf118e143bb7ae"} Apr 16 22:05:32.419493 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:32.419455 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7" Apr 16 22:05:32.419493 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:32.419497 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert\") pod \"ingress-canary-6dv5d\" (UID: \"6d61274e-1ceb-496e-8a75-17916b110ed5\") " pod="openshift-ingress-canary/ingress-canary-6dv5d" Apr 16 22:05:32.419721 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:32.419607 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:05:32.419721 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:32.419649 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:05:32.419721 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:32.419674 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls podName:049ab1ab-5b48-4b74-9527-0075f2bb7467 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:36.419657649 +0000 UTC m=+40.680383733 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls") pod "dns-default-xdjs7" (UID: "049ab1ab-5b48-4b74-9527-0075f2bb7467") : secret "dns-default-metrics-tls" not found Apr 16 22:05:32.419721 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:32.419689 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert podName:6d61274e-1ceb-496e-8a75-17916b110ed5 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:36.419682752 +0000 UTC m=+40.680408835 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert") pod "ingress-canary-6dv5d" (UID: "6d61274e-1ceb-496e-8a75-17916b110ed5") : secret "canary-serving-cert" not found Apr 16 22:05:32.517714 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:32.517682 2576 generic.go:358] "Generic (PLEG): container finished" podID="87bd69c1-f44b-42b9-ba15-a3ed7fdad078" containerID="8f87cae8c296f59447a76238c146f65ee4e5c390fef34ce02c6a741ce1435d49" exitCode=0 Apr 16 22:05:32.518109 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:32.517737 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mz6rs" event={"ID":"87bd69c1-f44b-42b9-ba15-a3ed7fdad078","Type":"ContainerDied","Data":"8f87cae8c296f59447a76238c146f65ee4e5c390fef34ce02c6a741ce1435d49"} Apr 16 22:05:33.522871 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:33.522836 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-mz6rs" event={"ID":"87bd69c1-f44b-42b9-ba15-a3ed7fdad078","Type":"ContainerStarted","Data":"e507dff02dab870423dbe0e422ebe9f3e5cbd3be18acaaf96198c2a856270f7e"} Apr 16 22:05:33.550345 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:33.550281 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-mz6rs" podStartSLOduration=4.705059739 podStartE2EDuration="37.550269889s" podCreationTimestamp="2026-04-16 22:04:56 +0000 UTC" firstStartedPulling="2026-04-16 22:04:57.537627967 +0000 UTC m=+1.798354053" lastFinishedPulling="2026-04-16 22:05:30.382838107 +0000 UTC m=+34.643564203" observedRunningTime="2026-04-16 22:05:33.550136436 +0000 UTC m=+37.810862566" watchObservedRunningTime="2026-04-16 22:05:33.550269889 +0000 UTC m=+37.810995993" Apr 16 22:05:36.446029 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:36.445992 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7" Apr 16 22:05:36.446029 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:36.446032 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert\") pod \"ingress-canary-6dv5d\" (UID: \"6d61274e-1ceb-496e-8a75-17916b110ed5\") " pod="openshift-ingress-canary/ingress-canary-6dv5d" Apr 16 22:05:36.446515 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:36.446124 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:05:36.446515 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:36.446131 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:05:36.446515 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:36.446173 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert podName:6d61274e-1ceb-496e-8a75-17916b110ed5 nodeName:}" failed. No retries permitted until 2026-04-16 22:05:44.446159204 +0000 UTC m=+48.706885287 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert") pod "ingress-canary-6dv5d" (UID: "6d61274e-1ceb-496e-8a75-17916b110ed5") : secret "canary-serving-cert" not found Apr 16 22:05:36.446515 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:36.446185 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls podName:049ab1ab-5b48-4b74-9527-0075f2bb7467 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:05:44.446179659 +0000 UTC m=+48.706905741 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls") pod "dns-default-xdjs7" (UID: "049ab1ab-5b48-4b74-9527-0075f2bb7467") : secret "dns-default-metrics-tls" not found Apr 16 22:05:44.504665 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:44.504629 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert\") pod \"ingress-canary-6dv5d\" (UID: \"6d61274e-1ceb-496e-8a75-17916b110ed5\") " pod="openshift-ingress-canary/ingress-canary-6dv5d" Apr 16 22:05:44.505124 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:44.504704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7" Apr 16 22:05:44.505124 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:44.504788 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:05:44.505124 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:44.504805 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:05:44.505124 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:44.504858 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert podName:6d61274e-1ceb-496e-8a75-17916b110ed5 nodeName:}" failed. No retries permitted until 2026-04-16 22:06:00.504839267 +0000 UTC m=+64.765565351 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert") pod "ingress-canary-6dv5d" (UID: "6d61274e-1ceb-496e-8a75-17916b110ed5") : secret "canary-serving-cert" not found Apr 16 22:05:44.505124 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:05:44.504872 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls podName:049ab1ab-5b48-4b74-9527-0075f2bb7467 nodeName:}" failed. No retries permitted until 2026-04-16 22:06:00.504866544 +0000 UTC m=+64.765592626 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls") pod "dns-default-xdjs7" (UID: "049ab1ab-5b48-4b74-9527-0075f2bb7467") : secret "dns-default-metrics-tls" not found Apr 16 22:05:54.508844 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:05:54.508813 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fw6md" Apr 16 22:06:00.508397 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:06:00.508362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7" Apr 16 22:06:00.508397 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:06:00.508399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert\") pod \"ingress-canary-6dv5d\" (UID: \"6d61274e-1ceb-496e-8a75-17916b110ed5\") " pod="openshift-ingress-canary/ingress-canary-6dv5d" Apr 16 22:06:00.508875 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:06:00.508507 2576 secret.go:189] Couldn't get secret 
openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:06:00.508875 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:06:00.508551 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:06:00.508875 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:06:00.508577 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls podName:049ab1ab-5b48-4b74-9527-0075f2bb7467 nodeName:}" failed. No retries permitted until 2026-04-16 22:06:32.508561639 +0000 UTC m=+96.769287722 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls") pod "dns-default-xdjs7" (UID: "049ab1ab-5b48-4b74-9527-0075f2bb7467") : secret "dns-default-metrics-tls" not found Apr 16 22:06:00.508875 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:06:00.508597 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert podName:6d61274e-1ceb-496e-8a75-17916b110ed5 nodeName:}" failed. No retries permitted until 2026-04-16 22:06:32.508586418 +0000 UTC m=+96.769312501 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert") pod "ingress-canary-6dv5d" (UID: "6d61274e-1ceb-496e-8a75-17916b110ed5") : secret "canary-serving-cert" not found Apr 16 22:06:00.910248 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:06:00.910221 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs\") pod \"network-metrics-daemon-nrljs\" (UID: \"2f7c8d95-90cb-497a-8866-d2c45b825b72\") " pod="openshift-multus/network-metrics-daemon-nrljs" Apr 16 22:06:00.912285 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:06:00.912267 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 22:06:00.920912 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:06:00.920893 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 22:06:00.920966 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:06:00.920947 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs podName:2f7c8d95-90cb-497a-8866-d2c45b825b72 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:04.920932166 +0000 UTC m=+129.181658262 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs") pod "network-metrics-daemon-nrljs" (UID: "2f7c8d95-90cb-497a-8866-d2c45b825b72") : secret "metrics-daemon-secret" not found Apr 16 22:06:01.011198 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:06:01.011172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fq9x\" (UniqueName: \"kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x\") pod \"network-check-target-jngsx\" (UID: \"2d82db40-2daf-4082-bfaa-d74e9b453817\") " pod="openshift-network-diagnostics/network-check-target-jngsx" Apr 16 22:06:01.013569 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:06:01.013552 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 22:06:01.023911 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:06:01.023893 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 22:06:01.045485 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:06:01.045463 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fq9x\" (UniqueName: \"kubernetes.io/projected/2d82db40-2daf-4082-bfaa-d74e9b453817-kube-api-access-7fq9x\") pod \"network-check-target-jngsx\" (UID: \"2d82db40-2daf-4082-bfaa-d74e9b453817\") " pod="openshift-network-diagnostics/network-check-target-jngsx" Apr 16 22:06:01.096897 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:06:01.096871 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-jm5fk\"" Apr 16 22:06:01.105504 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:06:01.105484 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jngsx" Apr 16 22:06:01.287322 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:06:01.287273 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jngsx"] Apr 16 22:06:01.291569 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:06:01.291530 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d82db40_2daf_4082_bfaa_d74e9b453817.slice/crio-58e47720625da7dc5cae203f0352e01f0d52cb4e13e8807c0901fcc1e534e19d WatchSource:0}: Error finding container 58e47720625da7dc5cae203f0352e01f0d52cb4e13e8807c0901fcc1e534e19d: Status 404 returned error can't find the container with id 58e47720625da7dc5cae203f0352e01f0d52cb4e13e8807c0901fcc1e534e19d Apr 16 22:06:01.572883 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:06:01.572807 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jngsx" event={"ID":"2d82db40-2daf-4082-bfaa-d74e9b453817","Type":"ContainerStarted","Data":"58e47720625da7dc5cae203f0352e01f0d52cb4e13e8807c0901fcc1e534e19d"} Apr 16 22:06:04.579616 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:06:04.579581 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jngsx" event={"ID":"2d82db40-2daf-4082-bfaa-d74e9b453817","Type":"ContainerStarted","Data":"7509162611d2825873a75f90ed6fa4e9ee0b84971ab81bc204289634ee70e17d"} Apr 16 22:06:04.579970 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:06:04.579719 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jngsx" Apr 16 22:06:04.594115 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:06:04.594051 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jngsx" 
podStartSLOduration=65.889920124 podStartE2EDuration="1m8.594038858s" podCreationTimestamp="2026-04-16 22:04:56 +0000 UTC" firstStartedPulling="2026-04-16 22:06:01.293256993 +0000 UTC m=+65.553983079" lastFinishedPulling="2026-04-16 22:06:03.99737573 +0000 UTC m=+68.258101813" observedRunningTime="2026-04-16 22:06:04.593483493 +0000 UTC m=+68.854209597" watchObservedRunningTime="2026-04-16 22:06:04.594038858 +0000 UTC m=+68.854764969" Apr 16 22:06:32.516652 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:06:32.516622 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7" Apr 16 22:06:32.516652 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:06:32.516655 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert\") pod \"ingress-canary-6dv5d\" (UID: \"6d61274e-1ceb-496e-8a75-17916b110ed5\") " pod="openshift-ingress-canary/ingress-canary-6dv5d" Apr 16 22:06:32.517147 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:06:32.516733 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 22:06:32.517147 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:06:32.516741 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 22:06:32.517147 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:06:32.516791 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert podName:6d61274e-1ceb-496e-8a75-17916b110ed5 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:36.516776529 +0000 UTC m=+160.777502612 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert") pod "ingress-canary-6dv5d" (UID: "6d61274e-1ceb-496e-8a75-17916b110ed5") : secret "canary-serving-cert" not found Apr 16 22:06:32.517147 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:06:32.516811 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls podName:049ab1ab-5b48-4b74-9527-0075f2bb7467 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:36.516798 +0000 UTC m=+160.777524087 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls") pod "dns-default-xdjs7" (UID: "049ab1ab-5b48-4b74-9527-0075f2bb7467") : secret "dns-default-metrics-tls" not found Apr 16 22:06:35.583984 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:06:35.583951 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jngsx" Apr 16 22:07:01.238719 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.238686 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-r49p7"] Apr 16 22:07:01.241479 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.241455 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.242137 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.242108 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-567b8b8c4d-864ph"] Apr 16 22:07:01.243492 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.243471 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 22:07:01.243711 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.243690 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-kthn2\"" Apr 16 22:07:01.243791 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.243754 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 22:07:01.243791 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.243694 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 22:07:01.243882 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.243812 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 22:07:01.245025 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.245008 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:01.246916 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.246897 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 22:07:01.247048 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.246913 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 22:07:01.247124 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.247103 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-66j7n\"" Apr 16 22:07:01.247124 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.247117 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 22:07:01.247362 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.247341 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 22:07:01.247496 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.247476 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 22:07:01.247729 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.247700 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 22:07:01.248996 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.248710 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 22:07:01.248996 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.248887 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-r49p7"] Apr 16 22:07:01.255941 ip-10-0-130-26 
kubenswrapper[2576]: I0416 22:07:01.255921 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-567b8b8c4d-864ph"] Apr 16 22:07:01.297352 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.297325 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8d4070-576e-4533-a43c-28e13d203ec1-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-r49p7\" (UID: \"9b8d4070-576e-4533-a43c-28e13d203ec1\") " pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.297505 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.297360 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:01.297505 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.297404 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-default-certificate\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:01.297505 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.297422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b8d4070-576e-4533-a43c-28e13d203ec1-tmp\") pod \"insights-operator-585dfdc468-r49p7\" (UID: \"9b8d4070-576e-4533-a43c-28e13d203ec1\") " pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.297505 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.297440 
2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrh7c\" (UniqueName: \"kubernetes.io/projected/51846908-299c-4e1f-b417-c0af1029a45f-kube-api-access-zrh7c\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:01.297632 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.297542 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9b8d4070-576e-4533-a43c-28e13d203ec1-snapshots\") pod \"insights-operator-585dfdc468-r49p7\" (UID: \"9b8d4070-576e-4533-a43c-28e13d203ec1\") " pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.297632 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.297570 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8d4070-576e-4533-a43c-28e13d203ec1-service-ca-bundle\") pod \"insights-operator-585dfdc468-r49p7\" (UID: \"9b8d4070-576e-4533-a43c-28e13d203ec1\") " pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.297632 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.297588 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhxww\" (UniqueName: \"kubernetes.io/projected/9b8d4070-576e-4533-a43c-28e13d203ec1-kube-api-access-bhxww\") pod \"insights-operator-585dfdc468-r49p7\" (UID: \"9b8d4070-576e-4533-a43c-28e13d203ec1\") " pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.297632 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.297616 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-stats-auth\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:01.297748 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.297642 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b8d4070-576e-4533-a43c-28e13d203ec1-serving-cert\") pod \"insights-operator-585dfdc468-r49p7\" (UID: \"9b8d4070-576e-4533-a43c-28e13d203ec1\") " pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.297748 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.297661 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:01.398502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.398469 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b8d4070-576e-4533-a43c-28e13d203ec1-serving-cert\") pod \"insights-operator-585dfdc468-r49p7\" (UID: \"9b8d4070-576e-4533-a43c-28e13d203ec1\") " pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.398661 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.398513 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:01.401933 ip-10-0-130-26 
kubenswrapper[2576]: I0416 22:07:01.398835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8d4070-576e-4533-a43c-28e13d203ec1-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-r49p7\" (UID: \"9b8d4070-576e-4533-a43c-28e13d203ec1\") " pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.401933 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.399074 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:01.401933 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:01.399323 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:07:01.401933 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:01.400147 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle podName:51846908-299c-4e1f-b417-c0af1029a45f nodeName:}" failed. No retries permitted until 2026-04-16 22:07:01.900108353 +0000 UTC m=+126.160834448 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle") pod "router-default-567b8b8c4d-864ph" (UID: "51846908-299c-4e1f-b417-c0af1029a45f") : configmap references non-existent config key: service-ca.crt Apr 16 22:07:01.401933 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:01.400642 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs podName:51846908-299c-4e1f-b417-c0af1029a45f nodeName:}" failed. No retries permitted until 2026-04-16 22:07:01.900618598 +0000 UTC m=+126.161344693 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs") pod "router-default-567b8b8c4d-864ph" (UID: "51846908-299c-4e1f-b417-c0af1029a45f") : secret "router-metrics-certs-default" not found Apr 16 22:07:01.401933 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.400739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-default-certificate\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:01.401933 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.400806 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b8d4070-576e-4533-a43c-28e13d203ec1-tmp\") pod \"insights-operator-585dfdc468-r49p7\" (UID: \"9b8d4070-576e-4533-a43c-28e13d203ec1\") " pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.401933 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.400873 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zrh7c\" (UniqueName: \"kubernetes.io/projected/51846908-299c-4e1f-b417-c0af1029a45f-kube-api-access-zrh7c\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:01.401933 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.401134 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9b8d4070-576e-4533-a43c-28e13d203ec1-snapshots\") pod \"insights-operator-585dfdc468-r49p7\" (UID: \"9b8d4070-576e-4533-a43c-28e13d203ec1\") " pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.401933 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.401281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8d4070-576e-4533-a43c-28e13d203ec1-service-ca-bundle\") pod \"insights-operator-585dfdc468-r49p7\" (UID: \"9b8d4070-576e-4533-a43c-28e13d203ec1\") " pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.401933 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.401350 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhxww\" (UniqueName: \"kubernetes.io/projected/9b8d4070-576e-4533-a43c-28e13d203ec1-kube-api-access-bhxww\") pod \"insights-operator-585dfdc468-r49p7\" (UID: \"9b8d4070-576e-4533-a43c-28e13d203ec1\") " pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.401933 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.401382 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-stats-auth\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:01.401933 
ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.401460 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9b8d4070-576e-4533-a43c-28e13d203ec1-tmp\") pod \"insights-operator-585dfdc468-r49p7\" (UID: \"9b8d4070-576e-4533-a43c-28e13d203ec1\") " pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.402742 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.402018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9b8d4070-576e-4533-a43c-28e13d203ec1-snapshots\") pod \"insights-operator-585dfdc468-r49p7\" (UID: \"9b8d4070-576e-4533-a43c-28e13d203ec1\") " pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.402803 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.402749 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-default-certificate\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:01.403333 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.403051 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8d4070-576e-4533-a43c-28e13d203ec1-service-ca-bundle\") pod \"insights-operator-585dfdc468-r49p7\" (UID: \"9b8d4070-576e-4533-a43c-28e13d203ec1\") " pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.403333 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.403296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b8d4070-576e-4533-a43c-28e13d203ec1-serving-cert\") pod \"insights-operator-585dfdc468-r49p7\" (UID: \"9b8d4070-576e-4533-a43c-28e13d203ec1\") " 
pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.403929 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.403905 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b8d4070-576e-4533-a43c-28e13d203ec1-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-r49p7\" (UID: \"9b8d4070-576e-4533-a43c-28e13d203ec1\") " pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.404213 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.404197 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-stats-auth\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:01.410688 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.410668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhxww\" (UniqueName: \"kubernetes.io/projected/9b8d4070-576e-4533-a43c-28e13d203ec1-kube-api-access-bhxww\") pod \"insights-operator-585dfdc468-r49p7\" (UID: \"9b8d4070-576e-4533-a43c-28e13d203ec1\") " pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.410777 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.410725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrh7c\" (UniqueName: \"kubernetes.io/projected/51846908-299c-4e1f-b417-c0af1029a45f-kube-api-access-zrh7c\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:01.553136 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.553081 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-r49p7" Apr 16 22:07:01.662609 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.662581 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-r49p7"] Apr 16 22:07:01.667105 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:07:01.667078 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b8d4070_576e_4533_a43c_28e13d203ec1.slice/crio-ff94acc8cfd56cb26db1e4a1a0d384a5f78563d29434041e1bf27134b750f752 WatchSource:0}: Error finding container ff94acc8cfd56cb26db1e4a1a0d384a5f78563d29434041e1bf27134b750f752: Status 404 returned error can't find the container with id ff94acc8cfd56cb26db1e4a1a0d384a5f78563d29434041e1bf27134b750f752 Apr 16 22:07:01.682207 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.682183 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-r49p7" event={"ID":"9b8d4070-576e-4533-a43c-28e13d203ec1","Type":"ContainerStarted","Data":"ff94acc8cfd56cb26db1e4a1a0d384a5f78563d29434041e1bf27134b750f752"} Apr 16 22:07:01.905887 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.905860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:01.906019 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:01.905897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " 
pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:01.906019 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:01.905992 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:07:01.906087 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:01.906018 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle podName:51846908-299c-4e1f-b417-c0af1029a45f nodeName:}" failed. No retries permitted until 2026-04-16 22:07:02.906000867 +0000 UTC m=+127.166726950 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle") pod "router-default-567b8b8c4d-864ph" (UID: "51846908-299c-4e1f-b417-c0af1029a45f") : configmap references non-existent config key: service-ca.crt Apr 16 22:07:01.906087 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:01.906041 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs podName:51846908-299c-4e1f-b417-c0af1029a45f nodeName:}" failed. No retries permitted until 2026-04-16 22:07:02.906033703 +0000 UTC m=+127.166759787 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs") pod "router-default-567b8b8c4d-864ph" (UID: "51846908-299c-4e1f-b417-c0af1029a45f") : secret "router-metrics-certs-default" not found Apr 16 22:07:02.913550 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:02.913511 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:02.914020 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:02.913568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:02.914020 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:02.913708 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle podName:51846908-299c-4e1f-b417-c0af1029a45f nodeName:}" failed. No retries permitted until 2026-04-16 22:07:04.913683877 +0000 UTC m=+129.174409990 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle") pod "router-default-567b8b8c4d-864ph" (UID: "51846908-299c-4e1f-b417-c0af1029a45f") : configmap references non-existent config key: service-ca.crt Apr 16 22:07:02.914020 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:02.913720 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:07:02.914020 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:02.913763 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs podName:51846908-299c-4e1f-b417-c0af1029a45f nodeName:}" failed. No retries permitted until 2026-04-16 22:07:04.913749255 +0000 UTC m=+129.174475338 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs") pod "router-default-567b8b8c4d-864ph" (UID: "51846908-299c-4e1f-b417-c0af1029a45f") : secret "router-metrics-certs-default" not found Apr 16 22:07:03.067370 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:03.067337 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jz8x"] Apr 16 22:07:03.071434 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:03.071414 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jz8x" Apr 16 22:07:03.073619 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:03.073596 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-cqjrx\"" Apr 16 22:07:03.073728 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:03.073651 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 22:07:03.074510 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:03.074488 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:07:03.075596 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:03.075577 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jz8x"] Apr 16 22:07:03.114847 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:03.114826 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkc6p\" (UniqueName: \"kubernetes.io/projected/07b94c63-bfb5-4f8c-8a11-7985e312b413-kube-api-access-bkc6p\") pod \"volume-data-source-validator-7c6cbb6c87-5jz8x\" (UID: \"07b94c63-bfb5-4f8c-8a11-7985e312b413\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jz8x" Apr 16 22:07:03.215836 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:03.215755 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bkc6p\" (UniqueName: \"kubernetes.io/projected/07b94c63-bfb5-4f8c-8a11-7985e312b413-kube-api-access-bkc6p\") pod \"volume-data-source-validator-7c6cbb6c87-5jz8x\" (UID: \"07b94c63-bfb5-4f8c-8a11-7985e312b413\") " 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jz8x" Apr 16 22:07:03.224864 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:03.224842 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkc6p\" (UniqueName: \"kubernetes.io/projected/07b94c63-bfb5-4f8c-8a11-7985e312b413-kube-api-access-bkc6p\") pod \"volume-data-source-validator-7c6cbb6c87-5jz8x\" (UID: \"07b94c63-bfb5-4f8c-8a11-7985e312b413\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jz8x" Apr 16 22:07:03.382885 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:03.382838 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jz8x" Apr 16 22:07:03.515426 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:03.515400 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jz8x"] Apr 16 22:07:03.518532 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:07:03.518507 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07b94c63_bfb5_4f8c_8a11_7985e312b413.slice/crio-733d081b4ca00d91df825c29c88292a5b4a2951a2473312a0367361c19141a53 WatchSource:0}: Error finding container 733d081b4ca00d91df825c29c88292a5b4a2951a2473312a0367361c19141a53: Status 404 returned error can't find the container with id 733d081b4ca00d91df825c29c88292a5b4a2951a2473312a0367361c19141a53 Apr 16 22:07:03.687241 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:03.687203 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-r49p7" event={"ID":"9b8d4070-576e-4533-a43c-28e13d203ec1","Type":"ContainerStarted","Data":"4abce5932b7163943a04a934b77b7fe25ad8b635a272f8fbc381a16d50e0e67f"} Apr 16 22:07:03.688193 ip-10-0-130-26 kubenswrapper[2576]: I0416 
22:07:03.688166 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jz8x" event={"ID":"07b94c63-bfb5-4f8c-8a11-7985e312b413","Type":"ContainerStarted","Data":"733d081b4ca00d91df825c29c88292a5b4a2951a2473312a0367361c19141a53"} Apr 16 22:07:03.701284 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:03.701240 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-r49p7" podStartSLOduration=0.918642589 podStartE2EDuration="2.701224966s" podCreationTimestamp="2026-04-16 22:07:01 +0000 UTC" firstStartedPulling="2026-04-16 22:07:01.670078446 +0000 UTC m=+125.930804529" lastFinishedPulling="2026-04-16 22:07:03.452660813 +0000 UTC m=+127.713386906" observedRunningTime="2026-04-16 22:07:03.700321037 +0000 UTC m=+127.961047134" watchObservedRunningTime="2026-04-16 22:07:03.701224966 +0000 UTC m=+127.961951073" Apr 16 22:07:04.073120 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.073087 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fcfwt"] Apr 16 22:07:04.076224 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.076203 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" Apr 16 22:07:04.078391 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.078261 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 16 22:07:04.078651 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.078628 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-2hmvt\"" Apr 16 22:07:04.078738 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.078635 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 16 22:07:04.079063 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.079047 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 16 22:07:04.079131 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.079098 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:07:04.083080 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.083059 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 22:07:04.085814 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.085792 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fcfwt"] Apr 16 22:07:04.123612 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.123586 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad6a094a-d7bf-42af-90ae-94731039404b-config\") pod \"console-operator-9d4b6777b-fcfwt\" (UID: \"ad6a094a-d7bf-42af-90ae-94731039404b\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" Apr 16 22:07:04.123748 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.123624 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad6a094a-d7bf-42af-90ae-94731039404b-trusted-ca\") pod \"console-operator-9d4b6777b-fcfwt\" (UID: \"ad6a094a-d7bf-42af-90ae-94731039404b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" Apr 16 22:07:04.123748 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.123667 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl5zh\" (UniqueName: \"kubernetes.io/projected/ad6a094a-d7bf-42af-90ae-94731039404b-kube-api-access-pl5zh\") pod \"console-operator-9d4b6777b-fcfwt\" (UID: \"ad6a094a-d7bf-42af-90ae-94731039404b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" Apr 16 22:07:04.123748 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.123724 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad6a094a-d7bf-42af-90ae-94731039404b-serving-cert\") pod \"console-operator-9d4b6777b-fcfwt\" (UID: \"ad6a094a-d7bf-42af-90ae-94731039404b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" Apr 16 22:07:04.225024 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.224989 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad6a094a-d7bf-42af-90ae-94731039404b-config\") pod \"console-operator-9d4b6777b-fcfwt\" (UID: \"ad6a094a-d7bf-42af-90ae-94731039404b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" Apr 16 22:07:04.225197 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.225040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad6a094a-d7bf-42af-90ae-94731039404b-trusted-ca\") pod \"console-operator-9d4b6777b-fcfwt\" (UID: \"ad6a094a-d7bf-42af-90ae-94731039404b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" Apr 16 22:07:04.225197 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.225089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pl5zh\" (UniqueName: \"kubernetes.io/projected/ad6a094a-d7bf-42af-90ae-94731039404b-kube-api-access-pl5zh\") pod \"console-operator-9d4b6777b-fcfwt\" (UID: \"ad6a094a-d7bf-42af-90ae-94731039404b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" Apr 16 22:07:04.225197 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.225119 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad6a094a-d7bf-42af-90ae-94731039404b-serving-cert\") pod \"console-operator-9d4b6777b-fcfwt\" (UID: \"ad6a094a-d7bf-42af-90ae-94731039404b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" Apr 16 22:07:04.225804 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.225775 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad6a094a-d7bf-42af-90ae-94731039404b-config\") pod \"console-operator-9d4b6777b-fcfwt\" (UID: \"ad6a094a-d7bf-42af-90ae-94731039404b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" Apr 16 22:07:04.226020 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.225996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ad6a094a-d7bf-42af-90ae-94731039404b-trusted-ca\") pod \"console-operator-9d4b6777b-fcfwt\" (UID: \"ad6a094a-d7bf-42af-90ae-94731039404b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" Apr 16 22:07:04.227813 ip-10-0-130-26 
kubenswrapper[2576]: I0416 22:07:04.227789 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad6a094a-d7bf-42af-90ae-94731039404b-serving-cert\") pod \"console-operator-9d4b6777b-fcfwt\" (UID: \"ad6a094a-d7bf-42af-90ae-94731039404b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" Apr 16 22:07:04.232802 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.232782 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl5zh\" (UniqueName: \"kubernetes.io/projected/ad6a094a-d7bf-42af-90ae-94731039404b-kube-api-access-pl5zh\") pod \"console-operator-9d4b6777b-fcfwt\" (UID: \"ad6a094a-d7bf-42af-90ae-94731039404b\") " pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" Apr 16 22:07:04.388642 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.388608 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" Apr 16 22:07:04.690989 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.690961 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jz8x" event={"ID":"07b94c63-bfb5-4f8c-8a11-7985e312b413","Type":"ContainerStarted","Data":"79a80d25cb24aa3facbb0ae91a248534633bb2081a74162a7299d29c117fc8d6"} Apr 16 22:07:04.696722 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.696694 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fcfwt"] Apr 16 22:07:04.700079 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:07:04.700049 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad6a094a_d7bf_42af_90ae_94731039404b.slice/crio-7854f823a50dd59a8f4a81a7410bac02b40c804910113b31372a623d098143fe WatchSource:0}: Error finding container 
7854f823a50dd59a8f4a81a7410bac02b40c804910113b31372a623d098143fe: Status 404 returned error can't find the container with id 7854f823a50dd59a8f4a81a7410bac02b40c804910113b31372a623d098143fe Apr 16 22:07:04.705680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.705634 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5jz8x" podStartSLOduration=0.601120547 podStartE2EDuration="1.705616361s" podCreationTimestamp="2026-04-16 22:07:03 +0000 UTC" firstStartedPulling="2026-04-16 22:07:03.520349199 +0000 UTC m=+127.781075295" lastFinishedPulling="2026-04-16 22:07:04.624845026 +0000 UTC m=+128.885571109" observedRunningTime="2026-04-16 22:07:04.704769984 +0000 UTC m=+128.965496113" watchObservedRunningTime="2026-04-16 22:07:04.705616361 +0000 UTC m=+128.966342464" Apr 16 22:07:04.930343 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.930254 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs\") pod \"network-metrics-daemon-nrljs\" (UID: \"2f7c8d95-90cb-497a-8866-d2c45b825b72\") " pod="openshift-multus/network-metrics-daemon-nrljs" Apr 16 22:07:04.930485 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.930348 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:04.930485 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:04.930374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs\") pod 
\"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:04.930485 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:04.930404 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 22:07:04.930485 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:04.930453 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:07:04.930485 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:04.930477 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs podName:2f7c8d95-90cb-497a-8866-d2c45b825b72 nodeName:}" failed. No retries permitted until 2026-04-16 22:09:06.930455234 +0000 UTC m=+251.191181333 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs") pod "network-metrics-daemon-nrljs" (UID: "2f7c8d95-90cb-497a-8866-d2c45b825b72") : secret "metrics-daemon-secret" not found Apr 16 22:07:04.930793 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:04.930498 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle podName:51846908-299c-4e1f-b417-c0af1029a45f nodeName:}" failed. No retries permitted until 2026-04-16 22:07:08.930488356 +0000 UTC m=+133.191214453 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle") pod "router-default-567b8b8c4d-864ph" (UID: "51846908-299c-4e1f-b417-c0af1029a45f") : configmap references non-existent config key: service-ca.crt Apr 16 22:07:04.930793 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:04.930512 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs podName:51846908-299c-4e1f-b417-c0af1029a45f nodeName:}" failed. No retries permitted until 2026-04-16 22:07:08.930504529 +0000 UTC m=+133.191230617 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs") pod "router-default-567b8b8c4d-864ph" (UID: "51846908-299c-4e1f-b417-c0af1029a45f") : secret "router-metrics-certs-default" not found Apr 16 22:07:05.693942 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:05.693899 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" event={"ID":"ad6a094a-d7bf-42af-90ae-94731039404b","Type":"ContainerStarted","Data":"7854f823a50dd59a8f4a81a7410bac02b40c804910113b31372a623d098143fe"} Apr 16 22:07:06.696931 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:06.696850 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/0.log" Apr 16 22:07:06.696931 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:06.696897 2576 generic.go:358] "Generic (PLEG): container finished" podID="ad6a094a-d7bf-42af-90ae-94731039404b" containerID="96b245f4b4df0ecbee3a4be103e6a8036b37d5b8c88e329973c76d76f99d041f" exitCode=255 Apr 16 22:07:06.697343 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:06.696961 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" event={"ID":"ad6a094a-d7bf-42af-90ae-94731039404b","Type":"ContainerDied","Data":"96b245f4b4df0ecbee3a4be103e6a8036b37d5b8c88e329973c76d76f99d041f"} Apr 16 22:07:06.697343 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:06.697157 2576 scope.go:117] "RemoveContainer" containerID="96b245f4b4df0ecbee3a4be103e6a8036b37d5b8c88e329973c76d76f99d041f" Apr 16 22:07:07.610790 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:07.610760 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7pgrh_a333a392-a24f-4b6c-85d5-2cc457992bf5/dns-node-resolver/0.log" Apr 16 22:07:07.700359 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:07.700333 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/1.log" Apr 16 22:07:07.700750 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:07.700720 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/0.log" Apr 16 22:07:07.700802 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:07.700752 2576 generic.go:358] "Generic (PLEG): container finished" podID="ad6a094a-d7bf-42af-90ae-94731039404b" containerID="7539a8f475353af81fc32c6ecebef435853fe7cb86f24fc650bac92ef69685b9" exitCode=255 Apr 16 22:07:07.700845 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:07.700807 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" event={"ID":"ad6a094a-d7bf-42af-90ae-94731039404b","Type":"ContainerDied","Data":"7539a8f475353af81fc32c6ecebef435853fe7cb86f24fc650bac92ef69685b9"} Apr 16 22:07:07.700845 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:07.700833 2576 scope.go:117] "RemoveContainer" 
containerID="96b245f4b4df0ecbee3a4be103e6a8036b37d5b8c88e329973c76d76f99d041f" Apr 16 22:07:07.701041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:07.701028 2576 scope.go:117] "RemoveContainer" containerID="7539a8f475353af81fc32c6ecebef435853fe7cb86f24fc650bac92ef69685b9" Apr 16 22:07:07.701248 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:07.701226 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fcfwt_openshift-console-operator(ad6a094a-d7bf-42af-90ae-94731039404b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" podUID="ad6a094a-d7bf-42af-90ae-94731039404b" Apr 16 22:07:08.072150 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.072076 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lxr9z"] Apr 16 22:07:08.075160 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.075146 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lxr9z" Apr 16 22:07:08.076945 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.076925 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 22:07:08.077167 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.077149 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 22:07:08.077272 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.077189 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 22:07:08.077272 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.077258 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:07:08.077734 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.077719 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-d6nhf\"" Apr 16 22:07:08.085373 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.085352 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lxr9z"] Apr 16 22:07:08.158805 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.158777 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e30ac7-c0e6-4cd4-ab1f-3df7d3277790-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-lxr9z\" (UID: \"69e30ac7-c0e6-4cd4-ab1f-3df7d3277790\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lxr9z" Apr 16 22:07:08.158933 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.158841 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5bzk\" (UniqueName: \"kubernetes.io/projected/69e30ac7-c0e6-4cd4-ab1f-3df7d3277790-kube-api-access-x5bzk\") pod \"kube-storage-version-migrator-operator-6769c5d45-lxr9z\" (UID: \"69e30ac7-c0e6-4cd4-ab1f-3df7d3277790\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lxr9z" Apr 16 22:07:08.158933 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.158899 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e30ac7-c0e6-4cd4-ab1f-3df7d3277790-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-lxr9z\" (UID: \"69e30ac7-c0e6-4cd4-ab1f-3df7d3277790\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lxr9z" Apr 16 22:07:08.259400 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.259369 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5bzk\" (UniqueName: \"kubernetes.io/projected/69e30ac7-c0e6-4cd4-ab1f-3df7d3277790-kube-api-access-x5bzk\") pod \"kube-storage-version-migrator-operator-6769c5d45-lxr9z\" (UID: \"69e30ac7-c0e6-4cd4-ab1f-3df7d3277790\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lxr9z" Apr 16 22:07:08.259561 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.259412 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e30ac7-c0e6-4cd4-ab1f-3df7d3277790-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-lxr9z\" (UID: 
\"69e30ac7-c0e6-4cd4-ab1f-3df7d3277790\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lxr9z" Apr 16 22:07:08.259628 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.259560 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e30ac7-c0e6-4cd4-ab1f-3df7d3277790-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-lxr9z\" (UID: \"69e30ac7-c0e6-4cd4-ab1f-3df7d3277790\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lxr9z" Apr 16 22:07:08.260089 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.260067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69e30ac7-c0e6-4cd4-ab1f-3df7d3277790-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-lxr9z\" (UID: \"69e30ac7-c0e6-4cd4-ab1f-3df7d3277790\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lxr9z" Apr 16 22:07:08.261628 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.261607 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69e30ac7-c0e6-4cd4-ab1f-3df7d3277790-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-lxr9z\" (UID: \"69e30ac7-c0e6-4cd4-ab1f-3df7d3277790\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lxr9z" Apr 16 22:07:08.266533 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.266513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5bzk\" (UniqueName: \"kubernetes.io/projected/69e30ac7-c0e6-4cd4-ab1f-3df7d3277790-kube-api-access-x5bzk\") pod \"kube-storage-version-migrator-operator-6769c5d45-lxr9z\" (UID: \"69e30ac7-c0e6-4cd4-ab1f-3df7d3277790\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lxr9z" Apr 16 22:07:08.384573 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.384529 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lxr9z" Apr 16 22:07:08.411906 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.411879 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4nfgz_4591d12d-774b-43ce-a862-67018bf47f0c/node-ca/0.log" Apr 16 22:07:08.494005 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.493975 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lxr9z"] Apr 16 22:07:08.497485 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:07:08.497461 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69e30ac7_c0e6_4cd4_ab1f_3df7d3277790.slice/crio-57b59d3ff088fb2529658767abbb488a6f8174015f4de05c6e586059db1e01be WatchSource:0}: Error finding container 57b59d3ff088fb2529658767abbb488a6f8174015f4de05c6e586059db1e01be: Status 404 returned error can't find the container with id 57b59d3ff088fb2529658767abbb488a6f8174015f4de05c6e586059db1e01be Apr 16 22:07:08.704332 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.704241 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/1.log" Apr 16 22:07:08.704770 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.704620 2576 scope.go:117] "RemoveContainer" containerID="7539a8f475353af81fc32c6ecebef435853fe7cb86f24fc650bac92ef69685b9" Apr 16 22:07:08.704837 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:08.704808 2576 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fcfwt_openshift-console-operator(ad6a094a-d7bf-42af-90ae-94731039404b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" podUID="ad6a094a-d7bf-42af-90ae-94731039404b" Apr 16 22:07:08.705412 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.705394 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lxr9z" event={"ID":"69e30ac7-c0e6-4cd4-ab1f-3df7d3277790","Type":"ContainerStarted","Data":"57b59d3ff088fb2529658767abbb488a6f8174015f4de05c6e586059db1e01be"} Apr 16 22:07:08.965655 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.965567 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:08.965655 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:08.965612 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph" Apr 16 22:07:08.965841 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:08.965705 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 22:07:08.965841 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:08.965741 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle podName:51846908-299c-4e1f-b417-c0af1029a45f nodeName:}" failed. No retries permitted until 2026-04-16 22:07:16.965722276 +0000 UTC m=+141.226448378 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle") pod "router-default-567b8b8c4d-864ph" (UID: "51846908-299c-4e1f-b417-c0af1029a45f") : configmap references non-existent config key: service-ca.crt Apr 16 22:07:08.965841 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:08.965766 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs podName:51846908-299c-4e1f-b417-c0af1029a45f nodeName:}" failed. No retries permitted until 2026-04-16 22:07:16.965756701 +0000 UTC m=+141.226482783 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs") pod "router-default-567b8b8c4d-864ph" (UID: "51846908-299c-4e1f-b417-c0af1029a45f") : secret "router-metrics-certs-default" not found Apr 16 22:07:10.090991 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.090958 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4cml"] Apr 16 22:07:10.094469 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.094449 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4cml" Apr 16 22:07:10.096807 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.096775 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 16 22:07:10.096807 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.096797 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:07:10.096983 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.096839 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-zn9dh\"" Apr 16 22:07:10.096983 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.096935 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 16 22:07:10.097842 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.097822 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 16 22:07:10.101059 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.101021 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4cml"] Apr 16 22:07:10.175498 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.175466 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a08162c7-e360-41a6-85ba-6c0fffcd0001-serving-cert\") pod \"service-ca-operator-d6fc45fc5-r4cml\" (UID: \"a08162c7-e360-41a6-85ba-6c0fffcd0001\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4cml" Apr 16 22:07:10.175655 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.175600 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a08162c7-e360-41a6-85ba-6c0fffcd0001-config\") pod \"service-ca-operator-d6fc45fc5-r4cml\" (UID: \"a08162c7-e360-41a6-85ba-6c0fffcd0001\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4cml" Apr 16 22:07:10.175655 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.175648 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sttwl\" (UniqueName: \"kubernetes.io/projected/a08162c7-e360-41a6-85ba-6c0fffcd0001-kube-api-access-sttwl\") pod \"service-ca-operator-d6fc45fc5-r4cml\" (UID: \"a08162c7-e360-41a6-85ba-6c0fffcd0001\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4cml" Apr 16 22:07:10.275995 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.275957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a08162c7-e360-41a6-85ba-6c0fffcd0001-serving-cert\") pod \"service-ca-operator-d6fc45fc5-r4cml\" (UID: \"a08162c7-e360-41a6-85ba-6c0fffcd0001\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4cml" Apr 16 22:07:10.276177 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.276069 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a08162c7-e360-41a6-85ba-6c0fffcd0001-config\") pod \"service-ca-operator-d6fc45fc5-r4cml\" (UID: \"a08162c7-e360-41a6-85ba-6c0fffcd0001\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4cml" Apr 16 22:07:10.276177 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.276109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sttwl\" (UniqueName: \"kubernetes.io/projected/a08162c7-e360-41a6-85ba-6c0fffcd0001-kube-api-access-sttwl\") pod \"service-ca-operator-d6fc45fc5-r4cml\" 
(UID: \"a08162c7-e360-41a6-85ba-6c0fffcd0001\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4cml" Apr 16 22:07:10.276771 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.276744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a08162c7-e360-41a6-85ba-6c0fffcd0001-config\") pod \"service-ca-operator-d6fc45fc5-r4cml\" (UID: \"a08162c7-e360-41a6-85ba-6c0fffcd0001\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4cml" Apr 16 22:07:10.278666 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.278640 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a08162c7-e360-41a6-85ba-6c0fffcd0001-serving-cert\") pod \"service-ca-operator-d6fc45fc5-r4cml\" (UID: \"a08162c7-e360-41a6-85ba-6c0fffcd0001\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4cml" Apr 16 22:07:10.283148 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.283124 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sttwl\" (UniqueName: \"kubernetes.io/projected/a08162c7-e360-41a6-85ba-6c0fffcd0001-kube-api-access-sttwl\") pod \"service-ca-operator-d6fc45fc5-r4cml\" (UID: \"a08162c7-e360-41a6-85ba-6c0fffcd0001\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4cml" Apr 16 22:07:10.405642 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.405617 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4cml" Apr 16 22:07:10.521527 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.521499 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4cml"] Apr 16 22:07:10.524887 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:07:10.524856 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda08162c7_e360_41a6_85ba_6c0fffcd0001.slice/crio-0b51633ef6d482e643371e37a7cb0452451b85ed53cad11fdf8bd93aa039befa WatchSource:0}: Error finding container 0b51633ef6d482e643371e37a7cb0452451b85ed53cad11fdf8bd93aa039befa: Status 404 returned error can't find the container with id 0b51633ef6d482e643371e37a7cb0452451b85ed53cad11fdf8bd93aa039befa Apr 16 22:07:10.710501 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.710428 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lxr9z" event={"ID":"69e30ac7-c0e6-4cd4-ab1f-3df7d3277790","Type":"ContainerStarted","Data":"19cbbdfafa9ab6401e579031991fe46c9cbf625034734844c1a888e8caf0fd5b"} Apr 16 22:07:10.711520 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.711495 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4cml" event={"ID":"a08162c7-e360-41a6-85ba-6c0fffcd0001","Type":"ContainerStarted","Data":"0b51633ef6d482e643371e37a7cb0452451b85ed53cad11fdf8bd93aa039befa"} Apr 16 22:07:10.726784 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:10.726745 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lxr9z" podStartSLOduration=0.824804184 podStartE2EDuration="2.726729768s" podCreationTimestamp="2026-04-16 22:07:08 +0000 
UTC" firstStartedPulling="2026-04-16 22:07:08.499212634 +0000 UTC m=+132.759938717" lastFinishedPulling="2026-04-16 22:07:10.401138203 +0000 UTC m=+134.661864301" observedRunningTime="2026-04-16 22:07:10.726412847 +0000 UTC m=+134.987138943" watchObservedRunningTime="2026-04-16 22:07:10.726729768 +0000 UTC m=+134.987455875" Apr 16 22:07:12.718213 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:12.718176 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4cml" event={"ID":"a08162c7-e360-41a6-85ba-6c0fffcd0001","Type":"ContainerStarted","Data":"dfe41e0eccec717c1782831890cf222f6e77474ff36e65e6348a0ac18d094e8d"} Apr 16 22:07:12.734299 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:12.734248 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4cml" podStartSLOduration=1.105533415 podStartE2EDuration="2.734230352s" podCreationTimestamp="2026-04-16 22:07:10 +0000 UTC" firstStartedPulling="2026-04-16 22:07:10.526699453 +0000 UTC m=+134.787425536" lastFinishedPulling="2026-04-16 22:07:12.155396376 +0000 UTC m=+136.416122473" observedRunningTime="2026-04-16 22:07:12.733907186 +0000 UTC m=+136.994633293" watchObservedRunningTime="2026-04-16 22:07:12.734230352 +0000 UTC m=+136.994956458" Apr 16 22:07:14.389092 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:14.389061 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" Apr 16 22:07:14.389558 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:14.389098 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" Apr 16 22:07:14.389558 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:14.389552 2576 scope.go:117] "RemoveContainer" containerID="7539a8f475353af81fc32c6ecebef435853fe7cb86f24fc650bac92ef69685b9" 
Apr 16 22:07:14.389755 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:14.389734 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fcfwt_openshift-console-operator(ad6a094a-d7bf-42af-90ae-94731039404b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" podUID="ad6a094a-d7bf-42af-90ae-94731039404b" Apr 16 22:07:15.349365 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:15.349328 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-8cgvh"] Apr 16 22:07:15.352312 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:15.352286 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-8cgvh" Apr 16 22:07:15.354258 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:15.354236 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-cl927\"" Apr 16 22:07:15.354392 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:15.354243 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 22:07:15.354392 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:15.354280 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 22:07:15.354915 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:15.354901 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 22:07:15.354973 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:15.354931 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 22:07:15.357841 ip-10-0-130-26 
kubenswrapper[2576]: I0416 22:07:15.357824 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-8cgvh"] Apr 16 22:07:15.413786 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:15.413754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8cc640b4-5ee9-4a23-9800-9f0e3c5f353f-signing-cabundle\") pod \"service-ca-865cb79987-8cgvh\" (UID: \"8cc640b4-5ee9-4a23-9800-9f0e3c5f353f\") " pod="openshift-service-ca/service-ca-865cb79987-8cgvh" Apr 16 22:07:15.414114 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:15.413865 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8cc640b4-5ee9-4a23-9800-9f0e3c5f353f-signing-key\") pod \"service-ca-865cb79987-8cgvh\" (UID: \"8cc640b4-5ee9-4a23-9800-9f0e3c5f353f\") " pod="openshift-service-ca/service-ca-865cb79987-8cgvh" Apr 16 22:07:15.414114 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:15.413946 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g5gx\" (UniqueName: \"kubernetes.io/projected/8cc640b4-5ee9-4a23-9800-9f0e3c5f353f-kube-api-access-5g5gx\") pod \"service-ca-865cb79987-8cgvh\" (UID: \"8cc640b4-5ee9-4a23-9800-9f0e3c5f353f\") " pod="openshift-service-ca/service-ca-865cb79987-8cgvh" Apr 16 22:07:15.514481 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:15.514452 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5g5gx\" (UniqueName: \"kubernetes.io/projected/8cc640b4-5ee9-4a23-9800-9f0e3c5f353f-kube-api-access-5g5gx\") pod \"service-ca-865cb79987-8cgvh\" (UID: \"8cc640b4-5ee9-4a23-9800-9f0e3c5f353f\") " pod="openshift-service-ca/service-ca-865cb79987-8cgvh" Apr 16 22:07:15.514481 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:15.514482 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8cc640b4-5ee9-4a23-9800-9f0e3c5f353f-signing-cabundle\") pod \"service-ca-865cb79987-8cgvh\" (UID: \"8cc640b4-5ee9-4a23-9800-9f0e3c5f353f\") " pod="openshift-service-ca/service-ca-865cb79987-8cgvh" Apr 16 22:07:15.514659 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:15.514543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8cc640b4-5ee9-4a23-9800-9f0e3c5f353f-signing-key\") pod \"service-ca-865cb79987-8cgvh\" (UID: \"8cc640b4-5ee9-4a23-9800-9f0e3c5f353f\") " pod="openshift-service-ca/service-ca-865cb79987-8cgvh" Apr 16 22:07:15.515294 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:15.515277 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/8cc640b4-5ee9-4a23-9800-9f0e3c5f353f-signing-cabundle\") pod \"service-ca-865cb79987-8cgvh\" (UID: \"8cc640b4-5ee9-4a23-9800-9f0e3c5f353f\") " pod="openshift-service-ca/service-ca-865cb79987-8cgvh" Apr 16 22:07:15.516906 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:15.516887 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/8cc640b4-5ee9-4a23-9800-9f0e3c5f353f-signing-key\") pod \"service-ca-865cb79987-8cgvh\" (UID: \"8cc640b4-5ee9-4a23-9800-9f0e3c5f353f\") " pod="openshift-service-ca/service-ca-865cb79987-8cgvh" Apr 16 22:07:15.555713 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:15.555681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g5gx\" (UniqueName: \"kubernetes.io/projected/8cc640b4-5ee9-4a23-9800-9f0e3c5f353f-kube-api-access-5g5gx\") pod \"service-ca-865cb79987-8cgvh\" (UID: \"8cc640b4-5ee9-4a23-9800-9f0e3c5f353f\") " pod="openshift-service-ca/service-ca-865cb79987-8cgvh" Apr 16 22:07:15.661176 
ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:15.661099 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-8cgvh"
Apr 16 22:07:15.771111 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:15.771081 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-8cgvh"]
Apr 16 22:07:15.775876 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:07:15.775848 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cc640b4_5ee9_4a23_9800_9f0e3c5f353f.slice/crio-37744b0f40036d5aa8cf5242efecf0962894f8d74dcb0e324cd67cf9d735d165 WatchSource:0}: Error finding container 37744b0f40036d5aa8cf5242efecf0962894f8d74dcb0e324cd67cf9d735d165: Status 404 returned error can't find the container with id 37744b0f40036d5aa8cf5242efecf0962894f8d74dcb0e324cd67cf9d735d165
Apr 16 22:07:16.729343 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:16.729285 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-8cgvh" event={"ID":"8cc640b4-5ee9-4a23-9800-9f0e3c5f353f","Type":"ContainerStarted","Data":"2e9856ccd179ec34fdfcc328d6a3845b409084b766ed3966cecf00ac13662d81"}
Apr 16 22:07:16.729343 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:16.729348 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-8cgvh" event={"ID":"8cc640b4-5ee9-4a23-9800-9f0e3c5f353f","Type":"ContainerStarted","Data":"37744b0f40036d5aa8cf5242efecf0962894f8d74dcb0e324cd67cf9d735d165"}
Apr 16 22:07:16.745190 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:16.745149 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-8cgvh" podStartSLOduration=1.745135181 podStartE2EDuration="1.745135181s" podCreationTimestamp="2026-04-16 22:07:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:07:16.744300779 +0000 UTC m=+141.005026878" watchObservedRunningTime="2026-04-16 22:07:16.745135181 +0000 UTC m=+141.005861303"
Apr 16 22:07:17.029016 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:17.028927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph"
Apr 16 22:07:17.029016 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:17.028990 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph"
Apr 16 22:07:17.029218 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:17.029095 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 22:07:17.029218 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:17.029118 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle podName:51846908-299c-4e1f-b417-c0af1029a45f nodeName:}" failed. No retries permitted until 2026-04-16 22:07:33.02909617 +0000 UTC m=+157.289822253 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle") pod "router-default-567b8b8c4d-864ph" (UID: "51846908-299c-4e1f-b417-c0af1029a45f") : configmap references non-existent config key: service-ca.crt
Apr 16 22:07:17.029218 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:17.029145 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs podName:51846908-299c-4e1f-b417-c0af1029a45f nodeName:}" failed. No retries permitted until 2026-04-16 22:07:33.029136785 +0000 UTC m=+157.289862868 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs") pod "router-default-567b8b8c4d-864ph" (UID: "51846908-299c-4e1f-b417-c0af1029a45f") : secret "router-metrics-certs-default" not found
Apr 16 22:07:27.278067 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:27.278036 2576 scope.go:117] "RemoveContainer" containerID="7539a8f475353af81fc32c6ecebef435853fe7cb86f24fc650bac92ef69685b9"
Apr 16 22:07:27.757425 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:27.757399 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/2.log"
Apr 16 22:07:27.757800 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:27.757782 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/1.log"
Apr 16 22:07:27.757895 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:27.757821 2576 generic.go:358] "Generic (PLEG): container finished" podID="ad6a094a-d7bf-42af-90ae-94731039404b" containerID="4a0ec061a10c7079915ae41f63e3793ffd88c7eee02b036053d3adc0c419273a" exitCode=255
Apr 16 22:07:27.757895 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:27.757856 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" event={"ID":"ad6a094a-d7bf-42af-90ae-94731039404b","Type":"ContainerDied","Data":"4a0ec061a10c7079915ae41f63e3793ffd88c7eee02b036053d3adc0c419273a"}
Apr 16 22:07:27.757895 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:27.757890 2576 scope.go:117] "RemoveContainer" containerID="7539a8f475353af81fc32c6ecebef435853fe7cb86f24fc650bac92ef69685b9"
Apr 16 22:07:27.758183 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:27.758166 2576 scope.go:117] "RemoveContainer" containerID="4a0ec061a10c7079915ae41f63e3793ffd88c7eee02b036053d3adc0c419273a"
Apr 16 22:07:27.758379 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:27.758358 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-fcfwt_openshift-console-operator(ad6a094a-d7bf-42af-90ae-94731039404b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" podUID="ad6a094a-d7bf-42af-90ae-94731039404b"
Apr 16 22:07:28.761970 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:28.761937 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/2.log"
Apr 16 22:07:31.648598 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:31.648560 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-xdjs7" podUID="049ab1ab-5b48-4b74-9527-0075f2bb7467"
Apr 16 22:07:31.661237 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:31.661215 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-6dv5d" podUID="6d61274e-1ceb-496e-8a75-17916b110ed5"
Apr 16 22:07:31.768412 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:31.768387 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xdjs7"
Apr 16 22:07:32.290408 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:32.290376 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-nrljs" podUID="2f7c8d95-90cb-497a-8866-d2c45b825b72"
Apr 16 22:07:33.049377 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:33.049333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph"
Apr 16 22:07:33.049762 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:33.049516 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph"
Apr 16 22:07:33.050188 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:33.050165 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51846908-299c-4e1f-b417-c0af1029a45f-service-ca-bundle\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph"
Apr 16 22:07:33.051855 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:33.051836 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51846908-299c-4e1f-b417-c0af1029a45f-metrics-certs\") pod \"router-default-567b8b8c4d-864ph\" (UID: \"51846908-299c-4e1f-b417-c0af1029a45f\") " pod="openshift-ingress/router-default-567b8b8c4d-864ph"
Apr 16 22:07:33.058758 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:33.058735 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-567b8b8c4d-864ph"
Apr 16 22:07:33.174592 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:33.174567 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-567b8b8c4d-864ph"]
Apr 16 22:07:33.177411 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:07:33.177382 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51846908_299c_4e1f_b417_c0af1029a45f.slice/crio-dc15a844185ca746e38c0d0c2f9f3b9474d0dc8d7aefb147bdb5fef99aee2a32 WatchSource:0}: Error finding container dc15a844185ca746e38c0d0c2f9f3b9474d0dc8d7aefb147bdb5fef99aee2a32: Status 404 returned error can't find the container with id dc15a844185ca746e38c0d0c2f9f3b9474d0dc8d7aefb147bdb5fef99aee2a32
Apr 16 22:07:33.774669 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:33.774632 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-567b8b8c4d-864ph" event={"ID":"51846908-299c-4e1f-b417-c0af1029a45f","Type":"ContainerStarted","Data":"822c3af7c1a2d07cd008b7d2c4afbe58e54b600ceb132df2f90abfddd44f62e4"}
Apr 16 22:07:33.774669 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:33.774669 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-567b8b8c4d-864ph" event={"ID":"51846908-299c-4e1f-b417-c0af1029a45f","Type":"ContainerStarted","Data":"dc15a844185ca746e38c0d0c2f9f3b9474d0dc8d7aefb147bdb5fef99aee2a32"}
Apr 16 22:07:33.790665 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:33.790622 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-567b8b8c4d-864ph" podStartSLOduration=32.790608235 podStartE2EDuration="32.790608235s" podCreationTimestamp="2026-04-16 22:07:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:07:33.789609278 +0000 UTC m=+158.050335382" watchObservedRunningTime="2026-04-16 22:07:33.790608235 +0000 UTC m=+158.051334339"
Apr 16 22:07:34.059143 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:34.059070 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-567b8b8c4d-864ph"
Apr 16 22:07:34.062265 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:34.062235 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-567b8b8c4d-864ph"
Apr 16 22:07:34.389278 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:34.389256 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt"
Apr 16 22:07:34.389278 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:34.389282 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt"
Apr 16 22:07:34.389612 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:34.389599 2576 scope.go:117] "RemoveContainer" containerID="4a0ec061a10c7079915ae41f63e3793ffd88c7eee02b036053d3adc0c419273a"
Apr 16 22:07:34.389780 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:34.389763 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-fcfwt_openshift-console-operator(ad6a094a-d7bf-42af-90ae-94731039404b)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" podUID="ad6a094a-d7bf-42af-90ae-94731039404b"
Apr 16 22:07:34.777456 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:34.777377 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-567b8b8c4d-864ph"
Apr 16 22:07:34.778740 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:34.778721 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-567b8b8c4d-864ph"
Apr 16 22:07:36.320653 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.320619 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-87bvh"]
Apr 16 22:07:36.323740 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.323717 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-87bvh"
Apr 16 22:07:36.326533 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.326512 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 22:07:36.326635 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.326542 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-wk4hv\""
Apr 16 22:07:36.326635 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.326595 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 22:07:36.334061 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.334002 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-87bvh"]
Apr 16 22:07:36.422422 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.422394 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7958cfb6f-htkcm"]
Apr 16 22:07:36.425261 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.425244 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.430566 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.430543 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 22:07:36.431823 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.431798 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 22:07:36.434630 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.434608 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6952s\""
Apr 16 22:07:36.434721 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.434685 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 22:07:36.440849 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.440830 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 22:07:36.452269 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.452243 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7958cfb6f-htkcm"]
Apr 16 22:07:36.476505 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.476481 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ab00035f-c880-4d27-99da-a870c1adb974-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-87bvh\" (UID: \"ab00035f-c880-4d27-99da-a870c1adb974\") " pod="openshift-insights/insights-runtime-extractor-87bvh"
Apr 16 22:07:36.476591 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.476521 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ab00035f-c880-4d27-99da-a870c1adb974-crio-socket\") pod \"insights-runtime-extractor-87bvh\" (UID: \"ab00035f-c880-4d27-99da-a870c1adb974\") " pod="openshift-insights/insights-runtime-extractor-87bvh"
Apr 16 22:07:36.476631 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.476588 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqwcg\" (UniqueName: \"kubernetes.io/projected/ab00035f-c880-4d27-99da-a870c1adb974-kube-api-access-nqwcg\") pod \"insights-runtime-extractor-87bvh\" (UID: \"ab00035f-c880-4d27-99da-a870c1adb974\") " pod="openshift-insights/insights-runtime-extractor-87bvh"
Apr 16 22:07:36.476631 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.476612 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ab00035f-c880-4d27-99da-a870c1adb974-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-87bvh\" (UID: \"ab00035f-c880-4d27-99da-a870c1adb974\") " pod="openshift-insights/insights-runtime-extractor-87bvh"
Apr 16 22:07:36.476832 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.476692 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ab00035f-c880-4d27-99da-a870c1adb974-data-volume\") pod \"insights-runtime-extractor-87bvh\" (UID: \"ab00035f-c880-4d27-99da-a870c1adb974\") " pod="openshift-insights/insights-runtime-extractor-87bvh"
Apr 16 22:07:36.577770 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.577714 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ab00035f-c880-4d27-99da-a870c1adb974-crio-socket\") pod \"insights-runtime-extractor-87bvh\" (UID: \"ab00035f-c880-4d27-99da-a870c1adb974\") " pod="openshift-insights/insights-runtime-extractor-87bvh"
Apr 16 22:07:36.577770 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.577750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7"
Apr 16 22:07:36.577929 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.577779 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-registry-certificates\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.577929 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.577830 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ab00035f-c880-4d27-99da-a870c1adb974-crio-socket\") pod \"insights-runtime-extractor-87bvh\" (UID: \"ab00035f-c880-4d27-99da-a870c1adb974\") " pod="openshift-insights/insights-runtime-extractor-87bvh"
Apr 16 22:07:36.577929 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.577898 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8sk9\" (UniqueName: \"kubernetes.io/projected/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-kube-api-access-f8sk9\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.578087 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.577932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-bound-sa-token\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.578087 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.577954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqwcg\" (UniqueName: \"kubernetes.io/projected/ab00035f-c880-4d27-99da-a870c1adb974-kube-api-access-nqwcg\") pod \"insights-runtime-extractor-87bvh\" (UID: \"ab00035f-c880-4d27-99da-a870c1adb974\") " pod="openshift-insights/insights-runtime-extractor-87bvh"
Apr 16 22:07:36.578087 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.577980 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-image-registry-private-configuration\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.578087 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.578007 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ab00035f-c880-4d27-99da-a870c1adb974-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-87bvh\" (UID: \"ab00035f-c880-4d27-99da-a870c1adb974\") " pod="openshift-insights/insights-runtime-extractor-87bvh"
Apr 16 22:07:36.578087 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.578035 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-registry-tls\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.578087 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.578064 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert\") pod \"ingress-canary-6dv5d\" (UID: \"6d61274e-1ceb-496e-8a75-17916b110ed5\") " pod="openshift-ingress-canary/ingress-canary-6dv5d"
Apr 16 22:07:36.578087 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.578080 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-ca-trust-extracted\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.578441 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.578108 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ab00035f-c880-4d27-99da-a870c1adb974-data-volume\") pod \"insights-runtime-extractor-87bvh\" (UID: \"ab00035f-c880-4d27-99da-a870c1adb974\") " pod="openshift-insights/insights-runtime-extractor-87bvh"
Apr 16 22:07:36.578441 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.578237 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ab00035f-c880-4d27-99da-a870c1adb974-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-87bvh\" (UID: \"ab00035f-c880-4d27-99da-a870c1adb974\") " pod="openshift-insights/insights-runtime-extractor-87bvh"
Apr 16 22:07:36.578441 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.578276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-installation-pull-secrets\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.578441 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.578336 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-trusted-ca\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.578654 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.578474 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ab00035f-c880-4d27-99da-a870c1adb974-data-volume\") pod \"insights-runtime-extractor-87bvh\" (UID: \"ab00035f-c880-4d27-99da-a870c1adb974\") " pod="openshift-insights/insights-runtime-extractor-87bvh"
Apr 16 22:07:36.578775 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.578750 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ab00035f-c880-4d27-99da-a870c1adb974-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-87bvh\" (UID: \"ab00035f-c880-4d27-99da-a870c1adb974\") " pod="openshift-insights/insights-runtime-extractor-87bvh"
Apr 16 22:07:36.580182 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.580164 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/049ab1ab-5b48-4b74-9527-0075f2bb7467-metrics-tls\") pod \"dns-default-xdjs7\" (UID: \"049ab1ab-5b48-4b74-9527-0075f2bb7467\") " pod="openshift-dns/dns-default-xdjs7"
Apr 16 22:07:36.580256 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.580238 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ab00035f-c880-4d27-99da-a870c1adb974-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-87bvh\" (UID: \"ab00035f-c880-4d27-99da-a870c1adb974\") " pod="openshift-insights/insights-runtime-extractor-87bvh"
Apr 16 22:07:36.580538 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.580521 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d61274e-1ceb-496e-8a75-17916b110ed5-cert\") pod \"ingress-canary-6dv5d\" (UID: \"6d61274e-1ceb-496e-8a75-17916b110ed5\") " pod="openshift-ingress-canary/ingress-canary-6dv5d"
Apr 16 22:07:36.586142 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.586122 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqwcg\" (UniqueName: \"kubernetes.io/projected/ab00035f-c880-4d27-99da-a870c1adb974-kube-api-access-nqwcg\") pod \"insights-runtime-extractor-87bvh\" (UID: \"ab00035f-c880-4d27-99da-a870c1adb974\") " pod="openshift-insights/insights-runtime-extractor-87bvh"
Apr 16 22:07:36.633154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.633134 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-87bvh"
Apr 16 22:07:36.679731 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.679682 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8sk9\" (UniqueName: \"kubernetes.io/projected/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-kube-api-access-f8sk9\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.680481 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.679736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-bound-sa-token\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.680481 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.679769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-image-registry-private-configuration\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.680481 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.679811 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-registry-tls\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.680481 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.679846 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-ca-trust-extracted\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.680481 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.679889 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-installation-pull-secrets\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.680481 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.679920 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-trusted-ca\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.680481 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.679976 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-registry-certificates\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.680481 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.680334 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-ca-trust-extracted\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.682922 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.681235 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-registry-certificates\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.682922 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.682394 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-trusted-ca\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.684478 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.683795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-registry-tls\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.684478 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.684171 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-image-registry-private-configuration\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.686174 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.686122 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-installation-pull-secrets\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.689209 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.688580 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-bound-sa-token\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.690407 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.690382 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8sk9\" (UniqueName: \"kubernetes.io/projected/e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954-kube-api-access-f8sk9\") pod \"image-registry-7958cfb6f-htkcm\" (UID: \"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954\") " pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.738601 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.738567 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7958cfb6f-htkcm"
Apr 16 22:07:36.763685 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.763654 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-87bvh"]
Apr 16 22:07:36.768414 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:07:36.768383 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab00035f_c880_4d27_99da_a870c1adb974.slice/crio-e4e1e9190709faf1a02e47627c85bceb610e8b8f4a1c5e81620de73e150a0c8c WatchSource:0}: Error finding container e4e1e9190709faf1a02e47627c85bceb610e8b8f4a1c5e81620de73e150a0c8c: Status 404 returned error can't find the container with id e4e1e9190709faf1a02e47627c85bceb610e8b8f4a1c5e81620de73e150a0c8c
Apr 16 22:07:36.782734 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.782704 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-87bvh" event={"ID":"ab00035f-c880-4d27-99da-a870c1adb974","Type":"ContainerStarted","Data":"e4e1e9190709faf1a02e47627c85bceb610e8b8f4a1c5e81620de73e150a0c8c"}
Apr 16 22:07:36.856154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.856128 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7958cfb6f-htkcm"]
Apr 16 22:07:36.859567 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:07:36.859537 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7e1d96e_bc71_418f_bb1f_ed1d7b4b4954.slice/crio-bd020fe800b63fd206947a78475f3721068c33b627199f9c767a8c894dba5e61 WatchSource:0}: Error finding container bd020fe800b63fd206947a78475f3721068c33b627199f9c767a8c894dba5e61: Status 404 returned error can't find the container with id bd020fe800b63fd206947a78475f3721068c33b627199f9c767a8c894dba5e61
Apr 16 22:07:36.870886 ip-10-0-130-26 kubenswrapper[2576]:
I0416 22:07:36.870867 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vts6k\"" Apr 16 22:07:36.879298 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.879279 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-xdjs7" Apr 16 22:07:36.991324 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:36.991283 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-xdjs7"] Apr 16 22:07:36.995393 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:07:36.995372 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod049ab1ab_5b48_4b74_9527_0075f2bb7467.slice/crio-a314e82f1d2c2c6d3ecbf518cc45fbd40c4b2f0d2704fe8185b3e5e7af54b317 WatchSource:0}: Error finding container a314e82f1d2c2c6d3ecbf518cc45fbd40c4b2f0d2704fe8185b3e5e7af54b317: Status 404 returned error can't find the container with id a314e82f1d2c2c6d3ecbf518cc45fbd40c4b2f0d2704fe8185b3e5e7af54b317 Apr 16 22:07:37.786510 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:37.786419 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xdjs7" event={"ID":"049ab1ab-5b48-4b74-9527-0075f2bb7467","Type":"ContainerStarted","Data":"a314e82f1d2c2c6d3ecbf518cc45fbd40c4b2f0d2704fe8185b3e5e7af54b317"} Apr 16 22:07:37.788141 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:37.788116 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-87bvh" event={"ID":"ab00035f-c880-4d27-99da-a870c1adb974","Type":"ContainerStarted","Data":"0b03df9d355d536b5d060fa52be1e1655b12e58ae33b8acd6705f3961f60050e"} Apr 16 22:07:37.788267 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:37.788146 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-87bvh" 
event={"ID":"ab00035f-c880-4d27-99da-a870c1adb974","Type":"ContainerStarted","Data":"84fb8418270080b0afc1e06df0e64d328d4177016cf186360b51ae755725704e"} Apr 16 22:07:37.789435 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:37.789410 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7958cfb6f-htkcm" event={"ID":"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954","Type":"ContainerStarted","Data":"2d10be9ca4eea2b870bbc06017bfb1ebf595df232c729d633ad72debad42a845"} Apr 16 22:07:37.789512 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:37.789441 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7958cfb6f-htkcm" event={"ID":"e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954","Type":"ContainerStarted","Data":"bd020fe800b63fd206947a78475f3721068c33b627199f9c767a8c894dba5e61"} Apr 16 22:07:37.789593 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:37.789579 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7958cfb6f-htkcm" Apr 16 22:07:37.807290 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:37.807249 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7958cfb6f-htkcm" podStartSLOduration=1.807236353 podStartE2EDuration="1.807236353s" podCreationTimestamp="2026-04-16 22:07:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:07:37.806327429 +0000 UTC m=+162.067053527" watchObservedRunningTime="2026-04-16 22:07:37.807236353 +0000 UTC m=+162.067962452" Apr 16 22:07:39.796073 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:39.796036 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xdjs7" 
event={"ID":"049ab1ab-5b48-4b74-9527-0075f2bb7467","Type":"ContainerStarted","Data":"76cddf6dcb5eaa9f6ae794f1a9242d71197a3faa0e2cd759fdbbc2527c210515"} Apr 16 22:07:39.796073 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:39.796076 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-xdjs7" event={"ID":"049ab1ab-5b48-4b74-9527-0075f2bb7467","Type":"ContainerStarted","Data":"651e09927155b0d772c8bb2b3f772d7eaec0bef4bc006470c71d721f57f6ad31"} Apr 16 22:07:39.796586 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:39.796173 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-xdjs7" Apr 16 22:07:39.797844 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:39.797822 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-87bvh" event={"ID":"ab00035f-c880-4d27-99da-a870c1adb974","Type":"ContainerStarted","Data":"d24fbe068cc95e782eb82e68f41e2ffae22e9a190580fae36a61f2c11d5cc356"} Apr 16 22:07:39.810863 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:39.810823 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-xdjs7" podStartSLOduration=129.873432925 podStartE2EDuration="2m11.810811875s" podCreationTimestamp="2026-04-16 22:05:28 +0000 UTC" firstStartedPulling="2026-04-16 22:07:36.997080984 +0000 UTC m=+161.257807067" lastFinishedPulling="2026-04-16 22:07:38.93445993 +0000 UTC m=+163.195186017" observedRunningTime="2026-04-16 22:07:39.809705814 +0000 UTC m=+164.070431918" watchObservedRunningTime="2026-04-16 22:07:39.810811875 +0000 UTC m=+164.071537980" Apr 16 22:07:39.824958 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:39.824898 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-87bvh" podStartSLOduration=1.7276634899999999 podStartE2EDuration="3.824886934s" podCreationTimestamp="2026-04-16 22:07:36 +0000 UTC" 
firstStartedPulling="2026-04-16 22:07:36.839600151 +0000 UTC m=+161.100326248" lastFinishedPulling="2026-04-16 22:07:38.936823609 +0000 UTC m=+163.197549692" observedRunningTime="2026-04-16 22:07:39.824247244 +0000 UTC m=+164.084973348" watchObservedRunningTime="2026-04-16 22:07:39.824886934 +0000 UTC m=+164.085613062" Apr 16 22:07:40.508390 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.508344 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-t8f88"] Apr 16 22:07:40.511400 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.511384 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-t8f88" Apr 16 22:07:40.514272 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.514249 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 22:07:40.514272 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.514262 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 22:07:40.514443 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.514357 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 22:07:40.514443 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.514392 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-zttk6\"" Apr 16 22:07:40.514443 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.514428 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 22:07:40.514591 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.514434 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 22:07:40.518889 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.518864 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-t8f88"] Apr 16 22:07:40.602524 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.602487 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t8f88\" (UID: \"54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t8f88" Apr 16 22:07:40.602631 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.602543 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-t8f88\" (UID: \"54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t8f88" Apr 16 22:07:40.602631 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.602603 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z75fp\" (UniqueName: \"kubernetes.io/projected/54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3-kube-api-access-z75fp\") pod \"prometheus-operator-5676c8c784-t8f88\" (UID: \"54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t8f88" Apr 16 22:07:40.602705 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.602681 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-t8f88\" (UID: \"54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t8f88" Apr 16 22:07:40.703182 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.703154 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-t8f88\" (UID: \"54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t8f88" Apr 16 22:07:40.703279 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.703196 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t8f88\" (UID: \"54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t8f88" Apr 16 22:07:40.703279 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.703230 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-t8f88\" (UID: \"54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t8f88" Apr 16 22:07:40.703279 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.703257 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z75fp\" (UniqueName: \"kubernetes.io/projected/54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3-kube-api-access-z75fp\") pod \"prometheus-operator-5676c8c784-t8f88\" (UID: 
\"54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t8f88" Apr 16 22:07:40.703765 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.703734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-t8f88\" (UID: \"54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t8f88" Apr 16 22:07:40.705647 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.705618 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-t8f88\" (UID: \"54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t8f88" Apr 16 22:07:40.705647 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.705644 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t8f88\" (UID: \"54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t8f88" Apr 16 22:07:40.710771 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.710749 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z75fp\" (UniqueName: \"kubernetes.io/projected/54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3-kube-api-access-z75fp\") pod \"prometheus-operator-5676c8c784-t8f88\" (UID: \"54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t8f88" Apr 16 22:07:40.820328 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.820244 2576 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-t8f88" Apr 16 22:07:40.928616 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:40.928590 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-t8f88"] Apr 16 22:07:40.931785 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:07:40.931763 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54ee9079_d4a2_4fab_a8d3_2b0d9cc43da3.slice/crio-cc3fcefa4727843327aad273a95837d1be43378d55950d76ad0850a4dba3a976 WatchSource:0}: Error finding container cc3fcefa4727843327aad273a95837d1be43378d55950d76ad0850a4dba3a976: Status 404 returned error can't find the container with id cc3fcefa4727843327aad273a95837d1be43378d55950d76ad0850a4dba3a976 Apr 16 22:07:41.805198 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:41.805164 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-t8f88" event={"ID":"54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3","Type":"ContainerStarted","Data":"cc3fcefa4727843327aad273a95837d1be43378d55950d76ad0850a4dba3a976"} Apr 16 22:07:42.809192 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:42.809153 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-t8f88" event={"ID":"54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3","Type":"ContainerStarted","Data":"cf20cf6696e520625829ac3b1b8bd39b670eae54d84fdc8e4a382e10287b55db"} Apr 16 22:07:42.809192 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:42.809198 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-t8f88" event={"ID":"54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3","Type":"ContainerStarted","Data":"83274efc6d9ef518fbe7fa627623464504233f0fdc9512d4929babebe3bf556a"} Apr 16 22:07:42.823664 ip-10-0-130-26 kubenswrapper[2576]: 
I0416 22:07:42.823622 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-t8f88" podStartSLOduration=1.708024741 podStartE2EDuration="2.823608744s" podCreationTimestamp="2026-04-16 22:07:40 +0000 UTC" firstStartedPulling="2026-04-16 22:07:40.934040001 +0000 UTC m=+165.194766084" lastFinishedPulling="2026-04-16 22:07:42.04962399 +0000 UTC m=+166.310350087" observedRunningTime="2026-04-16 22:07:42.823055829 +0000 UTC m=+167.083781934" watchObservedRunningTime="2026-04-16 22:07:42.823608744 +0000 UTC m=+167.084334848" Apr 16 22:07:43.277530 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:43.277499 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6dv5d" Apr 16 22:07:43.279511 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:43.279490 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-q2drx\"" Apr 16 22:07:43.288204 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:43.288189 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6dv5d" Apr 16 22:07:43.400346 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:43.400322 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6dv5d"] Apr 16 22:07:43.403042 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:07:43.403017 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d61274e_1ceb_496e_8a75_17916b110ed5.slice/crio-90e461053068b40c95e530d5f6f216af126cd6b738dad1673801d142d02dba91 WatchSource:0}: Error finding container 90e461053068b40c95e530d5f6f216af126cd6b738dad1673801d142d02dba91: Status 404 returned error can't find the container with id 90e461053068b40c95e530d5f6f216af126cd6b738dad1673801d142d02dba91 Apr 16 22:07:43.813702 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:43.813605 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6dv5d" event={"ID":"6d61274e-1ceb-496e-8a75-17916b110ed5","Type":"ContainerStarted","Data":"90e461053068b40c95e530d5f6f216af126cd6b738dad1673801d142d02dba91"} Apr 16 22:07:44.837513 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.837042 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-dsspm"] Apr 16 22:07:44.857980 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.856624 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-fmjz2"] Apr 16 22:07:44.857980 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.856830 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:44.859706 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.858837 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-2nxml\"" Apr 16 22:07:44.859706 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.859076 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 22:07:44.859706 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.859361 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 22:07:44.859706 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.859554 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 22:07:44.869334 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.868455 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-dsspm"] Apr 16 22:07:44.869334 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.868592 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:44.871722 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.870840 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 22:07:44.871722 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.871055 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-gz4kv\"" Apr 16 22:07:44.871722 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.871334 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 22:07:44.871722 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.871570 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 22:07:44.944431 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.943771 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvmjf\" (UniqueName: \"kubernetes.io/projected/d91998a7-1fa5-477b-8779-9c8df24c3680-kube-api-access-pvmjf\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:44.944431 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.943834 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d91998a7-1fa5-477b-8779-9c8df24c3680-node-exporter-textfile\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:44.944431 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.943859 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-nc2vn\" (UniqueName: \"kubernetes.io/projected/d7ac4c50-4d27-4e9e-8655-01b821d74833-kube-api-access-nc2vn\") pod \"kube-state-metrics-69db897b98-dsspm\" (UID: \"d7ac4c50-4d27-4e9e-8655-01b821d74833\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:44.944431 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.943889 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d91998a7-1fa5-477b-8779-9c8df24c3680-node-exporter-tls\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:44.944431 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.943912 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d91998a7-1fa5-477b-8779-9c8df24c3680-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:44.944431 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.943943 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d91998a7-1fa5-477b-8779-9c8df24c3680-metrics-client-ca\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:44.944431 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.943965 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d7ac4c50-4d27-4e9e-8655-01b821d74833-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-dsspm\" (UID: \"d7ac4c50-4d27-4e9e-8655-01b821d74833\") " 
pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:44.944431 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.943995 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d91998a7-1fa5-477b-8779-9c8df24c3680-sys\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:44.944431 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.944020 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d91998a7-1fa5-477b-8779-9c8df24c3680-node-exporter-accelerators-collector-config\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:44.944431 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.944052 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d91998a7-1fa5-477b-8779-9c8df24c3680-root\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:44.944431 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.944074 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d91998a7-1fa5-477b-8779-9c8df24c3680-node-exporter-wtmp\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:44.944431 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.944111 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/d7ac4c50-4d27-4e9e-8655-01b821d74833-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-dsspm\" (UID: \"d7ac4c50-4d27-4e9e-8655-01b821d74833\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:44.944431 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.944145 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d7ac4c50-4d27-4e9e-8655-01b821d74833-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-dsspm\" (UID: \"d7ac4c50-4d27-4e9e-8655-01b821d74833\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:44.944431 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.944202 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d7ac4c50-4d27-4e9e-8655-01b821d74833-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-dsspm\" (UID: \"d7ac4c50-4d27-4e9e-8655-01b821d74833\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:44.944431 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:44.944235 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d7ac4c50-4d27-4e9e-8655-01b821d74833-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-dsspm\" (UID: \"d7ac4c50-4d27-4e9e-8655-01b821d74833\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:45.045144 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.045085 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pvmjf\" (UniqueName: 
\"kubernetes.io/projected/d91998a7-1fa5-477b-8779-9c8df24c3680-kube-api-access-pvmjf\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:45.045336 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.045149 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d91998a7-1fa5-477b-8779-9c8df24c3680-node-exporter-textfile\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:45.045336 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.045193 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nc2vn\" (UniqueName: \"kubernetes.io/projected/d7ac4c50-4d27-4e9e-8655-01b821d74833-kube-api-access-nc2vn\") pod \"kube-state-metrics-69db897b98-dsspm\" (UID: \"d7ac4c50-4d27-4e9e-8655-01b821d74833\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:45.045336 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.045223 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d91998a7-1fa5-477b-8779-9c8df24c3680-node-exporter-tls\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:45.045336 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.045250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d91998a7-1fa5-477b-8779-9c8df24c3680-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:45.045336 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.045281 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d91998a7-1fa5-477b-8779-9c8df24c3680-metrics-client-ca\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:45.045336 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.045320 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d7ac4c50-4d27-4e9e-8655-01b821d74833-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-dsspm\" (UID: \"d7ac4c50-4d27-4e9e-8655-01b821d74833\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:45.045643 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.045352 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d91998a7-1fa5-477b-8779-9c8df24c3680-sys\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:45.045643 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.045379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d91998a7-1fa5-477b-8779-9c8df24c3680-node-exporter-accelerators-collector-config\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:45.045643 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.045412 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d91998a7-1fa5-477b-8779-9c8df24c3680-root\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 
22:07:45.045643 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.045434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d91998a7-1fa5-477b-8779-9c8df24c3680-node-exporter-wtmp\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:45.045643 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.045458 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d7ac4c50-4d27-4e9e-8655-01b821d74833-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-dsspm\" (UID: \"d7ac4c50-4d27-4e9e-8655-01b821d74833\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:45.045643 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.045488 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d7ac4c50-4d27-4e9e-8655-01b821d74833-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-dsspm\" (UID: \"d7ac4c50-4d27-4e9e-8655-01b821d74833\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:45.045643 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.045541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d7ac4c50-4d27-4e9e-8655-01b821d74833-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-dsspm\" (UID: \"d7ac4c50-4d27-4e9e-8655-01b821d74833\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:45.045643 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.045578 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" 
(UniqueName: \"kubernetes.io/configmap/d7ac4c50-4d27-4e9e-8655-01b821d74833-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-dsspm\" (UID: \"d7ac4c50-4d27-4e9e-8655-01b821d74833\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:45.046224 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.046201 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d7ac4c50-4d27-4e9e-8655-01b821d74833-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-dsspm\" (UID: \"d7ac4c50-4d27-4e9e-8655-01b821d74833\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:45.046277 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.046258 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d7ac4c50-4d27-4e9e-8655-01b821d74833-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-dsspm\" (UID: \"d7ac4c50-4d27-4e9e-8655-01b821d74833\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:45.046363 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.046349 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d91998a7-1fa5-477b-8779-9c8df24c3680-sys\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:45.046502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.046486 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d91998a7-1fa5-477b-8779-9c8df24c3680-node-exporter-textfile\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 
22:07:45.046809 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:45.046795 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 22:07:45.046850 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.046814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d91998a7-1fa5-477b-8779-9c8df24c3680-node-exporter-accelerators-collector-config\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:45.046883 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:45.046851 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d91998a7-1fa5-477b-8779-9c8df24c3680-node-exporter-tls podName:d91998a7-1fa5-477b-8779-9c8df24c3680 nodeName:}" failed. No retries permitted until 2026-04-16 22:07:45.54683346 +0000 UTC m=+169.807559547 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/d91998a7-1fa5-477b-8779-9c8df24c3680-node-exporter-tls") pod "node-exporter-fmjz2" (UID: "d91998a7-1fa5-477b-8779-9c8df24c3680") : secret "node-exporter-tls" not found Apr 16 22:07:45.046925 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.046874 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d91998a7-1fa5-477b-8779-9c8df24c3680-root\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:45.047024 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.047007 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d91998a7-1fa5-477b-8779-9c8df24c3680-node-exporter-wtmp\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:45.047465 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.047265 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d7ac4c50-4d27-4e9e-8655-01b821d74833-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-dsspm\" (UID: \"d7ac4c50-4d27-4e9e-8655-01b821d74833\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:45.047465 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:45.047386 2576 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 22:07:45.047465 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:07:45.047432 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d7ac4c50-4d27-4e9e-8655-01b821d74833-kube-state-metrics-tls podName:d7ac4c50-4d27-4e9e-8655-01b821d74833 nodeName:}" failed. 
No retries permitted until 2026-04-16 22:07:45.547416393 +0000 UTC m=+169.808142495 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/d7ac4c50-4d27-4e9e-8655-01b821d74833-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-dsspm" (UID: "d7ac4c50-4d27-4e9e-8655-01b821d74833") : secret "kube-state-metrics-tls" not found Apr 16 22:07:45.047687 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.047631 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d91998a7-1fa5-477b-8779-9c8df24c3680-metrics-client-ca\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:45.049971 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.049949 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d7ac4c50-4d27-4e9e-8655-01b821d74833-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-dsspm\" (UID: \"d7ac4c50-4d27-4e9e-8655-01b821d74833\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:45.050235 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.050215 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d91998a7-1fa5-477b-8779-9c8df24c3680-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:45.053432 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.053374 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvmjf\" (UniqueName: \"kubernetes.io/projected/d91998a7-1fa5-477b-8779-9c8df24c3680-kube-api-access-pvmjf\") 
pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:45.054214 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.054185 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc2vn\" (UniqueName: \"kubernetes.io/projected/d7ac4c50-4d27-4e9e-8655-01b821d74833-kube-api-access-nc2vn\") pod \"kube-state-metrics-69db897b98-dsspm\" (UID: \"d7ac4c50-4d27-4e9e-8655-01b821d74833\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:45.549262 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.549169 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d7ac4c50-4d27-4e9e-8655-01b821d74833-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-dsspm\" (UID: \"d7ac4c50-4d27-4e9e-8655-01b821d74833\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:45.549262 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.549245 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d91998a7-1fa5-477b-8779-9c8df24c3680-node-exporter-tls\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:45.551545 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.551515 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d91998a7-1fa5-477b-8779-9c8df24c3680-node-exporter-tls\") pod \"node-exporter-fmjz2\" (UID: \"d91998a7-1fa5-477b-8779-9c8df24c3680\") " pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:45.551661 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.551585 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/d7ac4c50-4d27-4e9e-8655-01b821d74833-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-dsspm\" (UID: \"d7ac4c50-4d27-4e9e-8655-01b821d74833\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:45.770808 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.770776 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" Apr 16 22:07:45.780943 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.780921 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-fmjz2" Apr 16 22:07:45.789651 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:07:45.789622 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd91998a7_1fa5_477b_8779_9c8df24c3680.slice/crio-4b37cdf0b38b51882a86d327ab0e0436a058674466a3d60cc1ec8bbce53697ec WatchSource:0}: Error finding container 4b37cdf0b38b51882a86d327ab0e0436a058674466a3d60cc1ec8bbce53697ec: Status 404 returned error can't find the container with id 4b37cdf0b38b51882a86d327ab0e0436a058674466a3d60cc1ec8bbce53697ec Apr 16 22:07:45.824015 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.823977 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6dv5d" event={"ID":"6d61274e-1ceb-496e-8a75-17916b110ed5","Type":"ContainerStarted","Data":"ade69206468d4d4c571a4f2eb743313bcc12bb5b01ab331026e11b0e1a04c381"} Apr 16 22:07:45.825988 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.825937 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fmjz2" event={"ID":"d91998a7-1fa5-477b-8779-9c8df24c3680","Type":"ContainerStarted","Data":"4b37cdf0b38b51882a86d327ab0e0436a058674466a3d60cc1ec8bbce53697ec"} Apr 16 22:07:45.843248 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.843208 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6dv5d" podStartSLOduration=135.960510656 podStartE2EDuration="2m17.843194853s" podCreationTimestamp="2026-04-16 22:05:28 +0000 UTC" firstStartedPulling="2026-04-16 22:07:43.404941886 +0000 UTC m=+167.665667968" lastFinishedPulling="2026-04-16 22:07:45.287626079 +0000 UTC m=+169.548352165" observedRunningTime="2026-04-16 22:07:45.842009617 +0000 UTC m=+170.102735723" watchObservedRunningTime="2026-04-16 22:07:45.843194853 +0000 UTC m=+170.103920957" Apr 16 22:07:45.899624 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:45.899595 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-dsspm"] Apr 16 22:07:45.903014 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:07:45.902974 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7ac4c50_4d27_4e9e_8655_01b821d74833.slice/crio-1a04a8af6ac646f87bb7bd92ba024f79a54d61d9ccb076a0e78d7d751551f761 WatchSource:0}: Error finding container 1a04a8af6ac646f87bb7bd92ba024f79a54d61d9ccb076a0e78d7d751551f761: Status 404 returned error can't find the container with id 1a04a8af6ac646f87bb7bd92ba024f79a54d61d9ccb076a0e78d7d751551f761 Apr 16 22:07:46.280720 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:46.280681 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs" Apr 16 22:07:46.830170 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:46.830133 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" event={"ID":"d7ac4c50-4d27-4e9e-8655-01b821d74833","Type":"ContainerStarted","Data":"1a04a8af6ac646f87bb7bd92ba024f79a54d61d9ccb076a0e78d7d751551f761"} Apr 16 22:07:47.836902 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:47.836872 2576 generic.go:358] "Generic (PLEG): container finished" podID="d91998a7-1fa5-477b-8779-9c8df24c3680" containerID="cc2cd268e7addbb12a5e26d41ac84579deed3ba3e8640d0bd4cdedab76122cfc" exitCode=0 Apr 16 22:07:47.837354 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:47.836971 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fmjz2" event={"ID":"d91998a7-1fa5-477b-8779-9c8df24c3680","Type":"ContainerDied","Data":"cc2cd268e7addbb12a5e26d41ac84579deed3ba3e8640d0bd4cdedab76122cfc"} Apr 16 22:07:47.838915 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:47.838873 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" event={"ID":"d7ac4c50-4d27-4e9e-8655-01b821d74833","Type":"ContainerStarted","Data":"4439de6c2d4a0b8580b1206fa90545a48b301eb43266c7f1a91fee0b171bb047"} Apr 16 22:07:47.838915 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:47.838908 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" event={"ID":"d7ac4c50-4d27-4e9e-8655-01b821d74833","Type":"ContainerStarted","Data":"444ec17c66069cef0f1684ce5ce6e673efdda74eeecfd376e8227b502046349f"} Apr 16 22:07:47.839073 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:47.838922 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" 
event={"ID":"d7ac4c50-4d27-4e9e-8655-01b821d74833","Type":"ContainerStarted","Data":"ab77897c6c36a20002cb3d9f044b6c4436549af7e366e7b56d608a82fb3e8200"} Apr 16 22:07:47.867574 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:47.867535 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-dsspm" podStartSLOduration=2.845223953 podStartE2EDuration="3.867522665s" podCreationTimestamp="2026-04-16 22:07:44 +0000 UTC" firstStartedPulling="2026-04-16 22:07:45.905083504 +0000 UTC m=+170.165809586" lastFinishedPulling="2026-04-16 22:07:46.927382215 +0000 UTC m=+171.188108298" observedRunningTime="2026-04-16 22:07:47.865970284 +0000 UTC m=+172.126696389" watchObservedRunningTime="2026-04-16 22:07:47.867522665 +0000 UTC m=+172.128248770" Apr 16 22:07:48.277956 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:48.277870 2576 scope.go:117] "RemoveContainer" containerID="4a0ec061a10c7079915ae41f63e3793ffd88c7eee02b036053d3adc0c419273a" Apr 16 22:07:48.843222 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:48.843183 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fmjz2" event={"ID":"d91998a7-1fa5-477b-8779-9c8df24c3680","Type":"ContainerStarted","Data":"948ded519ab97a984bce9a2b5602647eac063879f4002e15144913a39e5c2f7f"} Apr 16 22:07:48.843222 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:48.843225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-fmjz2" event={"ID":"d91998a7-1fa5-477b-8779-9c8df24c3680","Type":"ContainerStarted","Data":"d0951d8e471eb80be1efb4f83fd31612070e5c6cbfd88022424735fb083337ef"} Apr 16 22:07:48.844693 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:48.844676 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/2.log" Apr 16 22:07:48.844806 ip-10-0-130-26 
kubenswrapper[2576]: I0416 22:07:48.844787 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" event={"ID":"ad6a094a-d7bf-42af-90ae-94731039404b","Type":"ContainerStarted","Data":"39a8c2a29a7a8182d8f496f41bba2437a4d4ad714f0e75a03a47aea47e564969"} Apr 16 22:07:48.845194 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:48.845171 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" Apr 16 22:07:48.859968 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:48.859920 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-fmjz2" podStartSLOduration=3.765012605 podStartE2EDuration="4.859908954s" podCreationTimestamp="2026-04-16 22:07:44 +0000 UTC" firstStartedPulling="2026-04-16 22:07:45.791587279 +0000 UTC m=+170.052313366" lastFinishedPulling="2026-04-16 22:07:46.886483628 +0000 UTC m=+171.147209715" observedRunningTime="2026-04-16 22:07:48.859409367 +0000 UTC m=+173.120135498" watchObservedRunningTime="2026-04-16 22:07:48.859908954 +0000 UTC m=+173.120635059" Apr 16 22:07:48.873749 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:48.873714 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" podStartSLOduration=43.160695825 podStartE2EDuration="44.873701151s" podCreationTimestamp="2026-04-16 22:07:04 +0000 UTC" firstStartedPulling="2026-04-16 22:07:04.702476642 +0000 UTC m=+128.963202740" lastFinishedPulling="2026-04-16 22:07:06.415481969 +0000 UTC m=+130.676208066" observedRunningTime="2026-04-16 22:07:48.87271241 +0000 UTC m=+173.133438539" watchObservedRunningTime="2026-04-16 22:07:48.873701151 +0000 UTC m=+173.134427255" Apr 16 22:07:49.348663 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:49.348633 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console-operator/console-operator-9d4b6777b-fcfwt" Apr 16 22:07:49.628961 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:49.628880 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-zxhlm"] Apr 16 22:07:49.632713 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:49.632692 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxhlm" Apr 16 22:07:49.634695 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:49.634675 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-l9d88\"" Apr 16 22:07:49.634779 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:49.634675 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 22:07:49.637660 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:49.637638 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-zxhlm"] Apr 16 22:07:49.687290 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:49.687262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/47b14de1-d053-4a3b-9584-2febe7614435-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-zxhlm\" (UID: \"47b14de1-d053-4a3b-9584-2febe7614435\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxhlm" Apr 16 22:07:49.788187 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:49.788159 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/47b14de1-d053-4a3b-9584-2febe7614435-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-zxhlm\" (UID: \"47b14de1-d053-4a3b-9584-2febe7614435\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxhlm" Apr 16 22:07:49.790589 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:49.790567 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/47b14de1-d053-4a3b-9584-2febe7614435-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-zxhlm\" (UID: \"47b14de1-d053-4a3b-9584-2febe7614435\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxhlm" Apr 16 22:07:49.802952 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:49.802925 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-xdjs7" Apr 16 22:07:49.944143 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:49.944059 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxhlm" Apr 16 22:07:50.063128 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:50.063052 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-zxhlm"] Apr 16 22:07:50.067097 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:07:50.067055 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47b14de1_d053_4a3b_9584_2febe7614435.slice/crio-59514a3073ba3db5c3fc5a39f250a567b82048e3257a1d7c8dbcb265b863d515 WatchSource:0}: Error finding container 59514a3073ba3db5c3fc5a39f250a567b82048e3257a1d7c8dbcb265b863d515: Status 404 returned error can't find the container with id 59514a3073ba3db5c3fc5a39f250a567b82048e3257a1d7c8dbcb265b863d515 Apr 16 22:07:50.852645 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:50.852598 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxhlm" 
event={"ID":"47b14de1-d053-4a3b-9584-2febe7614435","Type":"ContainerStarted","Data":"59514a3073ba3db5c3fc5a39f250a567b82048e3257a1d7c8dbcb265b863d515"} Apr 16 22:07:51.861119 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:51.861082 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxhlm" event={"ID":"47b14de1-d053-4a3b-9584-2febe7614435","Type":"ContainerStarted","Data":"1ef494fd251e11fcd2b64d13429cc63228556c1c8c08eaf14547b501b169e4d0"} Apr 16 22:07:51.861520 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:51.861287 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxhlm" Apr 16 22:07:51.866529 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:51.866510 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxhlm" Apr 16 22:07:51.875106 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:51.875069 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-zxhlm" podStartSLOduration=1.640647472 podStartE2EDuration="2.875056747s" podCreationTimestamp="2026-04-16 22:07:49 +0000 UTC" firstStartedPulling="2026-04-16 22:07:50.069354676 +0000 UTC m=+174.330080759" lastFinishedPulling="2026-04-16 22:07:51.303763947 +0000 UTC m=+175.564490034" observedRunningTime="2026-04-16 22:07:51.874103886 +0000 UTC m=+176.134829991" watchObservedRunningTime="2026-04-16 22:07:51.875056747 +0000 UTC m=+176.135782851" Apr 16 22:07:58.796397 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:58.796369 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7958cfb6f-htkcm" Apr 16 22:07:58.859000 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:58.858972 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6cb6d5dc9f-8k6qv"] 
Apr 16 22:07:58.863690 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:58.863667 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:58.865955 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:58.865680 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 22:07:58.865955 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:58.865689 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 22:07:58.865955 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:58.865825 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 22:07:58.865955 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:58.865839 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-np89x\"" Apr 16 22:07:58.865955 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:58.865878 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 22:07:58.865955 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:58.865941 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 22:07:58.866249 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:58.866124 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 22:07:58.866249 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:58.866153 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 22:07:58.869478 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:58.869459 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-6cb6d5dc9f-8k6qv"] Apr 16 22:07:58.961523 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:58.961493 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f0f7343-a628-49f2-a633-7df39d5ba0c8-console-oauth-config\") pod \"console-6cb6d5dc9f-8k6qv\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:58.961664 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:58.961536 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f0f7343-a628-49f2-a633-7df39d5ba0c8-oauth-serving-cert\") pod \"console-6cb6d5dc9f-8k6qv\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:58.961664 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:58.961566 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f0f7343-a628-49f2-a633-7df39d5ba0c8-console-serving-cert\") pod \"console-6cb6d5dc9f-8k6qv\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:58.961664 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:58.961586 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpspd\" (UniqueName: \"kubernetes.io/projected/8f0f7343-a628-49f2-a633-7df39d5ba0c8-kube-api-access-mpspd\") pod \"console-6cb6d5dc9f-8k6qv\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:58.961664 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:58.961621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/8f0f7343-a628-49f2-a633-7df39d5ba0c8-service-ca\") pod \"console-6cb6d5dc9f-8k6qv\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:58.961664 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:58.961646 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f0f7343-a628-49f2-a633-7df39d5ba0c8-console-config\") pod \"console-6cb6d5dc9f-8k6qv\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:59.063116 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:59.063037 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f0f7343-a628-49f2-a633-7df39d5ba0c8-console-oauth-config\") pod \"console-6cb6d5dc9f-8k6qv\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:59.063116 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:59.063087 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f0f7343-a628-49f2-a633-7df39d5ba0c8-oauth-serving-cert\") pod \"console-6cb6d5dc9f-8k6qv\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:59.063116 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:59.063118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f0f7343-a628-49f2-a633-7df39d5ba0c8-console-serving-cert\") pod \"console-6cb6d5dc9f-8k6qv\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:59.063406 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:59.063137 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpspd\" (UniqueName: \"kubernetes.io/projected/8f0f7343-a628-49f2-a633-7df39d5ba0c8-kube-api-access-mpspd\") pod \"console-6cb6d5dc9f-8k6qv\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:59.063406 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:59.063155 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f0f7343-a628-49f2-a633-7df39d5ba0c8-service-ca\") pod \"console-6cb6d5dc9f-8k6qv\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:59.063406 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:59.063197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f0f7343-a628-49f2-a633-7df39d5ba0c8-console-config\") pod \"console-6cb6d5dc9f-8k6qv\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:59.063895 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:59.063872 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f0f7343-a628-49f2-a633-7df39d5ba0c8-service-ca\") pod \"console-6cb6d5dc9f-8k6qv\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:59.064035 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:59.064014 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f0f7343-a628-49f2-a633-7df39d5ba0c8-oauth-serving-cert\") pod \"console-6cb6d5dc9f-8k6qv\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:59.064079 ip-10-0-130-26 kubenswrapper[2576]: 
I0416 22:07:59.064014 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f0f7343-a628-49f2-a633-7df39d5ba0c8-console-config\") pod \"console-6cb6d5dc9f-8k6qv\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:59.065620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:59.065591 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f0f7343-a628-49f2-a633-7df39d5ba0c8-console-oauth-config\") pod \"console-6cb6d5dc9f-8k6qv\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:59.065914 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:59.065898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f0f7343-a628-49f2-a633-7df39d5ba0c8-console-serving-cert\") pod \"console-6cb6d5dc9f-8k6qv\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:59.070690 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:59.070667 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpspd\" (UniqueName: \"kubernetes.io/projected/8f0f7343-a628-49f2-a633-7df39d5ba0c8-kube-api-access-mpspd\") pod \"console-6cb6d5dc9f-8k6qv\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:59.174102 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:59.174066 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:07:59.309344 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:59.309290 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cb6d5dc9f-8k6qv"] Apr 16 22:07:59.313181 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:07:59.313116 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f0f7343_a628_49f2_a633_7df39d5ba0c8.slice/crio-ecf3ab8d7cf827541e023ee3d3c3cac030239a2872053c0a8a3dd231b2919727 WatchSource:0}: Error finding container ecf3ab8d7cf827541e023ee3d3c3cac030239a2872053c0a8a3dd231b2919727: Status 404 returned error can't find the container with id ecf3ab8d7cf827541e023ee3d3c3cac030239a2872053c0a8a3dd231b2919727 Apr 16 22:07:59.882103 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:07:59.882065 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cb6d5dc9f-8k6qv" event={"ID":"8f0f7343-a628-49f2-a633-7df39d5ba0c8","Type":"ContainerStarted","Data":"ecf3ab8d7cf827541e023ee3d3c3cac030239a2872053c0a8a3dd231b2919727"} Apr 16 22:08:01.231933 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.231897 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-699c78c78f-xz9gl"] Apr 16 22:08:01.235466 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.235438 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.242833 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.242802 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 22:08:01.243589 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.243566 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-699c78c78f-xz9gl"] Apr 16 22:08:01.280472 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.280444 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-trusted-ca-bundle\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.280628 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.280476 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2m5h\" (UniqueName: \"kubernetes.io/projected/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-kube-api-access-n2m5h\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.280628 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.280541 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-service-ca\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.280628 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.280596 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-console-oauth-config\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.280628 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.280615 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-console-config\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.280770 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.280644 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-oauth-serving-cert\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.280770 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.280700 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-console-serving-cert\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.381277 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.381243 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-trusted-ca-bundle\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.381468 ip-10-0-130-26 kubenswrapper[2576]: I0416 
22:08:01.381291 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2m5h\" (UniqueName: \"kubernetes.io/projected/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-kube-api-access-n2m5h\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.381468 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.381362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-service-ca\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.381468 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.381404 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-console-oauth-config\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.381468 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.381427 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-console-config\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.381468 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.381463 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-oauth-serving-cert\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 
22:08:01.381729 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.381553 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-console-serving-cert\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.382261 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.382233 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-trusted-ca-bundle\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.382474 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.382230 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-service-ca\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.382474 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.382357 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-console-config\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.382474 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.382365 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-oauth-serving-cert\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " 
pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.384532 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.384475 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-console-oauth-config\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.384663 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.384624 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-console-serving-cert\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.388992 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.388966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2m5h\" (UniqueName: \"kubernetes.io/projected/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-kube-api-access-n2m5h\") pod \"console-699c78c78f-xz9gl\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:01.569854 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:01.569777 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:02.002060 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:02.002015 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-699c78c78f-xz9gl"] Apr 16 22:08:02.004396 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:08:02.004371 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62fc57e1_f7ef_49a3_a51e_4f1ad1e624fb.slice/crio-58d1a6adb99c6bc15e7cfbbbcff41650f52a98afe3dcc16a95c1df7dfb9b1df8 WatchSource:0}: Error finding container 58d1a6adb99c6bc15e7cfbbbcff41650f52a98afe3dcc16a95c1df7dfb9b1df8: Status 404 returned error can't find the container with id 58d1a6adb99c6bc15e7cfbbbcff41650f52a98afe3dcc16a95c1df7dfb9b1df8 Apr 16 22:08:02.891137 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:02.891098 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cb6d5dc9f-8k6qv" event={"ID":"8f0f7343-a628-49f2-a633-7df39d5ba0c8","Type":"ContainerStarted","Data":"ac0e29e9fe6a7a90d2f73ff8435f0d941e39f77dc919e2674db6a98d009db933"} Apr 16 22:08:02.892542 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:02.892513 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699c78c78f-xz9gl" event={"ID":"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb","Type":"ContainerStarted","Data":"a7154748887300aea9d5dfb0b61a16b962c9b166d4c92d3066ae53bdd76c9510"} Apr 16 22:08:02.892652 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:02.892549 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699c78c78f-xz9gl" event={"ID":"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb","Type":"ContainerStarted","Data":"58d1a6adb99c6bc15e7cfbbbcff41650f52a98afe3dcc16a95c1df7dfb9b1df8"} Apr 16 22:08:02.907600 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:02.907561 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-6cb6d5dc9f-8k6qv" podStartSLOduration=2.278848678 podStartE2EDuration="4.907547427s" podCreationTimestamp="2026-04-16 22:07:58 +0000 UTC" firstStartedPulling="2026-04-16 22:07:59.315690427 +0000 UTC m=+183.576416526" lastFinishedPulling="2026-04-16 22:08:01.944389189 +0000 UTC m=+186.205115275" observedRunningTime="2026-04-16 22:08:02.906601589 +0000 UTC m=+187.167327693" watchObservedRunningTime="2026-04-16 22:08:02.907547427 +0000 UTC m=+187.168273531" Apr 16 22:08:02.920374 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:02.920329 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-699c78c78f-xz9gl" podStartSLOduration=1.9202935810000001 podStartE2EDuration="1.920293581s" podCreationTimestamp="2026-04-16 22:08:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:08:02.920269078 +0000 UTC m=+187.180995180" watchObservedRunningTime="2026-04-16 22:08:02.920293581 +0000 UTC m=+187.181019686" Apr 16 22:08:09.175241 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:09.175200 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:08:09.175651 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:09.175257 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:08:09.180031 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:09.180011 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:08:09.915528 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:09.915502 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:08:11.570224 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:11.570192 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:11.570224 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:11.570231 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:11.575906 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:11.575882 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:11.920423 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:11.920399 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:08:11.963642 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:11.963615 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cb6d5dc9f-8k6qv"] Apr 16 22:08:14.926275 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:14.926238 2576 generic.go:358] "Generic (PLEG): container finished" podID="9b8d4070-576e-4533-a43c-28e13d203ec1" containerID="4abce5932b7163943a04a934b77b7fe25ad8b635a272f8fbc381a16d50e0e67f" exitCode=0 Apr 16 22:08:14.926668 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:14.926326 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-r49p7" event={"ID":"9b8d4070-576e-4533-a43c-28e13d203ec1","Type":"ContainerDied","Data":"4abce5932b7163943a04a934b77b7fe25ad8b635a272f8fbc381a16d50e0e67f"} Apr 16 22:08:14.926715 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:14.926675 2576 scope.go:117] "RemoveContainer" containerID="4abce5932b7163943a04a934b77b7fe25ad8b635a272f8fbc381a16d50e0e67f" Apr 16 22:08:15.930821 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:15.930788 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-r49p7" 
event={"ID":"9b8d4070-576e-4533-a43c-28e13d203ec1","Type":"ContainerStarted","Data":"b1cda6e96e61832a3d64d72929e46f0f65e756d79990aaa50cdd9bc838db35ab"} Apr 16 22:08:18.056205 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:18.056173 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-dsspm_d7ac4c50-4d27-4e9e-8655-01b821d74833/kube-state-metrics/0.log" Apr 16 22:08:18.255998 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:18.255974 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-dsspm_d7ac4c50-4d27-4e9e-8655-01b821d74833/kube-rbac-proxy-main/0.log" Apr 16 22:08:18.455330 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:18.455296 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-dsspm_d7ac4c50-4d27-4e9e-8655-01b821d74833/kube-rbac-proxy-self/0.log" Apr 16 22:08:18.855341 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:18.855295 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-zxhlm_47b14de1-d053-4a3b-9584-2febe7614435/monitoring-plugin/0.log" Apr 16 22:08:20.255216 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:20.255189 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fmjz2_d91998a7-1fa5-477b-8779-9c8df24c3680/init-textfile/0.log" Apr 16 22:08:20.456192 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:20.456158 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fmjz2_d91998a7-1fa5-477b-8779-9c8df24c3680/node-exporter/0.log" Apr 16 22:08:20.655465 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:20.655438 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fmjz2_d91998a7-1fa5-477b-8779-9c8df24c3680/kube-rbac-proxy/0.log" Apr 16 22:08:22.857337 ip-10-0-130-26 
kubenswrapper[2576]: I0416 22:08:22.857293 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-t8f88_54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3/prometheus-operator/0.log" Apr 16 22:08:23.054859 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:23.054813 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-t8f88_54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3/kube-rbac-proxy/0.log" Apr 16 22:08:24.855768 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:24.855736 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/2.log" Apr 16 22:08:25.058712 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:25.058679 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/3.log" Apr 16 22:08:25.256060 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:25.255982 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-699c78c78f-xz9gl_62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb/console/0.log" Apr 16 22:08:25.455269 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:25.455233 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cb6d5dc9f-8k6qv_8f0f7343-a628-49f2-a633-7df39d5ba0c8/console/0.log" Apr 16 22:08:25.856223 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:25.856197 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-567b8b8c4d-864ph_51846908-299c-4e1f-b417-c0af1029a45f/router/0.log" Apr 16 22:08:26.255247 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:26.255165 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-6dv5d_6d61274e-1ceb-496e-8a75-17916b110ed5/serve-healthcheck-canary/0.log" Apr 16 22:08:35.984975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:35.984938 2576 generic.go:358] "Generic (PLEG): container finished" podID="69e30ac7-c0e6-4cd4-ab1f-3df7d3277790" containerID="19cbbdfafa9ab6401e579031991fe46c9cbf625034734844c1a888e8caf0fd5b" exitCode=0 Apr 16 22:08:35.985386 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:35.985013 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lxr9z" event={"ID":"69e30ac7-c0e6-4cd4-ab1f-3df7d3277790","Type":"ContainerDied","Data":"19cbbdfafa9ab6401e579031991fe46c9cbf625034734844c1a888e8caf0fd5b"} Apr 16 22:08:35.985386 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:35.985339 2576 scope.go:117] "RemoveContainer" containerID="19cbbdfafa9ab6401e579031991fe46c9cbf625034734844c1a888e8caf0fd5b" Apr 16 22:08:36.983206 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:36.983151 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6cb6d5dc9f-8k6qv" podUID="8f0f7343-a628-49f2-a633-7df39d5ba0c8" containerName="console" containerID="cri-o://ac0e29e9fe6a7a90d2f73ff8435f0d941e39f77dc919e2674db6a98d009db933" gracePeriod=15 Apr 16 22:08:36.990544 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:36.990516 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-lxr9z" event={"ID":"69e30ac7-c0e6-4cd4-ab1f-3df7d3277790","Type":"ContainerStarted","Data":"0df5924ea01bfee5db4b7b25529386038fa4ffdeca5295cc5d478ed4c609d1ec"} Apr 16 22:08:37.233768 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.233714 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-6cb6d5dc9f-8k6qv_8f0f7343-a628-49f2-a633-7df39d5ba0c8/console/0.log" Apr 16 22:08:37.233870 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.233785 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:08:37.375238 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.375201 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f0f7343-a628-49f2-a633-7df39d5ba0c8-console-serving-cert\") pod \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " Apr 16 22:08:37.375406 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.375261 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f0f7343-a628-49f2-a633-7df39d5ba0c8-console-config\") pod \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " Apr 16 22:08:37.375406 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.375297 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f0f7343-a628-49f2-a633-7df39d5ba0c8-oauth-serving-cert\") pod \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " Apr 16 22:08:37.375406 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.375367 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f0f7343-a628-49f2-a633-7df39d5ba0c8-console-oauth-config\") pod \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " Apr 16 22:08:37.375406 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.375392 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-mpspd\" (UniqueName: \"kubernetes.io/projected/8f0f7343-a628-49f2-a633-7df39d5ba0c8-kube-api-access-mpspd\") pod \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " Apr 16 22:08:37.375633 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.375428 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f0f7343-a628-49f2-a633-7df39d5ba0c8-service-ca\") pod \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\" (UID: \"8f0f7343-a628-49f2-a633-7df39d5ba0c8\") " Apr 16 22:08:37.375797 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.375764 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f0f7343-a628-49f2-a633-7df39d5ba0c8-console-config" (OuterVolumeSpecName: "console-config") pod "8f0f7343-a628-49f2-a633-7df39d5ba0c8" (UID: "8f0f7343-a628-49f2-a633-7df39d5ba0c8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:08:37.375921 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.375865 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f0f7343-a628-49f2-a633-7df39d5ba0c8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8f0f7343-a628-49f2-a633-7df39d5ba0c8" (UID: "8f0f7343-a628-49f2-a633-7df39d5ba0c8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:08:37.375921 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.375876 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f0f7343-a628-49f2-a633-7df39d5ba0c8-service-ca" (OuterVolumeSpecName: "service-ca") pod "8f0f7343-a628-49f2-a633-7df39d5ba0c8" (UID: "8f0f7343-a628-49f2-a633-7df39d5ba0c8"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:08:37.377674 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.377654 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f0f7343-a628-49f2-a633-7df39d5ba0c8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8f0f7343-a628-49f2-a633-7df39d5ba0c8" (UID: "8f0f7343-a628-49f2-a633-7df39d5ba0c8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:08:37.377753 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.377679 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f0f7343-a628-49f2-a633-7df39d5ba0c8-kube-api-access-mpspd" (OuterVolumeSpecName: "kube-api-access-mpspd") pod "8f0f7343-a628-49f2-a633-7df39d5ba0c8" (UID: "8f0f7343-a628-49f2-a633-7df39d5ba0c8"). InnerVolumeSpecName "kube-api-access-mpspd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:08:37.377753 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.377683 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f0f7343-a628-49f2-a633-7df39d5ba0c8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8f0f7343-a628-49f2-a633-7df39d5ba0c8" (UID: "8f0f7343-a628-49f2-a633-7df39d5ba0c8"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:08:37.476379 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.476337 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f0f7343-a628-49f2-a633-7df39d5ba0c8-console-oauth-config\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:08:37.476379 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.476366 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mpspd\" (UniqueName: \"kubernetes.io/projected/8f0f7343-a628-49f2-a633-7df39d5ba0c8-kube-api-access-mpspd\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:08:37.476379 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.476385 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f0f7343-a628-49f2-a633-7df39d5ba0c8-service-ca\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:08:37.476603 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.476395 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f0f7343-a628-49f2-a633-7df39d5ba0c8-console-serving-cert\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:08:37.476603 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.476405 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f0f7343-a628-49f2-a633-7df39d5ba0c8-console-config\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:08:37.476603 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.476413 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f0f7343-a628-49f2-a633-7df39d5ba0c8-oauth-serving-cert\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:08:37.994489 ip-10-0-130-26 
kubenswrapper[2576]: I0416 22:08:37.994461 2576 generic.go:358] "Generic (PLEG): container finished" podID="a08162c7-e360-41a6-85ba-6c0fffcd0001" containerID="dfe41e0eccec717c1782831890cf222f6e77474ff36e65e6348a0ac18d094e8d" exitCode=0 Apr 16 22:08:37.994826 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.994527 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4cml" event={"ID":"a08162c7-e360-41a6-85ba-6c0fffcd0001","Type":"ContainerDied","Data":"dfe41e0eccec717c1782831890cf222f6e77474ff36e65e6348a0ac18d094e8d"} Apr 16 22:08:37.994915 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.994900 2576 scope.go:117] "RemoveContainer" containerID="dfe41e0eccec717c1782831890cf222f6e77474ff36e65e6348a0ac18d094e8d" Apr 16 22:08:37.995771 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.995756 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cb6d5dc9f-8k6qv_8f0f7343-a628-49f2-a633-7df39d5ba0c8/console/0.log" Apr 16 22:08:37.995846 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.995788 2576 generic.go:358] "Generic (PLEG): container finished" podID="8f0f7343-a628-49f2-a633-7df39d5ba0c8" containerID="ac0e29e9fe6a7a90d2f73ff8435f0d941e39f77dc919e2674db6a98d009db933" exitCode=2 Apr 16 22:08:37.995846 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.995815 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cb6d5dc9f-8k6qv" event={"ID":"8f0f7343-a628-49f2-a633-7df39d5ba0c8","Type":"ContainerDied","Data":"ac0e29e9fe6a7a90d2f73ff8435f0d941e39f77dc919e2674db6a98d009db933"} Apr 16 22:08:37.995846 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.995832 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cb6d5dc9f-8k6qv" event={"ID":"8f0f7343-a628-49f2-a633-7df39d5ba0c8","Type":"ContainerDied","Data":"ecf3ab8d7cf827541e023ee3d3c3cac030239a2872053c0a8a3dd231b2919727"} Apr 16 22:08:37.995846 
ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.995845 2576 scope.go:117] "RemoveContainer" containerID="ac0e29e9fe6a7a90d2f73ff8435f0d941e39f77dc919e2674db6a98d009db933" Apr 16 22:08:37.995982 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:37.995847 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cb6d5dc9f-8k6qv" Apr 16 22:08:38.006349 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:38.006326 2576 scope.go:117] "RemoveContainer" containerID="ac0e29e9fe6a7a90d2f73ff8435f0d941e39f77dc919e2674db6a98d009db933" Apr 16 22:08:38.006628 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:08:38.006599 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac0e29e9fe6a7a90d2f73ff8435f0d941e39f77dc919e2674db6a98d009db933\": container with ID starting with ac0e29e9fe6a7a90d2f73ff8435f0d941e39f77dc919e2674db6a98d009db933 not found: ID does not exist" containerID="ac0e29e9fe6a7a90d2f73ff8435f0d941e39f77dc919e2674db6a98d009db933" Apr 16 22:08:38.006700 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:38.006636 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac0e29e9fe6a7a90d2f73ff8435f0d941e39f77dc919e2674db6a98d009db933"} err="failed to get container status \"ac0e29e9fe6a7a90d2f73ff8435f0d941e39f77dc919e2674db6a98d009db933\": rpc error: code = NotFound desc = could not find container \"ac0e29e9fe6a7a90d2f73ff8435f0d941e39f77dc919e2674db6a98d009db933\": container with ID starting with ac0e29e9fe6a7a90d2f73ff8435f0d941e39f77dc919e2674db6a98d009db933 not found: ID does not exist" Apr 16 22:08:38.022422 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:38.022400 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cb6d5dc9f-8k6qv"] Apr 16 22:08:38.026065 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:38.026046 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-6cb6d5dc9f-8k6qv"] Apr 16 22:08:38.281885 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:38.281804 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f0f7343-a628-49f2-a633-7df39d5ba0c8" path="/var/lib/kubelet/pods/8f0f7343-a628-49f2-a633-7df39d5ba0c8/volumes" Apr 16 22:08:39.001035 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:08:39.001007 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-r4cml" event={"ID":"a08162c7-e360-41a6-85ba-6c0fffcd0001","Type":"ContainerStarted","Data":"f6353edcbfa2e1a562f94401a50daadef5aeac72fe285397ab67a63ff1ac4d7f"} Apr 16 22:09:06.997922 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:06.997885 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs\") pod \"network-metrics-daemon-nrljs\" (UID: \"2f7c8d95-90cb-497a-8866-d2c45b825b72\") " pod="openshift-multus/network-metrics-daemon-nrljs" Apr 16 22:09:07.000219 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:07.000195 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2f7c8d95-90cb-497a-8866-d2c45b825b72-metrics-certs\") pod \"network-metrics-daemon-nrljs\" (UID: \"2f7c8d95-90cb-497a-8866-d2c45b825b72\") " pod="openshift-multus/network-metrics-daemon-nrljs" Apr 16 22:09:07.283718 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:07.283632 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vrzhk\"" Apr 16 22:09:07.292219 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:07.292197 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nrljs" Apr 16 22:09:07.410412 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:07.410382 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nrljs"] Apr 16 22:09:07.413373 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:09:07.413345 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f7c8d95_90cb_497a_8866_d2c45b825b72.slice/crio-f971d75a6825fb81f6970d0ee2b93343f2a8145c5a3ca631c09615fa01d5765c WatchSource:0}: Error finding container f971d75a6825fb81f6970d0ee2b93343f2a8145c5a3ca631c09615fa01d5765c: Status 404 returned error can't find the container with id f971d75a6825fb81f6970d0ee2b93343f2a8145c5a3ca631c09615fa01d5765c Apr 16 22:09:08.084063 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:08.084022 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nrljs" event={"ID":"2f7c8d95-90cb-497a-8866-d2c45b825b72","Type":"ContainerStarted","Data":"f971d75a6825fb81f6970d0ee2b93343f2a8145c5a3ca631c09615fa01d5765c"} Apr 16 22:09:09.089348 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:09.089298 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nrljs" event={"ID":"2f7c8d95-90cb-497a-8866-d2c45b825b72","Type":"ContainerStarted","Data":"55cb4bfc6b127975dd647cbd13287671a52e76abcfe67d4709a00029dc30e314"} Apr 16 22:09:09.089348 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:09.089348 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nrljs" event={"ID":"2f7c8d95-90cb-497a-8866-d2c45b825b72","Type":"ContainerStarted","Data":"976dec541fea18088d031ad32448d60dda44779a3aa9c4ca2fc1df4182e1687f"} Apr 16 22:09:09.106174 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:09.105781 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-nrljs" podStartSLOduration=252.091530963 podStartE2EDuration="4m13.105741764s" podCreationTimestamp="2026-04-16 22:04:56 +0000 UTC" firstStartedPulling="2026-04-16 22:09:07.415362519 +0000 UTC m=+251.676088620" lastFinishedPulling="2026-04-16 22:09:08.429573337 +0000 UTC m=+252.690299421" observedRunningTime="2026-04-16 22:09:09.105749177 +0000 UTC m=+253.366475306" watchObservedRunningTime="2026-04-16 22:09:09.105741764 +0000 UTC m=+253.366467867" Apr 16 22:09:12.441959 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.441927 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6ff74786d-xxm2t"] Apr 16 22:09:12.442327 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.442223 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f0f7343-a628-49f2-a633-7df39d5ba0c8" containerName="console" Apr 16 22:09:12.442327 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.442237 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f0f7343-a628-49f2-a633-7df39d5ba0c8" containerName="console" Apr 16 22:09:12.442327 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.442285 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f0f7343-a628-49f2-a633-7df39d5ba0c8" containerName="console" Apr 16 22:09:12.445472 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.445449 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.453812 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.453787 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ff74786d-xxm2t"] Apr 16 22:09:12.541212 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.541171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-oauth-serving-cert\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.541212 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.541213 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95b77ed3-de0b-4903-a989-e27d60d37342-console-oauth-config\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.541422 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.541244 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-console-config\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.541422 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.541330 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-trusted-ca-bundle\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.541422 
ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.541372 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-service-ca\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.541422 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.541411 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95b77ed3-de0b-4903-a989-e27d60d37342-console-serving-cert\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.541542 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.541435 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xkd6\" (UniqueName: \"kubernetes.io/projected/95b77ed3-de0b-4903-a989-e27d60d37342-kube-api-access-5xkd6\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.642575 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.642540 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-console-config\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.642728 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.642583 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-trusted-ca-bundle\") pod \"console-6ff74786d-xxm2t\" (UID: 
\"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.642728 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.642619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-service-ca\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.642728 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.642646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95b77ed3-de0b-4903-a989-e27d60d37342-console-serving-cert\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.642728 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.642680 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xkd6\" (UniqueName: \"kubernetes.io/projected/95b77ed3-de0b-4903-a989-e27d60d37342-kube-api-access-5xkd6\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.642728 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.642722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-oauth-serving-cert\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.642981 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.642760 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/95b77ed3-de0b-4903-a989-e27d60d37342-console-oauth-config\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.643381 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.643354 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-console-config\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.643518 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.643403 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-service-ca\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.643518 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.643484 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-oauth-serving-cert\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.643657 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.643599 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-trusted-ca-bundle\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.645581 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.645562 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/95b77ed3-de0b-4903-a989-e27d60d37342-console-serving-cert\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.645685 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.645667 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95b77ed3-de0b-4903-a989-e27d60d37342-console-oauth-config\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.650568 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.650542 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xkd6\" (UniqueName: \"kubernetes.io/projected/95b77ed3-de0b-4903-a989-e27d60d37342-kube-api-access-5xkd6\") pod \"console-6ff74786d-xxm2t\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.756942 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.756888 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:12.880813 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:12.880781 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6ff74786d-xxm2t"] Apr 16 22:09:12.883628 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:09:12.883595 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95b77ed3_de0b_4903_a989_e27d60d37342.slice/crio-93f309f146e9a26f5d321cf48e022ac7f54a8581220de8f8e3955d432701d52b WatchSource:0}: Error finding container 93f309f146e9a26f5d321cf48e022ac7f54a8581220de8f8e3955d432701d52b: Status 404 returned error can't find the container with id 93f309f146e9a26f5d321cf48e022ac7f54a8581220de8f8e3955d432701d52b Apr 16 22:09:13.102621 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:13.102589 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ff74786d-xxm2t" event={"ID":"95b77ed3-de0b-4903-a989-e27d60d37342","Type":"ContainerStarted","Data":"f278fdc42971f4bb16c214718adb3b79e178bc8cb2751f76db565a86d0b433a7"} Apr 16 22:09:13.102621 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:13.102624 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ff74786d-xxm2t" event={"ID":"95b77ed3-de0b-4903-a989-e27d60d37342","Type":"ContainerStarted","Data":"93f309f146e9a26f5d321cf48e022ac7f54a8581220de8f8e3955d432701d52b"} Apr 16 22:09:13.119577 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:13.119536 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6ff74786d-xxm2t" podStartSLOduration=1.119523708 podStartE2EDuration="1.119523708s" podCreationTimestamp="2026-04-16 22:09:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:09:13.117256136 +0000 UTC m=+257.377982238" 
watchObservedRunningTime="2026-04-16 22:09:13.119523708 +0000 UTC m=+257.380249813" Apr 16 22:09:22.757600 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:22.757562 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:22.757964 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:22.757612 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:22.762373 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:22.762352 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:23.134450 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:23.134420 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:09:23.174236 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:23.174206 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-699c78c78f-xz9gl"] Apr 16 22:09:48.195734 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.195693 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-699c78c78f-xz9gl" podUID="62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb" containerName="console" containerID="cri-o://a7154748887300aea9d5dfb0b61a16b962c9b166d4c92d3066ae53bdd76c9510" gracePeriod=15 Apr 16 22:09:48.437124 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.437103 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-699c78c78f-xz9gl_62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb/console/0.log" Apr 16 22:09:48.437223 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.437162 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:09:48.620664 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.620628 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-oauth-serving-cert\") pod \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " Apr 16 22:09:48.620859 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.620678 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2m5h\" (UniqueName: \"kubernetes.io/projected/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-kube-api-access-n2m5h\") pod \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " Apr 16 22:09:48.620859 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.620713 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-console-config\") pod \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " Apr 16 22:09:48.620859 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.620750 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-service-ca\") pod \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " Apr 16 22:09:48.620859 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.620786 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-console-oauth-config\") pod \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " Apr 16 22:09:48.620859 ip-10-0-130-26 
kubenswrapper[2576]: I0416 22:09:48.620817 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-trusted-ca-bundle\") pod \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " Apr 16 22:09:48.620859 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.620853 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-console-serving-cert\") pod \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\" (UID: \"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb\") " Apr 16 22:09:48.621175 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.621114 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb" (UID: "62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:09:48.621175 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.621136 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-console-config" (OuterVolumeSpecName: "console-config") pod "62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb" (UID: "62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:09:48.621254 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.621201 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-service-ca" (OuterVolumeSpecName: "service-ca") pod "62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb" (UID: "62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:09:48.621254 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.621209 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb" (UID: "62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:09:48.622810 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.622784 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-kube-api-access-n2m5h" (OuterVolumeSpecName: "kube-api-access-n2m5h") pod "62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb" (UID: "62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb"). InnerVolumeSpecName "kube-api-access-n2m5h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:09:48.622918 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.622903 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb" (UID: "62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:09:48.623024 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.623001 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb" (UID: "62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:09:48.721433 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.721404 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-service-ca\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:09:48.721433 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.721431 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-console-oauth-config\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:09:48.721601 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.721442 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-trusted-ca-bundle\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:09:48.721601 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.721453 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-console-serving-cert\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:09:48.721601 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.721462 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-oauth-serving-cert\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:09:48.721601 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.721470 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n2m5h\" (UniqueName: \"kubernetes.io/projected/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-kube-api-access-n2m5h\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:09:48.721601 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:48.721480 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb-console-config\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:09:49.201485 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:49.201458 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-699c78c78f-xz9gl_62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb/console/0.log" Apr 16 22:09:49.201883 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:49.201498 2576 generic.go:358] "Generic (PLEG): container finished" podID="62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb" containerID="a7154748887300aea9d5dfb0b61a16b962c9b166d4c92d3066ae53bdd76c9510" exitCode=2 Apr 16 22:09:49.201883 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:49.201530 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699c78c78f-xz9gl" event={"ID":"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb","Type":"ContainerDied","Data":"a7154748887300aea9d5dfb0b61a16b962c9b166d4c92d3066ae53bdd76c9510"} Apr 16 22:09:49.201883 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:49.201559 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-699c78c78f-xz9gl" Apr 16 22:09:49.201883 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:49.201565 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-699c78c78f-xz9gl" event={"ID":"62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb","Type":"ContainerDied","Data":"58d1a6adb99c6bc15e7cfbbbcff41650f52a98afe3dcc16a95c1df7dfb9b1df8"} Apr 16 22:09:49.201883 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:49.201585 2576 scope.go:117] "RemoveContainer" containerID="a7154748887300aea9d5dfb0b61a16b962c9b166d4c92d3066ae53bdd76c9510" Apr 16 22:09:49.210151 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:49.210137 2576 scope.go:117] "RemoveContainer" containerID="a7154748887300aea9d5dfb0b61a16b962c9b166d4c92d3066ae53bdd76c9510" Apr 16 22:09:49.210414 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:09:49.210395 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7154748887300aea9d5dfb0b61a16b962c9b166d4c92d3066ae53bdd76c9510\": container with ID starting with a7154748887300aea9d5dfb0b61a16b962c9b166d4c92d3066ae53bdd76c9510 not found: ID does not exist" containerID="a7154748887300aea9d5dfb0b61a16b962c9b166d4c92d3066ae53bdd76c9510" Apr 16 22:09:49.210470 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:49.210420 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7154748887300aea9d5dfb0b61a16b962c9b166d4c92d3066ae53bdd76c9510"} err="failed to get container status \"a7154748887300aea9d5dfb0b61a16b962c9b166d4c92d3066ae53bdd76c9510\": rpc error: code = NotFound desc = could not find container \"a7154748887300aea9d5dfb0b61a16b962c9b166d4c92d3066ae53bdd76c9510\": container with ID starting with a7154748887300aea9d5dfb0b61a16b962c9b166d4c92d3066ae53bdd76c9510 not found: ID does not exist" Apr 16 22:09:49.220344 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:49.220323 2576 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-699c78c78f-xz9gl"] Apr 16 22:09:49.223080 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:49.223059 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-699c78c78f-xz9gl"] Apr 16 22:09:50.285297 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:50.285263 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb" path="/var/lib/kubelet/pods/62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb/volumes" Apr 16 22:09:56.160766 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:56.160732 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/2.log" Apr 16 22:09:56.161269 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:56.160853 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/2.log" Apr 16 22:09:56.170657 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:09:56.170641 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 22:10:21.978575 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:21.978536 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c94b67fdf-tjtlv"] Apr 16 22:10:21.981150 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:21.978840 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb" containerName="console" Apr 16 22:10:21.981150 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:21.978854 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb" containerName="console" Apr 16 22:10:21.981150 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:21.978903 2576 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="62fc57e1-f7ef-49a3-a51e-4f1ad1e624fb" containerName="console" Apr 16 22:10:21.982073 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:21.982050 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:21.997761 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:21.997737 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c94b67fdf-tjtlv"] Apr 16 22:10:22.057920 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.057885 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f265668-3d67-4794-ace1-23da716b45fa-console-serving-cert\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.058048 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.057945 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-service-ca\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.058048 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.057979 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f265668-3d67-4794-ace1-23da716b45fa-console-oauth-config\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.058048 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.058001 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-trusted-ca-bundle\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.058164 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.058046 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-console-config\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.058164 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.058105 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-oauth-serving-cert\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.058164 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.058138 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gs68\" (UniqueName: \"kubernetes.io/projected/1f265668-3d67-4794-ace1-23da716b45fa-kube-api-access-9gs68\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.159400 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.159355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-service-ca\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.159400 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.159406 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f265668-3d67-4794-ace1-23da716b45fa-console-oauth-config\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.159632 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.159489 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-trusted-ca-bundle\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.159632 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.159508 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-console-config\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.159632 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.159548 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-oauth-serving-cert\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.159632 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.159578 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9gs68\" (UniqueName: \"kubernetes.io/projected/1f265668-3d67-4794-ace1-23da716b45fa-kube-api-access-9gs68\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 
22:10:22.159843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.159686 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f265668-3d67-4794-ace1-23da716b45fa-console-serving-cert\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.160148 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.160114 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-service-ca\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.160270 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.160208 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-console-config\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.160355 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.160301 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-trusted-ca-bundle\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.160396 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.160378 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-oauth-serving-cert\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " 
pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.161919 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.161901 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f265668-3d67-4794-ace1-23da716b45fa-console-oauth-config\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.162200 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.162179 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f265668-3d67-4794-ace1-23da716b45fa-console-serving-cert\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.167339 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.167319 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gs68\" (UniqueName: \"kubernetes.io/projected/1f265668-3d67-4794-ace1-23da716b45fa-kube-api-access-9gs68\") pod \"console-5c94b67fdf-tjtlv\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.293279 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.293204 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:22.409975 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.409954 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c94b67fdf-tjtlv"] Apr 16 22:10:22.412101 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:10:22.412074 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f265668_3d67_4794_ace1_23da716b45fa.slice/crio-9609529f80ef37b2d7afa06c503ab7cbae78eb6bae41cee9924f42cf8caa4f83 WatchSource:0}: Error finding container 9609529f80ef37b2d7afa06c503ab7cbae78eb6bae41cee9924f42cf8caa4f83: Status 404 returned error can't find the container with id 9609529f80ef37b2d7afa06c503ab7cbae78eb6bae41cee9924f42cf8caa4f83 Apr 16 22:10:22.413873 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:22.413859 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:10:23.303512 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:23.303471 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c94b67fdf-tjtlv" event={"ID":"1f265668-3d67-4794-ace1-23da716b45fa","Type":"ContainerStarted","Data":"a74661f95f56d20067505cd17e094fcb71f6337ac0b52313600a782958e336ad"} Apr 16 22:10:23.303512 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:23.303513 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c94b67fdf-tjtlv" event={"ID":"1f265668-3d67-4794-ace1-23da716b45fa","Type":"ContainerStarted","Data":"9609529f80ef37b2d7afa06c503ab7cbae78eb6bae41cee9924f42cf8caa4f83"} Apr 16 22:10:23.318620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:23.318565 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c94b67fdf-tjtlv" podStartSLOduration=2.318550185 podStartE2EDuration="2.318550185s" podCreationTimestamp="2026-04-16 22:10:21 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:10:23.317197761 +0000 UTC m=+327.577923866" watchObservedRunningTime="2026-04-16 22:10:23.318550185 +0000 UTC m=+327.579276290" Apr 16 22:10:32.294045 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:32.294010 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:32.294431 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:32.294056 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:32.298806 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:32.298780 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:32.329329 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:32.329288 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:10:32.363194 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:32.363163 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6ff74786d-xxm2t"] Apr 16 22:10:44.745558 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:44.745480 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gjxcr"] Apr 16 22:10:44.748743 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:44.748720 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gjxcr" Apr 16 22:10:44.750608 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:44.750588 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 22:10:44.756472 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:44.755285 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gjxcr"] Apr 16 22:10:44.827913 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:44.827883 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9d43755a-3076-45ec-8abc-1fa99470f09b-original-pull-secret\") pod \"global-pull-secret-syncer-gjxcr\" (UID: \"9d43755a-3076-45ec-8abc-1fa99470f09b\") " pod="kube-system/global-pull-secret-syncer-gjxcr" Apr 16 22:10:44.828026 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:44.827920 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9d43755a-3076-45ec-8abc-1fa99470f09b-dbus\") pod \"global-pull-secret-syncer-gjxcr\" (UID: \"9d43755a-3076-45ec-8abc-1fa99470f09b\") " pod="kube-system/global-pull-secret-syncer-gjxcr" Apr 16 22:10:44.828026 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:44.827991 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9d43755a-3076-45ec-8abc-1fa99470f09b-kubelet-config\") pod \"global-pull-secret-syncer-gjxcr\" (UID: \"9d43755a-3076-45ec-8abc-1fa99470f09b\") " pod="kube-system/global-pull-secret-syncer-gjxcr" Apr 16 22:10:44.928525 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:44.928498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/9d43755a-3076-45ec-8abc-1fa99470f09b-original-pull-secret\") pod \"global-pull-secret-syncer-gjxcr\" (UID: \"9d43755a-3076-45ec-8abc-1fa99470f09b\") " pod="kube-system/global-pull-secret-syncer-gjxcr" Apr 16 22:10:44.928638 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:44.928530 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9d43755a-3076-45ec-8abc-1fa99470f09b-dbus\") pod \"global-pull-secret-syncer-gjxcr\" (UID: \"9d43755a-3076-45ec-8abc-1fa99470f09b\") " pod="kube-system/global-pull-secret-syncer-gjxcr" Apr 16 22:10:44.928638 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:44.928567 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9d43755a-3076-45ec-8abc-1fa99470f09b-kubelet-config\") pod \"global-pull-secret-syncer-gjxcr\" (UID: \"9d43755a-3076-45ec-8abc-1fa99470f09b\") " pod="kube-system/global-pull-secret-syncer-gjxcr" Apr 16 22:10:44.928726 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:44.928695 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9d43755a-3076-45ec-8abc-1fa99470f09b-dbus\") pod \"global-pull-secret-syncer-gjxcr\" (UID: \"9d43755a-3076-45ec-8abc-1fa99470f09b\") " pod="kube-system/global-pull-secret-syncer-gjxcr" Apr 16 22:10:44.928726 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:44.928700 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9d43755a-3076-45ec-8abc-1fa99470f09b-kubelet-config\") pod \"global-pull-secret-syncer-gjxcr\" (UID: \"9d43755a-3076-45ec-8abc-1fa99470f09b\") " pod="kube-system/global-pull-secret-syncer-gjxcr" Apr 16 22:10:44.930731 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:44.930714 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9d43755a-3076-45ec-8abc-1fa99470f09b-original-pull-secret\") pod \"global-pull-secret-syncer-gjxcr\" (UID: \"9d43755a-3076-45ec-8abc-1fa99470f09b\") " pod="kube-system/global-pull-secret-syncer-gjxcr" Apr 16 22:10:45.063719 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:45.063656 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gjxcr" Apr 16 22:10:45.175962 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:45.175938 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gjxcr"] Apr 16 22:10:45.178545 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:10:45.178518 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d43755a_3076_45ec_8abc_1fa99470f09b.slice/crio-eb258a3383a1125787874d4e168e33777fbe272bd0b20799e75452945e8315c7 WatchSource:0}: Error finding container eb258a3383a1125787874d4e168e33777fbe272bd0b20799e75452945e8315c7: Status 404 returned error can't find the container with id eb258a3383a1125787874d4e168e33777fbe272bd0b20799e75452945e8315c7 Apr 16 22:10:45.365897 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:45.365867 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gjxcr" event={"ID":"9d43755a-3076-45ec-8abc-1fa99470f09b","Type":"ContainerStarted","Data":"eb258a3383a1125787874d4e168e33777fbe272bd0b20799e75452945e8315c7"} Apr 16 22:10:49.379579 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:49.379492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gjxcr" event={"ID":"9d43755a-3076-45ec-8abc-1fa99470f09b","Type":"ContainerStarted","Data":"83c955959edcb442ae515dc610fe9a217db6a120419e046a16caa03e2f97a2f9"} Apr 16 22:10:49.394169 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:49.394119 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gjxcr" podStartSLOduration=1.444487742 podStartE2EDuration="5.394104968s" podCreationTimestamp="2026-04-16 22:10:44 +0000 UTC" firstStartedPulling="2026-04-16 22:10:45.180552773 +0000 UTC m=+349.441278871" lastFinishedPulling="2026-04-16 22:10:49.130169999 +0000 UTC m=+353.390896097" observedRunningTime="2026-04-16 22:10:49.392397625 +0000 UTC m=+353.653123740" watchObservedRunningTime="2026-04-16 22:10:49.394104968 +0000 UTC m=+353.654831072" Apr 16 22:10:57.170980 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.170942 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g"] Apr 16 22:10:57.177409 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.177384 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g" Apr 16 22:10:57.179611 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.179590 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 22:10:57.179733 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.179611 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 22:10:57.180163 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.180149 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-v84j8\"" Apr 16 22:10:57.183160 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.183140 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g"] Apr 16 22:10:57.324236 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.324204 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55b2ee4a-927c-4369-b82d-ae4b586ba69b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g\" (UID: \"55b2ee4a-927c-4369-b82d-ae4b586ba69b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g" Apr 16 22:10:57.324427 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.324259 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55b2ee4a-927c-4369-b82d-ae4b586ba69b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g\" (UID: \"55b2ee4a-927c-4369-b82d-ae4b586ba69b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g" Apr 16 22:10:57.324427 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.324289 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ftvs\" (UniqueName: \"kubernetes.io/projected/55b2ee4a-927c-4369-b82d-ae4b586ba69b-kube-api-access-5ftvs\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g\" (UID: \"55b2ee4a-927c-4369-b82d-ae4b586ba69b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g" Apr 16 22:10:57.386299 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.386240 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6ff74786d-xxm2t" podUID="95b77ed3-de0b-4903-a989-e27d60d37342" containerName="console" containerID="cri-o://f278fdc42971f4bb16c214718adb3b79e178bc8cb2751f76db565a86d0b433a7" gracePeriod=15 Apr 16 22:10:57.424993 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.424939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/55b2ee4a-927c-4369-b82d-ae4b586ba69b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g\" (UID: \"55b2ee4a-927c-4369-b82d-ae4b586ba69b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g" Apr 16 22:10:57.425080 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.424989 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55b2ee4a-927c-4369-b82d-ae4b586ba69b-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g\" (UID: \"55b2ee4a-927c-4369-b82d-ae4b586ba69b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g" Apr 16 22:10:57.425080 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.425014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ftvs\" (UniqueName: \"kubernetes.io/projected/55b2ee4a-927c-4369-b82d-ae4b586ba69b-kube-api-access-5ftvs\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g\" (UID: \"55b2ee4a-927c-4369-b82d-ae4b586ba69b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g" Apr 16 22:10:57.425292 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.425272 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55b2ee4a-927c-4369-b82d-ae4b586ba69b-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g\" (UID: \"55b2ee4a-927c-4369-b82d-ae4b586ba69b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g" Apr 16 22:10:57.425359 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.425332 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55b2ee4a-927c-4369-b82d-ae4b586ba69b-bundle\") pod 
\"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g\" (UID: \"55b2ee4a-927c-4369-b82d-ae4b586ba69b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g" Apr 16 22:10:57.431922 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.431896 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ftvs\" (UniqueName: \"kubernetes.io/projected/55b2ee4a-927c-4369-b82d-ae4b586ba69b-kube-api-access-5ftvs\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g\" (UID: \"55b2ee4a-927c-4369-b82d-ae4b586ba69b\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g" Apr 16 22:10:57.486915 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.486888 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g" Apr 16 22:10:57.626253 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.626160 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g"] Apr 16 22:10:57.627888 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.627866 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6ff74786d-xxm2t_95b77ed3-de0b-4903-a989-e27d60d37342/console/0.log" Apr 16 22:10:57.627991 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.627939 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:10:57.628806 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:10:57.628780 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55b2ee4a_927c_4369_b82d_ae4b586ba69b.slice/crio-a9a73b36ed5088efb09ff3eec360f27e1d1141244f9cb0cff4b3e0d756ec9e61 WatchSource:0}: Error finding container a9a73b36ed5088efb09ff3eec360f27e1d1141244f9cb0cff4b3e0d756ec9e61: Status 404 returned error can't find the container with id a9a73b36ed5088efb09ff3eec360f27e1d1141244f9cb0cff4b3e0d756ec9e61 Apr 16 22:10:57.727751 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.727672 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95b77ed3-de0b-4903-a989-e27d60d37342-console-serving-cert\") pod \"95b77ed3-de0b-4903-a989-e27d60d37342\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " Apr 16 22:10:57.727751 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.727734 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xkd6\" (UniqueName: \"kubernetes.io/projected/95b77ed3-de0b-4903-a989-e27d60d37342-kube-api-access-5xkd6\") pod \"95b77ed3-de0b-4903-a989-e27d60d37342\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " Apr 16 22:10:57.727751 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.727753 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-trusted-ca-bundle\") pod \"95b77ed3-de0b-4903-a989-e27d60d37342\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " Apr 16 22:10:57.728015 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.727800 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-console-config\") pod \"95b77ed3-de0b-4903-a989-e27d60d37342\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " Apr 16 22:10:57.728015 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.727825 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-service-ca\") pod \"95b77ed3-de0b-4903-a989-e27d60d37342\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " Apr 16 22:10:57.728015 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.727855 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-oauth-serving-cert\") pod \"95b77ed3-de0b-4903-a989-e27d60d37342\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " Apr 16 22:10:57.728015 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.727878 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95b77ed3-de0b-4903-a989-e27d60d37342-console-oauth-config\") pod \"95b77ed3-de0b-4903-a989-e27d60d37342\" (UID: \"95b77ed3-de0b-4903-a989-e27d60d37342\") " Apr 16 22:10:57.728272 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.728245 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-service-ca" (OuterVolumeSpecName: "service-ca") pod "95b77ed3-de0b-4903-a989-e27d60d37342" (UID: "95b77ed3-de0b-4903-a989-e27d60d37342"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:10:57.728272 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.728261 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "95b77ed3-de0b-4903-a989-e27d60d37342" (UID: "95b77ed3-de0b-4903-a989-e27d60d37342"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:10:57.728399 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.728267 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "95b77ed3-de0b-4903-a989-e27d60d37342" (UID: "95b77ed3-de0b-4903-a989-e27d60d37342"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:10:57.728399 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.728255 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-console-config" (OuterVolumeSpecName: "console-config") pod "95b77ed3-de0b-4903-a989-e27d60d37342" (UID: "95b77ed3-de0b-4903-a989-e27d60d37342"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:10:57.729971 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.729946 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95b77ed3-de0b-4903-a989-e27d60d37342-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "95b77ed3-de0b-4903-a989-e27d60d37342" (UID: "95b77ed3-de0b-4903-a989-e27d60d37342"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:10:57.730050 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.729982 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95b77ed3-de0b-4903-a989-e27d60d37342-kube-api-access-5xkd6" (OuterVolumeSpecName: "kube-api-access-5xkd6") pod "95b77ed3-de0b-4903-a989-e27d60d37342" (UID: "95b77ed3-de0b-4903-a989-e27d60d37342"). InnerVolumeSpecName "kube-api-access-5xkd6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:10:57.730050 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.729995 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95b77ed3-de0b-4903-a989-e27d60d37342-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "95b77ed3-de0b-4903-a989-e27d60d37342" (UID: "95b77ed3-de0b-4903-a989-e27d60d37342"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:10:57.829039 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.829008 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-trusted-ca-bundle\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:10:57.829039 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.829035 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-console-config\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:10:57.829178 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.829045 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-service-ca\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:10:57.829178 ip-10-0-130-26 kubenswrapper[2576]: I0416 
22:10:57.829054 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/95b77ed3-de0b-4903-a989-e27d60d37342-oauth-serving-cert\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:10:57.829178 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.829063 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/95b77ed3-de0b-4903-a989-e27d60d37342-console-oauth-config\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:10:57.829178 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.829072 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/95b77ed3-de0b-4903-a989-e27d60d37342-console-serving-cert\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:10:57.829178 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:57.829081 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5xkd6\" (UniqueName: \"kubernetes.io/projected/95b77ed3-de0b-4903-a989-e27d60d37342-kube-api-access-5xkd6\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:10:58.407255 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:58.407216 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g" event={"ID":"55b2ee4a-927c-4369-b82d-ae4b586ba69b","Type":"ContainerStarted","Data":"a9a73b36ed5088efb09ff3eec360f27e1d1141244f9cb0cff4b3e0d756ec9e61"} Apr 16 22:10:58.408696 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:58.408678 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6ff74786d-xxm2t_95b77ed3-de0b-4903-a989-e27d60d37342/console/0.log" Apr 16 22:10:58.408813 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:58.408712 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="95b77ed3-de0b-4903-a989-e27d60d37342" containerID="f278fdc42971f4bb16c214718adb3b79e178bc8cb2751f76db565a86d0b433a7" exitCode=2 Apr 16 22:10:58.408813 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:58.408747 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ff74786d-xxm2t" event={"ID":"95b77ed3-de0b-4903-a989-e27d60d37342","Type":"ContainerDied","Data":"f278fdc42971f4bb16c214718adb3b79e178bc8cb2751f76db565a86d0b433a7"} Apr 16 22:10:58.408813 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:58.408793 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6ff74786d-xxm2t" event={"ID":"95b77ed3-de0b-4903-a989-e27d60d37342","Type":"ContainerDied","Data":"93f309f146e9a26f5d321cf48e022ac7f54a8581220de8f8e3955d432701d52b"} Apr 16 22:10:58.408813 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:58.408793 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6ff74786d-xxm2t" Apr 16 22:10:58.409003 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:58.408862 2576 scope.go:117] "RemoveContainer" containerID="f278fdc42971f4bb16c214718adb3b79e178bc8cb2751f76db565a86d0b433a7" Apr 16 22:10:58.417920 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:58.417897 2576 scope.go:117] "RemoveContainer" containerID="f278fdc42971f4bb16c214718adb3b79e178bc8cb2751f76db565a86d0b433a7" Apr 16 22:10:58.418254 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:10:58.418226 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f278fdc42971f4bb16c214718adb3b79e178bc8cb2751f76db565a86d0b433a7\": container with ID starting with f278fdc42971f4bb16c214718adb3b79e178bc8cb2751f76db565a86d0b433a7 not found: ID does not exist" containerID="f278fdc42971f4bb16c214718adb3b79e178bc8cb2751f76db565a86d0b433a7" Apr 16 22:10:58.418348 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:58.418257 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f278fdc42971f4bb16c214718adb3b79e178bc8cb2751f76db565a86d0b433a7"} err="failed to get container status \"f278fdc42971f4bb16c214718adb3b79e178bc8cb2751f76db565a86d0b433a7\": rpc error: code = NotFound desc = could not find container \"f278fdc42971f4bb16c214718adb3b79e178bc8cb2751f76db565a86d0b433a7\": container with ID starting with f278fdc42971f4bb16c214718adb3b79e178bc8cb2751f76db565a86d0b433a7 not found: ID does not exist" Apr 16 22:10:58.424868 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:58.424845 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6ff74786d-xxm2t"] Apr 16 22:10:58.428565 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:10:58.428547 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6ff74786d-xxm2t"] Apr 16 22:11:00.282592 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:00.282559 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95b77ed3-de0b-4903-a989-e27d60d37342" path="/var/lib/kubelet/pods/95b77ed3-de0b-4903-a989-e27d60d37342/volumes" Apr 16 22:11:03.425679 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:03.425642 2576 generic.go:358] "Generic (PLEG): container finished" podID="55b2ee4a-927c-4369-b82d-ae4b586ba69b" containerID="71de5806bced4d364a63583c19dab9877b6a3cea52ae6e06b822a4c0457e9bf1" exitCode=0 Apr 16 22:11:03.426057 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:03.425726 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g" event={"ID":"55b2ee4a-927c-4369-b82d-ae4b586ba69b","Type":"ContainerDied","Data":"71de5806bced4d364a63583c19dab9877b6a3cea52ae6e06b822a4c0457e9bf1"} Apr 16 22:11:06.436041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:06.436006 2576 generic.go:358] "Generic (PLEG): container finished" podID="55b2ee4a-927c-4369-b82d-ae4b586ba69b" 
containerID="c48ffdf73466e86aea43980658ac9e7a8210bbf177b6fc130700bcead66df22c" exitCode=0 Apr 16 22:11:06.436431 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:06.436091 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g" event={"ID":"55b2ee4a-927c-4369-b82d-ae4b586ba69b","Type":"ContainerDied","Data":"c48ffdf73466e86aea43980658ac9e7a8210bbf177b6fc130700bcead66df22c"} Apr 16 22:11:13.457807 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:13.457775 2576 generic.go:358] "Generic (PLEG): container finished" podID="55b2ee4a-927c-4369-b82d-ae4b586ba69b" containerID="9002dd4cd1c7af2821a703a9b5c49c6991f9ef1fcef84306455679df22d521c3" exitCode=0 Apr 16 22:11:13.458161 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:13.457857 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g" event={"ID":"55b2ee4a-927c-4369-b82d-ae4b586ba69b","Type":"ContainerDied","Data":"9002dd4cd1c7af2821a703a9b5c49c6991f9ef1fcef84306455679df22d521c3"} Apr 16 22:11:14.584017 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:14.583998 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g" Apr 16 22:11:14.680704 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:14.680676 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55b2ee4a-927c-4369-b82d-ae4b586ba69b-bundle\") pod \"55b2ee4a-927c-4369-b82d-ae4b586ba69b\" (UID: \"55b2ee4a-927c-4369-b82d-ae4b586ba69b\") " Apr 16 22:11:14.680833 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:14.680742 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55b2ee4a-927c-4369-b82d-ae4b586ba69b-util\") pod \"55b2ee4a-927c-4369-b82d-ae4b586ba69b\" (UID: \"55b2ee4a-927c-4369-b82d-ae4b586ba69b\") " Apr 16 22:11:14.680833 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:14.680769 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ftvs\" (UniqueName: \"kubernetes.io/projected/55b2ee4a-927c-4369-b82d-ae4b586ba69b-kube-api-access-5ftvs\") pod \"55b2ee4a-927c-4369-b82d-ae4b586ba69b\" (UID: \"55b2ee4a-927c-4369-b82d-ae4b586ba69b\") " Apr 16 22:11:14.681231 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:14.681200 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55b2ee4a-927c-4369-b82d-ae4b586ba69b-bundle" (OuterVolumeSpecName: "bundle") pod "55b2ee4a-927c-4369-b82d-ae4b586ba69b" (UID: "55b2ee4a-927c-4369-b82d-ae4b586ba69b"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:11:14.682930 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:14.682906 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55b2ee4a-927c-4369-b82d-ae4b586ba69b-kube-api-access-5ftvs" (OuterVolumeSpecName: "kube-api-access-5ftvs") pod "55b2ee4a-927c-4369-b82d-ae4b586ba69b" (UID: "55b2ee4a-927c-4369-b82d-ae4b586ba69b"). InnerVolumeSpecName "kube-api-access-5ftvs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:11:14.684723 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:14.684702 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55b2ee4a-927c-4369-b82d-ae4b586ba69b-util" (OuterVolumeSpecName: "util") pod "55b2ee4a-927c-4369-b82d-ae4b586ba69b" (UID: "55b2ee4a-927c-4369-b82d-ae4b586ba69b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:11:14.781624 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:14.781568 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5ftvs\" (UniqueName: \"kubernetes.io/projected/55b2ee4a-927c-4369-b82d-ae4b586ba69b-kube-api-access-5ftvs\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:11:14.781624 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:14.781590 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55b2ee4a-927c-4369-b82d-ae4b586ba69b-bundle\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:11:14.781624 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:14.781603 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55b2ee4a-927c-4369-b82d-ae4b586ba69b-util\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:11:15.465628 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:15.465594 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g" event={"ID":"55b2ee4a-927c-4369-b82d-ae4b586ba69b","Type":"ContainerDied","Data":"a9a73b36ed5088efb09ff3eec360f27e1d1141244f9cb0cff4b3e0d756ec9e61"} Apr 16 22:11:15.465628 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:15.465618 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e554w5g" Apr 16 22:11:15.465628 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:15.465628 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9a73b36ed5088efb09ff3eec360f27e1d1141244f9cb0cff4b3e0d756ec9e61" Apr 16 22:11:19.459040 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.459005 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-gzzf7"] Apr 16 22:11:19.459440 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.459273 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55b2ee4a-927c-4369-b82d-ae4b586ba69b" containerName="extract" Apr 16 22:11:19.459440 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.459282 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="55b2ee4a-927c-4369-b82d-ae4b586ba69b" containerName="extract" Apr 16 22:11:19.459440 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.459293 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55b2ee4a-927c-4369-b82d-ae4b586ba69b" containerName="util" Apr 16 22:11:19.459440 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.459299 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="55b2ee4a-927c-4369-b82d-ae4b586ba69b" containerName="util" Apr 16 22:11:19.459440 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.459334 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="55b2ee4a-927c-4369-b82d-ae4b586ba69b" containerName="pull" Apr 16 22:11:19.459440 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.459341 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="55b2ee4a-927c-4369-b82d-ae4b586ba69b" containerName="pull" Apr 16 22:11:19.459440 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.459350 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95b77ed3-de0b-4903-a989-e27d60d37342" containerName="console" Apr 16 22:11:19.459440 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.459355 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="95b77ed3-de0b-4903-a989-e27d60d37342" containerName="console" Apr 16 22:11:19.459440 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.459403 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="55b2ee4a-927c-4369-b82d-ae4b586ba69b" containerName="extract" Apr 16 22:11:19.459440 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.459412 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="95b77ed3-de0b-4903-a989-e27d60d37342" containerName="console" Apr 16 22:11:19.462778 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.462756 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-gzzf7" Apr 16 22:11:19.465004 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.464982 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-8p779\"" Apr 16 22:11:19.465104 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.465090 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 16 22:11:19.465638 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.465615 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 16 22:11:19.477402 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.477379 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-gzzf7"] Apr 16 22:11:19.517700 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.517672 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5xsf\" (UniqueName: \"kubernetes.io/projected/b75b4adb-be31-46a7-b6e1-d2969e51767f-kube-api-access-z5xsf\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-gzzf7\" (UID: \"b75b4adb-be31-46a7-b6e1-d2969e51767f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-gzzf7" Apr 16 22:11:19.517805 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.517717 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b75b4adb-be31-46a7-b6e1-d2969e51767f-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-gzzf7\" (UID: \"b75b4adb-be31-46a7-b6e1-d2969e51767f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-gzzf7" Apr 16 
22:11:19.618887 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.618861 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b75b4adb-be31-46a7-b6e1-d2969e51767f-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-gzzf7\" (UID: \"b75b4adb-be31-46a7-b6e1-d2969e51767f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-gzzf7" Apr 16 22:11:19.619015 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.618927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z5xsf\" (UniqueName: \"kubernetes.io/projected/b75b4adb-be31-46a7-b6e1-d2969e51767f-kube-api-access-z5xsf\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-gzzf7\" (UID: \"b75b4adb-be31-46a7-b6e1-d2969e51767f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-gzzf7" Apr 16 22:11:19.619181 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.619163 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b75b4adb-be31-46a7-b6e1-d2969e51767f-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-gzzf7\" (UID: \"b75b4adb-be31-46a7-b6e1-d2969e51767f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-gzzf7" Apr 16 22:11:19.627382 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.627353 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5xsf\" (UniqueName: \"kubernetes.io/projected/b75b4adb-be31-46a7-b6e1-d2969e51767f-kube-api-access-z5xsf\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-gzzf7\" (UID: \"b75b4adb-be31-46a7-b6e1-d2969e51767f\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-gzzf7" Apr 16 22:11:19.772342 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.772232 2576 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-gzzf7" Apr 16 22:11:19.897772 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:19.897748 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-gzzf7"] Apr 16 22:11:19.900382 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:11:19.900355 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb75b4adb_be31_46a7_b6e1_d2969e51767f.slice/crio-9037c9613ccfd6c863f74b54786ca245a9bec65ca2d9ff3370232a0603e67c2e WatchSource:0}: Error finding container 9037c9613ccfd6c863f74b54786ca245a9bec65ca2d9ff3370232a0603e67c2e: Status 404 returned error can't find the container with id 9037c9613ccfd6c863f74b54786ca245a9bec65ca2d9ff3370232a0603e67c2e Apr 16 22:11:20.482006 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:20.481973 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-gzzf7" event={"ID":"b75b4adb-be31-46a7-b6e1-d2969e51767f","Type":"ContainerStarted","Data":"9037c9613ccfd6c863f74b54786ca245a9bec65ca2d9ff3370232a0603e67c2e"} Apr 16 22:11:23.492831 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:23.492754 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-gzzf7" event={"ID":"b75b4adb-be31-46a7-b6e1-d2969e51767f","Type":"ContainerStarted","Data":"4179eb368a5c2ef6f6b7bbf50e02cbf5191ce68f64348582069d9416c601dc44"} Apr 16 22:11:23.514823 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:23.514767 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-gzzf7" podStartSLOduration=1.224581065 podStartE2EDuration="4.51475123s" podCreationTimestamp="2026-04-16 22:11:19 +0000 
UTC" firstStartedPulling="2026-04-16 22:11:19.902797975 +0000 UTC m=+384.163524072" lastFinishedPulling="2026-04-16 22:11:23.19296815 +0000 UTC m=+387.453694237" observedRunningTime="2026-04-16 22:11:23.511891635 +0000 UTC m=+387.772617740" watchObservedRunningTime="2026-04-16 22:11:23.51475123 +0000 UTC m=+387.775477336" Apr 16 22:11:28.483537 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:28.483501 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-t5cxp"] Apr 16 22:11:28.489116 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:28.489097 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-t5cxp" Apr 16 22:11:28.491257 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:28.491230 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 22:11:28.491992 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:28.491978 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 22:11:28.492044 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:28.492018 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-84wxd\"" Apr 16 22:11:28.496282 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:28.496259 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-t5cxp"] Apr 16 22:11:28.592323 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:28.592271 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/960923ca-a4fb-4c38-9817-34d67a701010-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-t5cxp\" (UID: \"960923ca-a4fb-4c38-9817-34d67a701010\") " 
pod="cert-manager/cert-manager-cainjector-8966b78d4-t5cxp"
Apr 16 22:11:28.592497 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:28.592382 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76b2p\" (UniqueName: \"kubernetes.io/projected/960923ca-a4fb-4c38-9817-34d67a701010-kube-api-access-76b2p\") pod \"cert-manager-cainjector-8966b78d4-t5cxp\" (UID: \"960923ca-a4fb-4c38-9817-34d67a701010\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-t5cxp"
Apr 16 22:11:28.693215 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:28.693173 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76b2p\" (UniqueName: \"kubernetes.io/projected/960923ca-a4fb-4c38-9817-34d67a701010-kube-api-access-76b2p\") pod \"cert-manager-cainjector-8966b78d4-t5cxp\" (UID: \"960923ca-a4fb-4c38-9817-34d67a701010\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-t5cxp"
Apr 16 22:11:28.693387 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:28.693229 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/960923ca-a4fb-4c38-9817-34d67a701010-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-t5cxp\" (UID: \"960923ca-a4fb-4c38-9817-34d67a701010\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-t5cxp"
Apr 16 22:11:28.700661 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:28.700633 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/960923ca-a4fb-4c38-9817-34d67a701010-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-t5cxp\" (UID: \"960923ca-a4fb-4c38-9817-34d67a701010\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-t5cxp"
Apr 16 22:11:28.700801 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:28.700742 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76b2p\" (UniqueName: \"kubernetes.io/projected/960923ca-a4fb-4c38-9817-34d67a701010-kube-api-access-76b2p\") pod \"cert-manager-cainjector-8966b78d4-t5cxp\" (UID: \"960923ca-a4fb-4c38-9817-34d67a701010\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-t5cxp"
Apr 16 22:11:28.812418 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:28.812335 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-t5cxp"
Apr 16 22:11:28.930605 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:28.930577 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-t5cxp"]
Apr 16 22:11:28.933035 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:11:28.933009 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod960923ca_a4fb_4c38_9817_34d67a701010.slice/crio-c0654ea2e7a7244888d65e78d9ca110624c29ad00436f22af048805c61dd729a WatchSource:0}: Error finding container c0654ea2e7a7244888d65e78d9ca110624c29ad00436f22af048805c61dd729a: Status 404 returned error can't find the container with id c0654ea2e7a7244888d65e78d9ca110624c29ad00436f22af048805c61dd729a
Apr 16 22:11:29.511406 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:29.511374 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-t5cxp" event={"ID":"960923ca-a4fb-4c38-9817-34d67a701010","Type":"ContainerStarted","Data":"c0654ea2e7a7244888d65e78d9ca110624c29ad00436f22af048805c61dd729a"}
Apr 16 22:11:32.521466 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:32.521432 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-t5cxp" event={"ID":"960923ca-a4fb-4c38-9817-34d67a701010","Type":"ContainerStarted","Data":"0b776507bab017c3d3ca9179117ddae249ab05752e5f25a6b4141710d485e777"}
Apr 16 22:11:32.535685 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:32.535639 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-t5cxp" podStartSLOduration=1.864829453 podStartE2EDuration="4.535626095s" podCreationTimestamp="2026-04-16 22:11:28 +0000 UTC" firstStartedPulling="2026-04-16 22:11:28.934826689 +0000 UTC m=+393.195552772" lastFinishedPulling="2026-04-16 22:11:31.605623328 +0000 UTC m=+395.866349414" observedRunningTime="2026-04-16 22:11:32.533759232 +0000 UTC m=+396.794485363" watchObservedRunningTime="2026-04-16 22:11:32.535626095 +0000 UTC m=+396.796352199"
Apr 16 22:11:55.192165 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:55.192130 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj"]
Apr 16 22:11:55.195966 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:55.195943 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj"
Apr 16 22:11:55.197856 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:55.197836 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 22:11:55.198654 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:55.198634 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-v84j8\""
Apr 16 22:11:55.198745 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:55.198636 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 22:11:55.201741 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:55.201709 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj"]
Apr 16 22:11:55.297005 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:55.296979 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0302deea-bb98-4bfa-bdff-7aef61d7431f-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj\" (UID: \"0302deea-bb98-4bfa-bdff-7aef61d7431f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj"
Apr 16 22:11:55.297118 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:55.297011 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0302deea-bb98-4bfa-bdff-7aef61d7431f-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj\" (UID: \"0302deea-bb98-4bfa-bdff-7aef61d7431f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj"
Apr 16 22:11:55.297118 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:55.297081 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlvgk\" (UniqueName: \"kubernetes.io/projected/0302deea-bb98-4bfa-bdff-7aef61d7431f-kube-api-access-mlvgk\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj\" (UID: \"0302deea-bb98-4bfa-bdff-7aef61d7431f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj"
Apr 16 22:11:55.397922 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:55.397895 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mlvgk\" (UniqueName: \"kubernetes.io/projected/0302deea-bb98-4bfa-bdff-7aef61d7431f-kube-api-access-mlvgk\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj\" (UID: \"0302deea-bb98-4bfa-bdff-7aef61d7431f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj"
Apr 16 22:11:55.398018 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:55.397958 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0302deea-bb98-4bfa-bdff-7aef61d7431f-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj\" (UID: \"0302deea-bb98-4bfa-bdff-7aef61d7431f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj"
Apr 16 22:11:55.398018 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:55.398002 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0302deea-bb98-4bfa-bdff-7aef61d7431f-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj\" (UID: \"0302deea-bb98-4bfa-bdff-7aef61d7431f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj"
Apr 16 22:11:55.398344 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:55.398328 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0302deea-bb98-4bfa-bdff-7aef61d7431f-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj\" (UID: \"0302deea-bb98-4bfa-bdff-7aef61d7431f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj"
Apr 16 22:11:55.398405 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:55.398348 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0302deea-bb98-4bfa-bdff-7aef61d7431f-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj\" (UID: \"0302deea-bb98-4bfa-bdff-7aef61d7431f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj"
Apr 16 22:11:55.405030 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:55.405008 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlvgk\" (UniqueName: \"kubernetes.io/projected/0302deea-bb98-4bfa-bdff-7aef61d7431f-kube-api-access-mlvgk\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj\" (UID: \"0302deea-bb98-4bfa-bdff-7aef61d7431f\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj"
Apr 16 22:11:55.505865 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:55.505816 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj"
Apr 16 22:11:55.620277 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:55.620239 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj"]
Apr 16 22:11:56.599319 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:56.599272 2576 generic.go:358] "Generic (PLEG): container finished" podID="0302deea-bb98-4bfa-bdff-7aef61d7431f" containerID="ea8733b101e9c82f1404cab8a73fdc9b0990193e5b90ac3e3cff785b7571f9d4" exitCode=0
Apr 16 22:11:56.599660 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:56.599366 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj" event={"ID":"0302deea-bb98-4bfa-bdff-7aef61d7431f","Type":"ContainerDied","Data":"ea8733b101e9c82f1404cab8a73fdc9b0990193e5b90ac3e3cff785b7571f9d4"}
Apr 16 22:11:56.599660 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:56.599412 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj" event={"ID":"0302deea-bb98-4bfa-bdff-7aef61d7431f","Type":"ContainerStarted","Data":"2ab691b67cfb71ee26ac3f85320c9974298485b6b3d4cb38411067b54c9b0689"}
Apr 16 22:11:57.603682 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:57.603653 2576 generic.go:358] "Generic (PLEG): container finished" podID="0302deea-bb98-4bfa-bdff-7aef61d7431f" containerID="1e4a17fd8b60eb3166431fe921352da3ce7937bb8e9c43dc689dbef645d410e1" exitCode=0
Apr 16 22:11:57.604043 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:57.603707 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj" event={"ID":"0302deea-bb98-4bfa-bdff-7aef61d7431f","Type":"ContainerDied","Data":"1e4a17fd8b60eb3166431fe921352da3ce7937bb8e9c43dc689dbef645d410e1"}
Apr 16 22:11:58.608159 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:58.608125 2576 generic.go:358] "Generic (PLEG): container finished" podID="0302deea-bb98-4bfa-bdff-7aef61d7431f" containerID="9762bdbb6be3e36467385b81c2a4413666575c68c2cd3c5cfc177a42bcbf92da" exitCode=0
Apr 16 22:11:58.608570 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:58.608220 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj" event={"ID":"0302deea-bb98-4bfa-bdff-7aef61d7431f","Type":"ContainerDied","Data":"9762bdbb6be3e36467385b81c2a4413666575c68c2cd3c5cfc177a42bcbf92da"}
Apr 16 22:11:59.732984 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:59.732962 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj"
Apr 16 22:11:59.838774 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:59.838747 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0302deea-bb98-4bfa-bdff-7aef61d7431f-util\") pod \"0302deea-bb98-4bfa-bdff-7aef61d7431f\" (UID: \"0302deea-bb98-4bfa-bdff-7aef61d7431f\") "
Apr 16 22:11:59.838934 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:59.838795 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlvgk\" (UniqueName: \"kubernetes.io/projected/0302deea-bb98-4bfa-bdff-7aef61d7431f-kube-api-access-mlvgk\") pod \"0302deea-bb98-4bfa-bdff-7aef61d7431f\" (UID: \"0302deea-bb98-4bfa-bdff-7aef61d7431f\") "
Apr 16 22:11:59.838934 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:59.838844 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0302deea-bb98-4bfa-bdff-7aef61d7431f-bundle\") pod \"0302deea-bb98-4bfa-bdff-7aef61d7431f\" (UID: \"0302deea-bb98-4bfa-bdff-7aef61d7431f\") "
Apr 16 22:11:59.839614 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:59.839578 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0302deea-bb98-4bfa-bdff-7aef61d7431f-bundle" (OuterVolumeSpecName: "bundle") pod "0302deea-bb98-4bfa-bdff-7aef61d7431f" (UID: "0302deea-bb98-4bfa-bdff-7aef61d7431f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:11:59.840919 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:59.840896 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0302deea-bb98-4bfa-bdff-7aef61d7431f-kube-api-access-mlvgk" (OuterVolumeSpecName: "kube-api-access-mlvgk") pod "0302deea-bb98-4bfa-bdff-7aef61d7431f" (UID: "0302deea-bb98-4bfa-bdff-7aef61d7431f"). InnerVolumeSpecName "kube-api-access-mlvgk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 22:11:59.844084 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:59.844059 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0302deea-bb98-4bfa-bdff-7aef61d7431f-util" (OuterVolumeSpecName: "util") pod "0302deea-bb98-4bfa-bdff-7aef61d7431f" (UID: "0302deea-bb98-4bfa-bdff-7aef61d7431f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 22:11:59.939938 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:59.939874 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0302deea-bb98-4bfa-bdff-7aef61d7431f-bundle\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\""
Apr 16 22:11:59.939938 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:59.939906 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0302deea-bb98-4bfa-bdff-7aef61d7431f-util\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\""
Apr 16 22:11:59.939938 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:11:59.939917 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mlvgk\" (UniqueName: \"kubernetes.io/projected/0302deea-bb98-4bfa-bdff-7aef61d7431f-kube-api-access-mlvgk\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\""
Apr 16 22:12:00.616500 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:00.616466 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj" event={"ID":"0302deea-bb98-4bfa-bdff-7aef61d7431f","Type":"ContainerDied","Data":"2ab691b67cfb71ee26ac3f85320c9974298485b6b3d4cb38411067b54c9b0689"}
Apr 16 22:12:00.616500 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:00.616503 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ab691b67cfb71ee26ac3f85320c9974298485b6b3d4cb38411067b54c9b0689"
Apr 16 22:12:00.616694 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:00.616538 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5r9hpj"
Apr 16 22:12:01.207605 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.207569 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh"]
Apr 16 22:12:01.208046 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.208001 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0302deea-bb98-4bfa-bdff-7aef61d7431f" containerName="extract"
Apr 16 22:12:01.208046 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.208018 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0302deea-bb98-4bfa-bdff-7aef61d7431f" containerName="extract"
Apr 16 22:12:01.208046 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.208032 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0302deea-bb98-4bfa-bdff-7aef61d7431f" containerName="util"
Apr 16 22:12:01.208046 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.208039 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0302deea-bb98-4bfa-bdff-7aef61d7431f" containerName="util"
Apr 16 22:12:01.208244 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.208063 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0302deea-bb98-4bfa-bdff-7aef61d7431f" containerName="pull"
Apr 16 22:12:01.208244 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.208072 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0302deea-bb98-4bfa-bdff-7aef61d7431f" containerName="pull"
Apr 16 22:12:01.208244 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.208158 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0302deea-bb98-4bfa-bdff-7aef61d7431f" containerName="extract"
Apr 16 22:12:01.212264 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.212243 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh"
Apr 16 22:12:01.214475 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.214456 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 22:12:01.215155 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.215134 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 22:12:01.215238 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.215136 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 22:12:01.215238 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.215158 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 22:12:01.215238 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.215210 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 22:12:01.215383 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.215328 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-d5qw4\""
Apr 16 22:12:01.218500 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.218475 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh"]
Apr 16 22:12:01.352004 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.351968 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81ce82ca-5118-4f08-bf3c-381fe90dadb1-cert\") pod \"lws-controller-manager-fd7d9b88b-cdwxh\" (UID: \"81ce82ca-5118-4f08-bf3c-381fe90dadb1\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh"
Apr 16 22:12:01.352177 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.352040 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/81ce82ca-5118-4f08-bf3c-381fe90dadb1-manager-config\") pod \"lws-controller-manager-fd7d9b88b-cdwxh\" (UID: \"81ce82ca-5118-4f08-bf3c-381fe90dadb1\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh"
Apr 16 22:12:01.352177 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.352062 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cslb\" (UniqueName: \"kubernetes.io/projected/81ce82ca-5118-4f08-bf3c-381fe90dadb1-kube-api-access-8cslb\") pod \"lws-controller-manager-fd7d9b88b-cdwxh\" (UID: \"81ce82ca-5118-4f08-bf3c-381fe90dadb1\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh"
Apr 16 22:12:01.352177 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.352080 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/81ce82ca-5118-4f08-bf3c-381fe90dadb1-metrics-cert\") pod \"lws-controller-manager-fd7d9b88b-cdwxh\" (UID: \"81ce82ca-5118-4f08-bf3c-381fe90dadb1\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh"
Apr 16 22:12:01.452484 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.452454 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81ce82ca-5118-4f08-bf3c-381fe90dadb1-cert\") pod \"lws-controller-manager-fd7d9b88b-cdwxh\" (UID: \"81ce82ca-5118-4f08-bf3c-381fe90dadb1\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh"
Apr 16 22:12:01.452661 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.452516 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/81ce82ca-5118-4f08-bf3c-381fe90dadb1-manager-config\") pod \"lws-controller-manager-fd7d9b88b-cdwxh\" (UID: \"81ce82ca-5118-4f08-bf3c-381fe90dadb1\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh"
Apr 16 22:12:01.452661 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.452546 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8cslb\" (UniqueName: \"kubernetes.io/projected/81ce82ca-5118-4f08-bf3c-381fe90dadb1-kube-api-access-8cslb\") pod \"lws-controller-manager-fd7d9b88b-cdwxh\" (UID: \"81ce82ca-5118-4f08-bf3c-381fe90dadb1\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh"
Apr 16 22:12:01.452661 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.452565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/81ce82ca-5118-4f08-bf3c-381fe90dadb1-metrics-cert\") pod \"lws-controller-manager-fd7d9b88b-cdwxh\" (UID: \"81ce82ca-5118-4f08-bf3c-381fe90dadb1\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh"
Apr 16 22:12:01.453275 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.453252 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/81ce82ca-5118-4f08-bf3c-381fe90dadb1-manager-config\") pod \"lws-controller-manager-fd7d9b88b-cdwxh\" (UID: \"81ce82ca-5118-4f08-bf3c-381fe90dadb1\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh"
Apr 16 22:12:01.454939 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.454916 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/81ce82ca-5118-4f08-bf3c-381fe90dadb1-metrics-cert\") pod \"lws-controller-manager-fd7d9b88b-cdwxh\" (UID: \"81ce82ca-5118-4f08-bf3c-381fe90dadb1\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh"
Apr 16 22:12:01.455035 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.454952 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/81ce82ca-5118-4f08-bf3c-381fe90dadb1-cert\") pod \"lws-controller-manager-fd7d9b88b-cdwxh\" (UID: \"81ce82ca-5118-4f08-bf3c-381fe90dadb1\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh"
Apr 16 22:12:01.459601 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.459546 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cslb\" (UniqueName: \"kubernetes.io/projected/81ce82ca-5118-4f08-bf3c-381fe90dadb1-kube-api-access-8cslb\") pod \"lws-controller-manager-fd7d9b88b-cdwxh\" (UID: \"81ce82ca-5118-4f08-bf3c-381fe90dadb1\") " pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh"
Apr 16 22:12:01.522731 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.522702 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh"
Apr 16 22:12:01.638770 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:01.638742 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh"]
Apr 16 22:12:01.642449 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:12:01.642422 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81ce82ca_5118_4f08_bf3c_381fe90dadb1.slice/crio-46bf4acdba99b9981c272b972c8f45a4e66d71fa4b1ea6e99e4ad85da036b7e6 WatchSource:0}: Error finding container 46bf4acdba99b9981c272b972c8f45a4e66d71fa4b1ea6e99e4ad85da036b7e6: Status 404 returned error can't find the container with id 46bf4acdba99b9981c272b972c8f45a4e66d71fa4b1ea6e99e4ad85da036b7e6
Apr 16 22:12:02.624939 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:02.624903 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh" event={"ID":"81ce82ca-5118-4f08-bf3c-381fe90dadb1","Type":"ContainerStarted","Data":"46bf4acdba99b9981c272b972c8f45a4e66d71fa4b1ea6e99e4ad85da036b7e6"}
Apr 16 22:12:05.636984 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:05.636903 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh" event={"ID":"81ce82ca-5118-4f08-bf3c-381fe90dadb1","Type":"ContainerStarted","Data":"4aa184549fbf62633c7cd4634a41e4b252c31f43f465594fa42b6611043154d0"}
Apr 16 22:12:05.636984 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:05.636960 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh"
Apr 16 22:12:05.651741 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:05.651693 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh" podStartSLOduration=1.5232442210000001 podStartE2EDuration="4.651679746s" podCreationTimestamp="2026-04-16 22:12:01 +0000 UTC" firstStartedPulling="2026-04-16 22:12:01.644238995 +0000 UTC m=+425.904965078" lastFinishedPulling="2026-04-16 22:12:04.772674516 +0000 UTC m=+429.033400603" observedRunningTime="2026-04-16 22:12:05.650519462 +0000 UTC m=+429.911245565" watchObservedRunningTime="2026-04-16 22:12:05.651679746 +0000 UTC m=+429.912405850"
Apr 16 22:12:12.138807 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.138776 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-vprm4"]
Apr 16 22:12:12.141883 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.141868 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-vprm4"
Apr 16 22:12:12.143983 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.143959 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 16 22:12:12.144119 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.144007 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 16 22:12:12.144297 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.144281 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-52lvq\""
Apr 16 22:12:12.144692 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.144672 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 16 22:12:12.144890 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.144859 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 16 22:12:12.154811 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.154790 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-vprm4"]
Apr 16 22:12:12.237251 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.237218 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99-apiservice-cert\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-vprm4\" (UID: \"eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-vprm4"
Apr 16 22:12:12.237423 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.237259 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99-webhook-cert\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-vprm4\" (UID: \"eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-vprm4"
Apr 16 22:12:12.237423 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.237399 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pv8t\" (UniqueName: \"kubernetes.io/projected/eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99-kube-api-access-8pv8t\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-vprm4\" (UID: \"eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-vprm4"
Apr 16 22:12:12.338117 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.338082 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99-apiservice-cert\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-vprm4\" (UID: \"eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-vprm4"
Apr 16 22:12:12.338117 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.338123 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99-webhook-cert\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-vprm4\" (UID: \"eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-vprm4"
Apr 16 22:12:12.338363 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.338160 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8pv8t\" (UniqueName: \"kubernetes.io/projected/eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99-kube-api-access-8pv8t\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-vprm4\" (UID: \"eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-vprm4"
Apr 16 22:12:12.340699 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.340668 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99-webhook-cert\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-vprm4\" (UID: \"eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-vprm4"
Apr 16 22:12:12.340810 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.340708 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99-apiservice-cert\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-vprm4\" (UID: \"eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-vprm4"
Apr 16 22:12:12.346558 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.346530 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pv8t\" (UniqueName: \"kubernetes.io/projected/eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99-kube-api-access-8pv8t\") pod \"opendatahub-operator-controller-manager-674f8cc5cf-vprm4\" (UID: \"eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99\") " pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-vprm4"
Apr 16 22:12:12.452916 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.452825 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-vprm4"
Apr 16 22:12:12.577794 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.577772 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-vprm4"]
Apr 16 22:12:12.580658 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:12:12.580626 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb4d92c0_6bd2_45be_9dd6_28bb8b0b5a99.slice/crio-ceae3b1dfb38b3284911decd775b14cd836e007b79a28a3fb415b3d832ef120e WatchSource:0}: Error finding container ceae3b1dfb38b3284911decd775b14cd836e007b79a28a3fb415b3d832ef120e: Status 404 returned error can't find the container with id ceae3b1dfb38b3284911decd775b14cd836e007b79a28a3fb415b3d832ef120e
Apr 16 22:12:12.659072 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:12.659041 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-vprm4" event={"ID":"eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99","Type":"ContainerStarted","Data":"ceae3b1dfb38b3284911decd775b14cd836e007b79a28a3fb415b3d832ef120e"}
Apr 16 22:12:15.671223 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:15.671189 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-vprm4" event={"ID":"eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99","Type":"ContainerStarted","Data":"3f6af9c9f2918ec6972b254dca3d3ca3c618efb5d23c017abb2363755f58ec4c"}
Apr 16 22:12:15.671639 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:15.671356 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-vprm4"
Apr 16 22:12:15.701433 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:15.701392 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-vprm4" podStartSLOduration=1.30428945 podStartE2EDuration="3.70137967s" podCreationTimestamp="2026-04-16 22:12:12 +0000 UTC" firstStartedPulling="2026-04-16 22:12:12.58237 +0000 UTC m=+436.843096083" lastFinishedPulling="2026-04-16 22:12:14.97946022 +0000 UTC m=+439.240186303" observedRunningTime="2026-04-16 22:12:15.700333792 +0000 UTC m=+439.961059895" watchObservedRunningTime="2026-04-16 22:12:15.70137967 +0000 UTC m=+439.962105803"
Apr 16 22:12:16.642912 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:16.642881 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-fd7d9b88b-cdwxh"
Apr 16 22:12:26.677322 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:12:26.677278 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-674f8cc5cf-vprm4"
Apr 16 22:13:03.168379 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.168346 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw"]
Apr 16 22:13:03.177725 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.177695 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw"
Apr 16 22:13:03.179933 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.179908 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 16 22:13:03.180071 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.179910 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-4hjxp\""
Apr 16 22:13:03.183871 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.183840 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw"]
Apr 16 22:13:03.330735 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.330694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw"
Apr 16 22:13:03.330889 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.330743 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw"
Apr 16 22:13:03.330889 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.330814 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.330889 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.330852 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.330889 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.330881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.331085 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.330914 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.331085 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.330979 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-klrfx\" (UniqueName: \"kubernetes.io/projected/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-kube-api-access-klrfx\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.331085 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.331046 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.331207 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.331086 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.431866 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.431788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klrfx\" (UniqueName: \"kubernetes.io/projected/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-kube-api-access-klrfx\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.431866 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.431836 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.432065 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.431866 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.432065 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.431908 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.432065 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.431939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.432065 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.431986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"istio-token\" (UniqueName: \"kubernetes.io/projected/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.432271 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.432098 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.432271 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.432136 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.432271 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.432161 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.432454 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.432269 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.432574 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.432546 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.432574 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.432563 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.432998 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.432968 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.433361 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.433326 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-istiod-ca-cert\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.434544 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.434514 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.434709 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.434690 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.438752 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.438725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: \"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.438856 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.438841 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klrfx\" (UniqueName: \"kubernetes.io/projected/37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd-kube-api-access-klrfx\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw\" (UID: 
\"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.489810 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.489784 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:03.624483 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.624458 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw"] Apr 16 22:13:03.626403 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:13:03.626367 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37b7d56d_46e5_4e3f_8212_92a0cfbaa0fd.slice/crio-1e4805298110e5774734e7381344ac62602b5f9c7fc13c1696871f65beb15eee WatchSource:0}: Error finding container 1e4805298110e5774734e7381344ac62602b5f9c7fc13c1696871f65beb15eee: Status 404 returned error can't find the container with id 1e4805298110e5774734e7381344ac62602b5f9c7fc13c1696871f65beb15eee Apr 16 22:13:03.829912 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:03.829830 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" event={"ID":"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd","Type":"ContainerStarted","Data":"1e4805298110e5774734e7381344ac62602b5f9c7fc13c1696871f65beb15eee"} Apr 16 22:13:06.455536 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:06.455485 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 22:13:06.455838 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:06.455649 2576 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 22:13:06.455838 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:06.455702 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 22:13:06.842235 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:06.842197 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" event={"ID":"37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd","Type":"ContainerStarted","Data":"dd5fd9fba4407a4961d4124588b90a77c1b59faa416c9e77461ce9c99c1ca674"} Apr 16 22:13:06.859376 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:06.859324 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" podStartSLOduration=1.032339919 podStartE2EDuration="3.859290498s" podCreationTimestamp="2026-04-16 22:13:03 +0000 UTC" firstStartedPulling="2026-04-16 22:13:03.628219044 +0000 UTC m=+487.888945127" lastFinishedPulling="2026-04-16 22:13:06.455169619 +0000 UTC m=+490.715895706" observedRunningTime="2026-04-16 22:13:06.858502799 +0000 UTC m=+491.119228905" watchObservedRunningTime="2026-04-16 22:13:06.859290498 +0000 UTC m=+491.120016604" Apr 16 22:13:07.491023 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:07.490985 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:07.495677 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:07.495653 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:07.845814 ip-10-0-130-26 
kubenswrapper[2576]: I0416 22:13:07.845782 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:07.846679 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:07.846660 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw" Apr 16 22:13:29.546620 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:29.546581 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-smdg9"] Apr 16 22:13:29.556976 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:29.556948 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-smdg9" Apr 16 22:13:29.557502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:29.557479 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-smdg9"] Apr 16 22:13:29.561589 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:29.561529 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 22:13:29.562326 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:29.562290 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 22:13:29.562436 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:29.562293 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-dhpmv\"" Apr 16 22:13:29.635674 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:29.635644 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrjlp\" (UniqueName: \"kubernetes.io/projected/ee83595a-7942-4f06-b22c-61a388ac766b-kube-api-access-wrjlp\") pod 
\"kuadrant-operator-catalog-smdg9\" (UID: \"ee83595a-7942-4f06-b22c-61a388ac766b\") " pod="kuadrant-system/kuadrant-operator-catalog-smdg9" Apr 16 22:13:29.736939 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:29.736902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrjlp\" (UniqueName: \"kubernetes.io/projected/ee83595a-7942-4f06-b22c-61a388ac766b-kube-api-access-wrjlp\") pod \"kuadrant-operator-catalog-smdg9\" (UID: \"ee83595a-7942-4f06-b22c-61a388ac766b\") " pod="kuadrant-system/kuadrant-operator-catalog-smdg9" Apr 16 22:13:29.744344 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:29.744325 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrjlp\" (UniqueName: \"kubernetes.io/projected/ee83595a-7942-4f06-b22c-61a388ac766b-kube-api-access-wrjlp\") pod \"kuadrant-operator-catalog-smdg9\" (UID: \"ee83595a-7942-4f06-b22c-61a388ac766b\") " pod="kuadrant-system/kuadrant-operator-catalog-smdg9" Apr 16 22:13:29.868277 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:29.868239 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-smdg9" Apr 16 22:13:29.920929 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:29.920899 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-smdg9"] Apr 16 22:13:29.987119 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:29.987084 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-smdg9"] Apr 16 22:13:29.990344 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:13:29.990263 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee83595a_7942_4f06_b22c_61a388ac766b.slice/crio-ac8daacbaee631d3c345cea1168e41eb6ffd90861d65ea12a241418d286eb181 WatchSource:0}: Error finding container ac8daacbaee631d3c345cea1168e41eb6ffd90861d65ea12a241418d286eb181: Status 404 returned error can't find the container with id ac8daacbaee631d3c345cea1168e41eb6ffd90861d65ea12a241418d286eb181 Apr 16 22:13:30.127529 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:30.127452 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-5n2k9"] Apr 16 22:13:30.132534 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:30.132514 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-5n2k9" Apr 16 22:13:30.137808 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:30.137783 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-5n2k9"] Apr 16 22:13:30.240267 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:30.240237 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-676tg\" (UniqueName: \"kubernetes.io/projected/5526f190-7cc2-4eb3-9a88-a2565891302d-kube-api-access-676tg\") pod \"kuadrant-operator-catalog-5n2k9\" (UID: \"5526f190-7cc2-4eb3-9a88-a2565891302d\") " pod="kuadrant-system/kuadrant-operator-catalog-5n2k9" Apr 16 22:13:30.341739 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:30.341701 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-676tg\" (UniqueName: \"kubernetes.io/projected/5526f190-7cc2-4eb3-9a88-a2565891302d-kube-api-access-676tg\") pod \"kuadrant-operator-catalog-5n2k9\" (UID: \"5526f190-7cc2-4eb3-9a88-a2565891302d\") " pod="kuadrant-system/kuadrant-operator-catalog-5n2k9" Apr 16 22:13:30.349379 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:30.349352 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-676tg\" (UniqueName: \"kubernetes.io/projected/5526f190-7cc2-4eb3-9a88-a2565891302d-kube-api-access-676tg\") pod \"kuadrant-operator-catalog-5n2k9\" (UID: \"5526f190-7cc2-4eb3-9a88-a2565891302d\") " pod="kuadrant-system/kuadrant-operator-catalog-5n2k9" Apr 16 22:13:30.442882 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:30.442793 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-5n2k9" Apr 16 22:13:30.564046 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:30.564010 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-5n2k9"] Apr 16 22:13:30.585378 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:13:30.585337 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5526f190_7cc2_4eb3_9a88_a2565891302d.slice/crio-57f9a95743f480c6f8ca876a7365de9d3aa8adcabd3168d9a8428318c8b351c3 WatchSource:0}: Error finding container 57f9a95743f480c6f8ca876a7365de9d3aa8adcabd3168d9a8428318c8b351c3: Status 404 returned error can't find the container with id 57f9a95743f480c6f8ca876a7365de9d3aa8adcabd3168d9a8428318c8b351c3 Apr 16 22:13:30.925919 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:30.925884 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-5n2k9" event={"ID":"5526f190-7cc2-4eb3-9a88-a2565891302d","Type":"ContainerStarted","Data":"57f9a95743f480c6f8ca876a7365de9d3aa8adcabd3168d9a8428318c8b351c3"} Apr 16 22:13:30.926890 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:30.926864 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-smdg9" event={"ID":"ee83595a-7942-4f06-b22c-61a388ac766b","Type":"ContainerStarted","Data":"ac8daacbaee631d3c345cea1168e41eb6ffd90861d65ea12a241418d286eb181"} Apr 16 22:13:32.936052 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:32.936009 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-5n2k9" event={"ID":"5526f190-7cc2-4eb3-9a88-a2565891302d","Type":"ContainerStarted","Data":"3a3db8b6f6c1edb7a86551b348dc391a0786b62c60a92719e4f0eaa9e73674d3"} Apr 16 22:13:32.937372 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:32.937350 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-catalog-smdg9" event={"ID":"ee83595a-7942-4f06-b22c-61a388ac766b","Type":"ContainerStarted","Data":"87fb7328bafc47fbf5b1e14ef855a4eb28ab9e37be5fc8184f2da8d892778ff4"} Apr 16 22:13:32.937508 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:32.937466 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-smdg9" podUID="ee83595a-7942-4f06-b22c-61a388ac766b" containerName="registry-server" containerID="cri-o://87fb7328bafc47fbf5b1e14ef855a4eb28ab9e37be5fc8184f2da8d892778ff4" gracePeriod=2 Apr 16 22:13:32.950811 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:32.950772 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-5n2k9" podStartSLOduration=1.423007449 podStartE2EDuration="2.950758414s" podCreationTimestamp="2026-04-16 22:13:30 +0000 UTC" firstStartedPulling="2026-04-16 22:13:30.586819345 +0000 UTC m=+514.847545427" lastFinishedPulling="2026-04-16 22:13:32.114570291 +0000 UTC m=+516.375296392" observedRunningTime="2026-04-16 22:13:32.94893371 +0000 UTC m=+517.209659814" watchObservedRunningTime="2026-04-16 22:13:32.950758414 +0000 UTC m=+517.211484518" Apr 16 22:13:33.191671 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:33.191643 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-smdg9" Apr 16 22:13:33.268133 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:33.268103 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrjlp\" (UniqueName: \"kubernetes.io/projected/ee83595a-7942-4f06-b22c-61a388ac766b-kube-api-access-wrjlp\") pod \"ee83595a-7942-4f06-b22c-61a388ac766b\" (UID: \"ee83595a-7942-4f06-b22c-61a388ac766b\") " Apr 16 22:13:33.270166 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:33.270142 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee83595a-7942-4f06-b22c-61a388ac766b-kube-api-access-wrjlp" (OuterVolumeSpecName: "kube-api-access-wrjlp") pod "ee83595a-7942-4f06-b22c-61a388ac766b" (UID: "ee83595a-7942-4f06-b22c-61a388ac766b"). InnerVolumeSpecName "kube-api-access-wrjlp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:13:33.368870 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:33.368836 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wrjlp\" (UniqueName: \"kubernetes.io/projected/ee83595a-7942-4f06-b22c-61a388ac766b-kube-api-access-wrjlp\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:13:33.941731 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:33.941694 2576 generic.go:358] "Generic (PLEG): container finished" podID="ee83595a-7942-4f06-b22c-61a388ac766b" containerID="87fb7328bafc47fbf5b1e14ef855a4eb28ab9e37be5fc8184f2da8d892778ff4" exitCode=0 Apr 16 22:13:33.942157 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:33.941747 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-smdg9"
Apr 16 22:13:33.942157 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:33.941773 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-smdg9" event={"ID":"ee83595a-7942-4f06-b22c-61a388ac766b","Type":"ContainerDied","Data":"87fb7328bafc47fbf5b1e14ef855a4eb28ab9e37be5fc8184f2da8d892778ff4"}
Apr 16 22:13:33.942157 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:33.941805 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-smdg9" event={"ID":"ee83595a-7942-4f06-b22c-61a388ac766b","Type":"ContainerDied","Data":"ac8daacbaee631d3c345cea1168e41eb6ffd90861d65ea12a241418d286eb181"}
Apr 16 22:13:33.942157 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:33.941820 2576 scope.go:117] "RemoveContainer" containerID="87fb7328bafc47fbf5b1e14ef855a4eb28ab9e37be5fc8184f2da8d892778ff4"
Apr 16 22:13:33.951046 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:33.951024 2576 scope.go:117] "RemoveContainer" containerID="87fb7328bafc47fbf5b1e14ef855a4eb28ab9e37be5fc8184f2da8d892778ff4"
Apr 16 22:13:33.951279 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:13:33.951260 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87fb7328bafc47fbf5b1e14ef855a4eb28ab9e37be5fc8184f2da8d892778ff4\": container with ID starting with 87fb7328bafc47fbf5b1e14ef855a4eb28ab9e37be5fc8184f2da8d892778ff4 not found: ID does not exist" containerID="87fb7328bafc47fbf5b1e14ef855a4eb28ab9e37be5fc8184f2da8d892778ff4"
Apr 16 22:13:33.951362 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:33.951288 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87fb7328bafc47fbf5b1e14ef855a4eb28ab9e37be5fc8184f2da8d892778ff4"} err="failed to get container status \"87fb7328bafc47fbf5b1e14ef855a4eb28ab9e37be5fc8184f2da8d892778ff4\": rpc error: code = NotFound desc = could not find container \"87fb7328bafc47fbf5b1e14ef855a4eb28ab9e37be5fc8184f2da8d892778ff4\": container with ID starting with 87fb7328bafc47fbf5b1e14ef855a4eb28ab9e37be5fc8184f2da8d892778ff4 not found: ID does not exist"
Apr 16 22:13:33.960744 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:33.960725 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-smdg9"]
Apr 16 22:13:33.963990 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:33.963971 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-smdg9"]
Apr 16 22:13:34.282388 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:34.282293 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee83595a-7942-4f06-b22c-61a388ac766b" path="/var/lib/kubelet/pods/ee83595a-7942-4f06-b22c-61a388ac766b/volumes"
Apr 16 22:13:40.443230 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:40.443145 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-5n2k9"
Apr 16 22:13:40.443230 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:40.443224 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-5n2k9"
Apr 16 22:13:40.464948 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:40.464922 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-5n2k9"
Apr 16 22:13:40.986569 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:40.986542 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-5n2k9"
Apr 16 22:13:44.683857 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:44.683822 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg"]
Apr 16 22:13:44.684230 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:44.684132 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee83595a-7942-4f06-b22c-61a388ac766b" containerName="registry-server"
Apr 16 22:13:44.684230 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:44.684142 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee83595a-7942-4f06-b22c-61a388ac766b" containerName="registry-server"
Apr 16 22:13:44.684230 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:44.684199 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee83595a-7942-4f06-b22c-61a388ac766b" containerName="registry-server"
Apr 16 22:13:44.688671 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:44.688630 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg"
Apr 16 22:13:44.691566 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:44.691498 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-xfwsw\""
Apr 16 22:13:44.693526 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:44.693501 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg"]
Apr 16 22:13:44.768700 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:44.768667 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqrvr\" (UniqueName: \"kubernetes.io/projected/ab32208f-6560-4abb-83f2-8cb550dbed35-kube-api-access-wqrvr\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg\" (UID: \"ab32208f-6560-4abb-83f2-8cb550dbed35\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg"
Apr 16 22:13:44.768882 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:44.768780 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab32208f-6560-4abb-83f2-8cb550dbed35-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg\" (UID: \"ab32208f-6560-4abb-83f2-8cb550dbed35\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg"
Apr 16 22:13:44.768882 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:44.768814 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab32208f-6560-4abb-83f2-8cb550dbed35-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg\" (UID: \"ab32208f-6560-4abb-83f2-8cb550dbed35\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg"
Apr 16 22:13:44.869985 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:44.869947 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab32208f-6560-4abb-83f2-8cb550dbed35-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg\" (UID: \"ab32208f-6560-4abb-83f2-8cb550dbed35\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg"
Apr 16 22:13:44.869985 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:44.869985 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab32208f-6560-4abb-83f2-8cb550dbed35-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg\" (UID: \"ab32208f-6560-4abb-83f2-8cb550dbed35\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg"
Apr 16 22:13:44.870181 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:44.870018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wqrvr\" (UniqueName: \"kubernetes.io/projected/ab32208f-6560-4abb-83f2-8cb550dbed35-kube-api-access-wqrvr\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg\" (UID: \"ab32208f-6560-4abb-83f2-8cb550dbed35\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg"
Apr 16 22:13:44.870451 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:44.870429 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab32208f-6560-4abb-83f2-8cb550dbed35-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg\" (UID: \"ab32208f-6560-4abb-83f2-8cb550dbed35\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg"
Apr 16 22:13:44.870500 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:44.870436 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab32208f-6560-4abb-83f2-8cb550dbed35-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg\" (UID: \"ab32208f-6560-4abb-83f2-8cb550dbed35\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg"
Apr 16 22:13:44.876928 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:44.876896 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqrvr\" (UniqueName: \"kubernetes.io/projected/ab32208f-6560-4abb-83f2-8cb550dbed35-kube-api-access-wqrvr\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg\" (UID: \"ab32208f-6560-4abb-83f2-8cb550dbed35\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg"
Apr 16 22:13:44.999402 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:44.999319 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg"
Apr 16 22:13:45.117482 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.117456 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg"]
Apr 16 22:13:45.119603 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:13:45.119575 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab32208f_6560_4abb_83f2_8cb550dbed35.slice/crio-12fc9849b2937d36ec4962a99ce477498f73bfc8e72ec75620ed9bfe571196fd WatchSource:0}: Error finding container 12fc9849b2937d36ec4962a99ce477498f73bfc8e72ec75620ed9bfe571196fd: Status 404 returned error can't find the container with id 12fc9849b2937d36ec4962a99ce477498f73bfc8e72ec75620ed9bfe571196fd
Apr 16 22:13:45.482758 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.482726 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h"]
Apr 16 22:13:45.486198 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.486176 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h"
Apr 16 22:13:45.492119 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.492090 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h"]
Apr 16 22:13:45.576078 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.576040 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt4pq\" (UniqueName: \"kubernetes.io/projected/44c36f49-c527-45ce-b977-372564bbf626-kube-api-access-mt4pq\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h\" (UID: \"44c36f49-c527-45ce-b977-372564bbf626\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h"
Apr 16 22:13:45.576239 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.576122 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44c36f49-c527-45ce-b977-372564bbf626-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h\" (UID: \"44c36f49-c527-45ce-b977-372564bbf626\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h"
Apr 16 22:13:45.576239 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.576171 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44c36f49-c527-45ce-b977-372564bbf626-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h\" (UID: \"44c36f49-c527-45ce-b977-372564bbf626\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h"
Apr 16 22:13:45.676831 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.676791 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44c36f49-c527-45ce-b977-372564bbf626-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h\" (UID: \"44c36f49-c527-45ce-b977-372564bbf626\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h"
Apr 16 22:13:45.677022 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.676872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mt4pq\" (UniqueName: \"kubernetes.io/projected/44c36f49-c527-45ce-b977-372564bbf626-kube-api-access-mt4pq\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h\" (UID: \"44c36f49-c527-45ce-b977-372564bbf626\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h"
Apr 16 22:13:45.677022 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.676940 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44c36f49-c527-45ce-b977-372564bbf626-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h\" (UID: \"44c36f49-c527-45ce-b977-372564bbf626\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h"
Apr 16 22:13:45.677192 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.677165 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44c36f49-c527-45ce-b977-372564bbf626-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h\" (UID: \"44c36f49-c527-45ce-b977-372564bbf626\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h"
Apr 16 22:13:45.677289 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.677270 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44c36f49-c527-45ce-b977-372564bbf626-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h\" (UID: \"44c36f49-c527-45ce-b977-372564bbf626\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h"
Apr 16 22:13:45.684795 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.684775 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt4pq\" (UniqueName: \"kubernetes.io/projected/44c36f49-c527-45ce-b977-372564bbf626-kube-api-access-mt4pq\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h\" (UID: \"44c36f49-c527-45ce-b977-372564bbf626\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h"
Apr 16 22:13:45.796990 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.796893 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h"
Apr 16 22:13:45.916944 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.916918 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h"]
Apr 16 22:13:45.919065 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:13:45.919040 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44c36f49_c527_45ce_b977_372564bbf626.slice/crio-ffeee2aa1a72e6ede03d08c80eca69fb81a732b8347d9efde8a8cb880b95ada7 WatchSource:0}: Error finding container ffeee2aa1a72e6ede03d08c80eca69fb81a732b8347d9efde8a8cb880b95ada7: Status 404 returned error can't find the container with id ffeee2aa1a72e6ede03d08c80eca69fb81a732b8347d9efde8a8cb880b95ada7
Apr 16 22:13:45.982487 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.982454 2576 generic.go:358] "Generic (PLEG): container finished" podID="ab32208f-6560-4abb-83f2-8cb550dbed35" containerID="ed51a4f7a56162d58c8fb55481a87d89f004bdc03a44c2ab9119b95f758007d1" exitCode=0
Apr 16 22:13:45.982671 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.982535 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg" event={"ID":"ab32208f-6560-4abb-83f2-8cb550dbed35","Type":"ContainerDied","Data":"ed51a4f7a56162d58c8fb55481a87d89f004bdc03a44c2ab9119b95f758007d1"}
Apr 16 22:13:45.982671 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.982571 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg" event={"ID":"ab32208f-6560-4abb-83f2-8cb550dbed35","Type":"ContainerStarted","Data":"12fc9849b2937d36ec4962a99ce477498f73bfc8e72ec75620ed9bfe571196fd"}
Apr 16 22:13:45.983968 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.983945 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h" event={"ID":"44c36f49-c527-45ce-b977-372564bbf626","Type":"ContainerStarted","Data":"e709a552e006de4aed5d68b3736be9333482134de67b37a709ab82f318b1248a"}
Apr 16 22:13:45.984057 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:45.983976 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h" event={"ID":"44c36f49-c527-45ce-b977-372564bbf626","Type":"ContainerStarted","Data":"ffeee2aa1a72e6ede03d08c80eca69fb81a732b8347d9efde8a8cb880b95ada7"}
Apr 16 22:13:46.084159 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.084132 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt"]
Apr 16 22:13:46.087515 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.087497 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt"
Apr 16 22:13:46.094727 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.094702 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt"]
Apr 16 22:13:46.181083 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.181046 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/016deeab-866e-45c0-ba71-db4cd5f9149b-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt\" (UID: \"016deeab-866e-45c0-ba71-db4cd5f9149b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt"
Apr 16 22:13:46.181252 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.181111 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fjwl\" (UniqueName: \"kubernetes.io/projected/016deeab-866e-45c0-ba71-db4cd5f9149b-kube-api-access-6fjwl\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt\" (UID: \"016deeab-866e-45c0-ba71-db4cd5f9149b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt"
Apr 16 22:13:46.181252 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.181159 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/016deeab-866e-45c0-ba71-db4cd5f9149b-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt\" (UID: \"016deeab-866e-45c0-ba71-db4cd5f9149b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt"
Apr 16 22:13:46.281850 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.281821 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fjwl\" (UniqueName: \"kubernetes.io/projected/016deeab-866e-45c0-ba71-db4cd5f9149b-kube-api-access-6fjwl\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt\" (UID: \"016deeab-866e-45c0-ba71-db4cd5f9149b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt"
Apr 16 22:13:46.282032 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.281872 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/016deeab-866e-45c0-ba71-db4cd5f9149b-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt\" (UID: \"016deeab-866e-45c0-ba71-db4cd5f9149b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt"
Apr 16 22:13:46.282032 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.281905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/016deeab-866e-45c0-ba71-db4cd5f9149b-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt\" (UID: \"016deeab-866e-45c0-ba71-db4cd5f9149b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt"
Apr 16 22:13:46.282250 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.282226 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/016deeab-866e-45c0-ba71-db4cd5f9149b-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt\" (UID: \"016deeab-866e-45c0-ba71-db4cd5f9149b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt"
Apr 16 22:13:46.282356 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.282323 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/016deeab-866e-45c0-ba71-db4cd5f9149b-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt\" (UID: \"016deeab-866e-45c0-ba71-db4cd5f9149b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt"
Apr 16 22:13:46.289341 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.289296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fjwl\" (UniqueName: \"kubernetes.io/projected/016deeab-866e-45c0-ba71-db4cd5f9149b-kube-api-access-6fjwl\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt\" (UID: \"016deeab-866e-45c0-ba71-db4cd5f9149b\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt"
Apr 16 22:13:46.397486 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.397410 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt"
Apr 16 22:13:46.489510 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.489482 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7"]
Apr 16 22:13:46.494604 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.494583 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7"
Apr 16 22:13:46.500174 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.500148 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7"]
Apr 16 22:13:46.516570 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.516546 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt"]
Apr 16 22:13:46.518798 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:13:46.518775 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod016deeab_866e_45c0_ba71_db4cd5f9149b.slice/crio-3df05c97c07afb254a38e041c46dd766549ba2267e8b5bb82abf4ba4e904db5a WatchSource:0}: Error finding container 3df05c97c07afb254a38e041c46dd766549ba2267e8b5bb82abf4ba4e904db5a: Status 404 returned error can't find the container with id 3df05c97c07afb254a38e041c46dd766549ba2267e8b5bb82abf4ba4e904db5a
Apr 16 22:13:46.584472 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.584442 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13dcea9f-b75a-429a-bd14-f3a952ab88c9-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7\" (UID: \"13dcea9f-b75a-429a-bd14-f3a952ab88c9\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7"
Apr 16 22:13:46.584589 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.584486 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tdbq\" (UniqueName: \"kubernetes.io/projected/13dcea9f-b75a-429a-bd14-f3a952ab88c9-kube-api-access-4tdbq\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7\" (UID: \"13dcea9f-b75a-429a-bd14-f3a952ab88c9\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7"
Apr 16 22:13:46.584589 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.584535 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13dcea9f-b75a-429a-bd14-f3a952ab88c9-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7\" (UID: \"13dcea9f-b75a-429a-bd14-f3a952ab88c9\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7"
Apr 16 22:13:46.685188 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.685103 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13dcea9f-b75a-429a-bd14-f3a952ab88c9-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7\" (UID: \"13dcea9f-b75a-429a-bd14-f3a952ab88c9\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7"
Apr 16 22:13:46.685188 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.685148 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4tdbq\" (UniqueName: \"kubernetes.io/projected/13dcea9f-b75a-429a-bd14-f3a952ab88c9-kube-api-access-4tdbq\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7\" (UID: \"13dcea9f-b75a-429a-bd14-f3a952ab88c9\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7"
Apr 16 22:13:46.685612 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.685342 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13dcea9f-b75a-429a-bd14-f3a952ab88c9-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7\" (UID: \"13dcea9f-b75a-429a-bd14-f3a952ab88c9\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7"
Apr 16 22:13:46.685612 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.685531 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13dcea9f-b75a-429a-bd14-f3a952ab88c9-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7\" (UID: \"13dcea9f-b75a-429a-bd14-f3a952ab88c9\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7"
Apr 16 22:13:46.685737 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.685713 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13dcea9f-b75a-429a-bd14-f3a952ab88c9-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7\" (UID: \"13dcea9f-b75a-429a-bd14-f3a952ab88c9\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7"
Apr 16 22:13:46.695161 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.695136 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tdbq\" (UniqueName: \"kubernetes.io/projected/13dcea9f-b75a-429a-bd14-f3a952ab88c9-kube-api-access-4tdbq\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7\" (UID: \"13dcea9f-b75a-429a-bd14-f3a952ab88c9\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7"
Apr 16 22:13:46.807203 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.807147 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7"
Apr 16 22:13:46.988691 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.988599 2576 generic.go:358] "Generic (PLEG): container finished" podID="44c36f49-c527-45ce-b977-372564bbf626" containerID="e709a552e006de4aed5d68b3736be9333482134de67b37a709ab82f318b1248a" exitCode=0
Apr 16 22:13:46.988853 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.988683 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h" event={"ID":"44c36f49-c527-45ce-b977-372564bbf626","Type":"ContainerDied","Data":"e709a552e006de4aed5d68b3736be9333482134de67b37a709ab82f318b1248a"}
Apr 16 22:13:46.990115 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.990096 2576 generic.go:358] "Generic (PLEG): container finished" podID="016deeab-866e-45c0-ba71-db4cd5f9149b" containerID="5ef25329cf253557f377d0a86a48e19e1e7a6413c4faf8dbecb9a840badb09de" exitCode=0
Apr 16 22:13:46.990207 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.990182 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt" event={"ID":"016deeab-866e-45c0-ba71-db4cd5f9149b","Type":"ContainerDied","Data":"5ef25329cf253557f377d0a86a48e19e1e7a6413c4faf8dbecb9a840badb09de"}
Apr 16 22:13:46.990258 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.990214 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt" event={"ID":"016deeab-866e-45c0-ba71-db4cd5f9149b","Type":"ContainerStarted","Data":"3df05c97c07afb254a38e041c46dd766549ba2267e8b5bb82abf4ba4e904db5a"}
Apr 16 22:13:46.992055 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.992037 2576 generic.go:358] "Generic (PLEG): container finished" podID="ab32208f-6560-4abb-83f2-8cb550dbed35" containerID="b085d95ca1e5e4c5f157890f8e5b4d3b7d97443cff4cbc75536b95448e79ecef" exitCode=0
Apr 16 22:13:46.992150 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:46.992076 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg" event={"ID":"ab32208f-6560-4abb-83f2-8cb550dbed35","Type":"ContainerDied","Data":"b085d95ca1e5e4c5f157890f8e5b4d3b7d97443cff4cbc75536b95448e79ecef"}
Apr 16 22:13:47.143025 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:47.142997 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7"]
Apr 16 22:13:47.167324 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:13:47.167265 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13dcea9f_b75a_429a_bd14_f3a952ab88c9.slice/crio-8e7057f0c8d2c4b25f22fbcaf14a7ffc3f8b311e3d4ff9915bb992000d390002 WatchSource:0}: Error finding container 8e7057f0c8d2c4b25f22fbcaf14a7ffc3f8b311e3d4ff9915bb992000d390002: Status 404 returned error can't find the container with id 8e7057f0c8d2c4b25f22fbcaf14a7ffc3f8b311e3d4ff9915bb992000d390002
Apr 16 22:13:47.996973 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:47.996883 2576 generic.go:358] "Generic (PLEG): container finished" podID="44c36f49-c527-45ce-b977-372564bbf626" containerID="ee85ff46ff8a48ae702a3904b693908a6d7d3d370b3d921a945ad43b45bde872" exitCode=0
Apr 16 22:13:47.996973 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:47.996951 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h" event={"ID":"44c36f49-c527-45ce-b977-372564bbf626","Type":"ContainerDied","Data":"ee85ff46ff8a48ae702a3904b693908a6d7d3d370b3d921a945ad43b45bde872"}
Apr 16 22:13:47.998183 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:47.998159 2576 generic.go:358] "Generic (PLEG): container finished" podID="13dcea9f-b75a-429a-bd14-f3a952ab88c9" containerID="5e9ee066cded61e34e4e3f0e649af8fc73c8a2534a5691039571e68483c3221d" exitCode=0
Apr 16 22:13:47.998299 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:47.998185 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7" event={"ID":"13dcea9f-b75a-429a-bd14-f3a952ab88c9","Type":"ContainerDied","Data":"5e9ee066cded61e34e4e3f0e649af8fc73c8a2534a5691039571e68483c3221d"}
Apr 16 22:13:47.998299 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:47.998214 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7" event={"ID":"13dcea9f-b75a-429a-bd14-f3a952ab88c9","Type":"ContainerStarted","Data":"8e7057f0c8d2c4b25f22fbcaf14a7ffc3f8b311e3d4ff9915bb992000d390002"}
Apr 16 22:13:48.000043 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:48.000019 2576 generic.go:358] "Generic (PLEG): container finished" podID="016deeab-866e-45c0-ba71-db4cd5f9149b" containerID="9e93834ee5c5dfee163782e06e3d3bbd00f8f03217de4afea61ed3d4b6a39dbf" exitCode=0
Apr 16 22:13:48.000130 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:48.000056 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt" event={"ID":"016deeab-866e-45c0-ba71-db4cd5f9149b","Type":"ContainerDied","Data":"9e93834ee5c5dfee163782e06e3d3bbd00f8f03217de4afea61ed3d4b6a39dbf"}
Apr 16 22:13:48.002170 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:48.002148 2576 generic.go:358] "Generic (PLEG): container finished" podID="ab32208f-6560-4abb-83f2-8cb550dbed35" containerID="a2a3f417635572fbe8928c59a77a01ecc76a82c8c9dc05be66a902c331dba7b9" exitCode=0
Apr 16 22:13:48.002257 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:48.002181 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg" event={"ID":"ab32208f-6560-4abb-83f2-8cb550dbed35","Type":"ContainerDied","Data":"a2a3f417635572fbe8928c59a77a01ecc76a82c8c9dc05be66a902c331dba7b9"}
Apr 16 22:13:49.007890 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:49.007799 2576 generic.go:358] "Generic (PLEG): container finished" podID="016deeab-866e-45c0-ba71-db4cd5f9149b" containerID="dbed3f2abb13b4e54a480fd0663cbd8225ecf6156ce6efca2b0faab380a833b2" exitCode=0
Apr 16 22:13:49.007890 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:49.007845 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt" event={"ID":"016deeab-866e-45c0-ba71-db4cd5f9149b","Type":"ContainerDied","Data":"dbed3f2abb13b4e54a480fd0663cbd8225ecf6156ce6efca2b0faab380a833b2"}
Apr 16 22:13:49.009740 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:49.009711 2576 generic.go:358] "Generic (PLEG): container finished" podID="44c36f49-c527-45ce-b977-372564bbf626" containerID="6bf6bd5a6fa9a5bf32db1cad873319e3691a9f176f09247aa565598c12da4c06" exitCode=0
Apr 16 22:13:49.009877 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:49.009776 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h" event={"ID":"44c36f49-c527-45ce-b977-372564bbf626","Type":"ContainerDied","Data":"6bf6bd5a6fa9a5bf32db1cad873319e3691a9f176f09247aa565598c12da4c06"}
Apr 16 22:13:49.011330 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:49.011270 2576 generic.go:358] "Generic (PLEG): container finished" podID="13dcea9f-b75a-429a-bd14-f3a952ab88c9" containerID="9f0b568ac7fc0e2ae184252276eaac8fa63a11936f7dba0c7adb8336fb3b0cd3" exitCode=0
Apr 16 22:13:49.011427 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:49.011332 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7" event={"ID":"13dcea9f-b75a-429a-bd14-f3a952ab88c9","Type":"ContainerDied","Data":"9f0b568ac7fc0e2ae184252276eaac8fa63a11936f7dba0c7adb8336fb3b0cd3"}
Apr 16 22:13:49.143838 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:49.143817 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg"
Apr 16 22:13:49.309111 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:49.309025 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqrvr\" (UniqueName: \"kubernetes.io/projected/ab32208f-6560-4abb-83f2-8cb550dbed35-kube-api-access-wqrvr\") pod \"ab32208f-6560-4abb-83f2-8cb550dbed35\" (UID: \"ab32208f-6560-4abb-83f2-8cb550dbed35\") "
Apr 16 22:13:49.309111 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:49.309071 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab32208f-6560-4abb-83f2-8cb550dbed35-util\") pod \"ab32208f-6560-4abb-83f2-8cb550dbed35\" (UID: \"ab32208f-6560-4abb-83f2-8cb550dbed35\") "
Apr 16 22:13:49.309297 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:49.309180 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab32208f-6560-4abb-83f2-8cb550dbed35-bundle\") pod \"ab32208f-6560-4abb-83f2-8cb550dbed35\" (UID: \"ab32208f-6560-4abb-83f2-8cb550dbed35\") "
Apr 16 22:13:49.309713 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:49.309684 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab32208f-6560-4abb-83f2-8cb550dbed35-bundle" (OuterVolumeSpecName: "bundle") pod "ab32208f-6560-4abb-83f2-8cb550dbed35" (UID: "ab32208f-6560-4abb-83f2-8cb550dbed35"). InnerVolumeSpecName "bundle".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:13:49.311216 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:49.311194 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab32208f-6560-4abb-83f2-8cb550dbed35-kube-api-access-wqrvr" (OuterVolumeSpecName: "kube-api-access-wqrvr") pod "ab32208f-6560-4abb-83f2-8cb550dbed35" (UID: "ab32208f-6560-4abb-83f2-8cb550dbed35"). InnerVolumeSpecName "kube-api-access-wqrvr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:13:49.314611 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:49.314590 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab32208f-6560-4abb-83f2-8cb550dbed35-util" (OuterVolumeSpecName: "util") pod "ab32208f-6560-4abb-83f2-8cb550dbed35" (UID: "ab32208f-6560-4abb-83f2-8cb550dbed35"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:13:49.410321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:49.410279 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wqrvr\" (UniqueName: \"kubernetes.io/projected/ab32208f-6560-4abb-83f2-8cb550dbed35-kube-api-access-wqrvr\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:13:49.410487 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:49.410343 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ab32208f-6560-4abb-83f2-8cb550dbed35-util\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:13:49.410487 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:49.410359 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ab32208f-6560-4abb-83f2-8cb550dbed35-bundle\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:13:50.017851 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.017812 2576 generic.go:358] "Generic (PLEG): 
container finished" podID="13dcea9f-b75a-429a-bd14-f3a952ab88c9" containerID="f9c364766037b89b8c5a9193777c3b55802d93504dc457f03ceff825dacf1ae5" exitCode=0 Apr 16 22:13:50.018279 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.017931 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7" event={"ID":"13dcea9f-b75a-429a-bd14-f3a952ab88c9","Type":"ContainerDied","Data":"f9c364766037b89b8c5a9193777c3b55802d93504dc457f03ceff825dacf1ae5"} Apr 16 22:13:50.019635 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.019610 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg" event={"ID":"ab32208f-6560-4abb-83f2-8cb550dbed35","Type":"ContainerDied","Data":"12fc9849b2937d36ec4962a99ce477498f73bfc8e72ec75620ed9bfe571196fd"} Apr 16 22:13:50.019764 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.019642 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12fc9849b2937d36ec4962a99ce477498f73bfc8e72ec75620ed9bfe571196fd" Apr 16 22:13:50.020354 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.020013 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg" Apr 16 22:13:50.144757 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.144735 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h" Apr 16 22:13:50.176164 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.176145 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt" Apr 16 22:13:50.316574 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.316540 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/016deeab-866e-45c0-ba71-db4cd5f9149b-bundle\") pod \"016deeab-866e-45c0-ba71-db4cd5f9149b\" (UID: \"016deeab-866e-45c0-ba71-db4cd5f9149b\") " Apr 16 22:13:50.316732 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.316607 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fjwl\" (UniqueName: \"kubernetes.io/projected/016deeab-866e-45c0-ba71-db4cd5f9149b-kube-api-access-6fjwl\") pod \"016deeab-866e-45c0-ba71-db4cd5f9149b\" (UID: \"016deeab-866e-45c0-ba71-db4cd5f9149b\") " Apr 16 22:13:50.316732 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.316650 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44c36f49-c527-45ce-b977-372564bbf626-util\") pod \"44c36f49-c527-45ce-b977-372564bbf626\" (UID: \"44c36f49-c527-45ce-b977-372564bbf626\") " Apr 16 22:13:50.316732 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.316715 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt4pq\" (UniqueName: \"kubernetes.io/projected/44c36f49-c527-45ce-b977-372564bbf626-kube-api-access-mt4pq\") pod \"44c36f49-c527-45ce-b977-372564bbf626\" (UID: \"44c36f49-c527-45ce-b977-372564bbf626\") " Apr 16 22:13:50.316903 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.316743 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44c36f49-c527-45ce-b977-372564bbf626-bundle\") pod \"44c36f49-c527-45ce-b977-372564bbf626\" (UID: \"44c36f49-c527-45ce-b977-372564bbf626\") " Apr 16 22:13:50.316903 ip-10-0-130-26 
kubenswrapper[2576]: I0416 22:13:50.316776 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/016deeab-866e-45c0-ba71-db4cd5f9149b-util\") pod \"016deeab-866e-45c0-ba71-db4cd5f9149b\" (UID: \"016deeab-866e-45c0-ba71-db4cd5f9149b\") " Apr 16 22:13:50.317602 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.317288 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/016deeab-866e-45c0-ba71-db4cd5f9149b-bundle" (OuterVolumeSpecName: "bundle") pod "016deeab-866e-45c0-ba71-db4cd5f9149b" (UID: "016deeab-866e-45c0-ba71-db4cd5f9149b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:13:50.317602 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.317298 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44c36f49-c527-45ce-b977-372564bbf626-bundle" (OuterVolumeSpecName: "bundle") pod "44c36f49-c527-45ce-b977-372564bbf626" (UID: "44c36f49-c527-45ce-b977-372564bbf626"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:13:50.318822 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.318800 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/016deeab-866e-45c0-ba71-db4cd5f9149b-kube-api-access-6fjwl" (OuterVolumeSpecName: "kube-api-access-6fjwl") pod "016deeab-866e-45c0-ba71-db4cd5f9149b" (UID: "016deeab-866e-45c0-ba71-db4cd5f9149b"). InnerVolumeSpecName "kube-api-access-6fjwl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:13:50.319134 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.319113 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44c36f49-c527-45ce-b977-372564bbf626-kube-api-access-mt4pq" (OuterVolumeSpecName: "kube-api-access-mt4pq") pod "44c36f49-c527-45ce-b977-372564bbf626" (UID: "44c36f49-c527-45ce-b977-372564bbf626"). InnerVolumeSpecName "kube-api-access-mt4pq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:13:50.321797 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.321758 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44c36f49-c527-45ce-b977-372564bbf626-util" (OuterVolumeSpecName: "util") pod "44c36f49-c527-45ce-b977-372564bbf626" (UID: "44c36f49-c527-45ce-b977-372564bbf626"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:13:50.323046 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.323020 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/016deeab-866e-45c0-ba71-db4cd5f9149b-util" (OuterVolumeSpecName: "util") pod "016deeab-866e-45c0-ba71-db4cd5f9149b" (UID: "016deeab-866e-45c0-ba71-db4cd5f9149b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:13:50.418336 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.418231 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mt4pq\" (UniqueName: \"kubernetes.io/projected/44c36f49-c527-45ce-b977-372564bbf626-kube-api-access-mt4pq\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:13:50.418336 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.418256 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/44c36f49-c527-45ce-b977-372564bbf626-bundle\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:13:50.418336 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.418267 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/016deeab-866e-45c0-ba71-db4cd5f9149b-util\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:13:50.418336 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.418275 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/016deeab-866e-45c0-ba71-db4cd5f9149b-bundle\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:13:50.418336 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.418285 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6fjwl\" (UniqueName: \"kubernetes.io/projected/016deeab-866e-45c0-ba71-db4cd5f9149b-kube-api-access-6fjwl\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:13:50.418336 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:50.418294 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/44c36f49-c527-45ce-b977-372564bbf626-util\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:13:51.025134 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:51.025095 2576 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h" event={"ID":"44c36f49-c527-45ce-b977-372564bbf626","Type":"ContainerDied","Data":"ffeee2aa1a72e6ede03d08c80eca69fb81a732b8347d9efde8a8cb880b95ada7"} Apr 16 22:13:51.025134 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:51.025109 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h" Apr 16 22:13:51.025134 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:51.025128 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ffeee2aa1a72e6ede03d08c80eca69fb81a732b8347d9efde8a8cb880b95ada7" Apr 16 22:13:51.026746 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:51.026728 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt" Apr 16 22:13:51.026872 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:51.026758 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt" event={"ID":"016deeab-866e-45c0-ba71-db4cd5f9149b","Type":"ContainerDied","Data":"3df05c97c07afb254a38e041c46dd766549ba2267e8b5bb82abf4ba4e904db5a"} Apr 16 22:13:51.026872 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:51.026785 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3df05c97c07afb254a38e041c46dd766549ba2267e8b5bb82abf4ba4e904db5a" Apr 16 22:13:51.150744 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:51.150722 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7" Apr 16 22:13:51.326481 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:51.326443 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13dcea9f-b75a-429a-bd14-f3a952ab88c9-bundle\") pod \"13dcea9f-b75a-429a-bd14-f3a952ab88c9\" (UID: \"13dcea9f-b75a-429a-bd14-f3a952ab88c9\") " Apr 16 22:13:51.326684 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:51.326490 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tdbq\" (UniqueName: \"kubernetes.io/projected/13dcea9f-b75a-429a-bd14-f3a952ab88c9-kube-api-access-4tdbq\") pod \"13dcea9f-b75a-429a-bd14-f3a952ab88c9\" (UID: \"13dcea9f-b75a-429a-bd14-f3a952ab88c9\") " Apr 16 22:13:51.326684 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:51.326549 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13dcea9f-b75a-429a-bd14-f3a952ab88c9-util\") pod \"13dcea9f-b75a-429a-bd14-f3a952ab88c9\" (UID: \"13dcea9f-b75a-429a-bd14-f3a952ab88c9\") " Apr 16 22:13:51.327014 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:51.326988 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13dcea9f-b75a-429a-bd14-f3a952ab88c9-bundle" (OuterVolumeSpecName: "bundle") pod "13dcea9f-b75a-429a-bd14-f3a952ab88c9" (UID: "13dcea9f-b75a-429a-bd14-f3a952ab88c9"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:13:51.328690 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:51.328660 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13dcea9f-b75a-429a-bd14-f3a952ab88c9-kube-api-access-4tdbq" (OuterVolumeSpecName: "kube-api-access-4tdbq") pod "13dcea9f-b75a-429a-bd14-f3a952ab88c9" (UID: "13dcea9f-b75a-429a-bd14-f3a952ab88c9"). InnerVolumeSpecName "kube-api-access-4tdbq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:13:51.331931 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:51.331907 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13dcea9f-b75a-429a-bd14-f3a952ab88c9-util" (OuterVolumeSpecName: "util") pod "13dcea9f-b75a-429a-bd14-f3a952ab88c9" (UID: "13dcea9f-b75a-429a-bd14-f3a952ab88c9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:13:51.427521 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:51.427425 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13dcea9f-b75a-429a-bd14-f3a952ab88c9-bundle\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:13:51.427521 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:51.427466 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4tdbq\" (UniqueName: \"kubernetes.io/projected/13dcea9f-b75a-429a-bd14-f3a952ab88c9-kube-api-access-4tdbq\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:13:51.427521 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:51.427476 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13dcea9f-b75a-429a-bd14-f3a952ab88c9-util\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:13:52.032321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:52.032275 2576 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7" Apr 16 22:13:52.032321 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:52.032268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7" event={"ID":"13dcea9f-b75a-429a-bd14-f3a952ab88c9","Type":"ContainerDied","Data":"8e7057f0c8d2c4b25f22fbcaf14a7ffc3f8b311e3d4ff9915bb992000d390002"} Apr 16 22:13:52.032709 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:13:52.032334 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e7057f0c8d2c4b25f22fbcaf14a7ffc3f8b311e3d4ff9915bb992000d390002" Apr 16 22:14:02.771048 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771014 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt"] Apr 16 22:14:02.771421 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771349 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab32208f-6560-4abb-83f2-8cb550dbed35" containerName="pull" Apr 16 22:14:02.771421 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771361 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab32208f-6560-4abb-83f2-8cb550dbed35" containerName="pull" Apr 16 22:14:02.771421 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771369 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab32208f-6560-4abb-83f2-8cb550dbed35" containerName="extract" Apr 16 22:14:02.771421 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771374 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab32208f-6560-4abb-83f2-8cb550dbed35" containerName="extract" Apr 16 22:14:02.771421 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771381 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="13dcea9f-b75a-429a-bd14-f3a952ab88c9" containerName="pull" Apr 16 22:14:02.771421 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771387 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dcea9f-b75a-429a-bd14-f3a952ab88c9" containerName="pull" Apr 16 22:14:02.771421 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771394 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44c36f49-c527-45ce-b977-372564bbf626" containerName="util" Apr 16 22:14:02.771421 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771400 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c36f49-c527-45ce-b977-372564bbf626" containerName="util" Apr 16 22:14:02.771421 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771408 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44c36f49-c527-45ce-b977-372564bbf626" containerName="pull" Apr 16 22:14:02.771421 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771413 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c36f49-c527-45ce-b977-372564bbf626" containerName="pull" Apr 16 22:14:02.771421 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771422 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13dcea9f-b75a-429a-bd14-f3a952ab88c9" containerName="util" Apr 16 22:14:02.771421 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771427 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dcea9f-b75a-429a-bd14-f3a952ab88c9" containerName="util" Apr 16 22:14:02.771762 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771435 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="016deeab-866e-45c0-ba71-db4cd5f9149b" containerName="extract" Apr 16 22:14:02.771762 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771441 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="016deeab-866e-45c0-ba71-db4cd5f9149b" containerName="extract" Apr 16 22:14:02.771762 
ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771446 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="016deeab-866e-45c0-ba71-db4cd5f9149b" containerName="util" Apr 16 22:14:02.771762 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771451 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="016deeab-866e-45c0-ba71-db4cd5f9149b" containerName="util" Apr 16 22:14:02.771762 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771459 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="44c36f49-c527-45ce-b977-372564bbf626" containerName="extract" Apr 16 22:14:02.771762 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771466 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="44c36f49-c527-45ce-b977-372564bbf626" containerName="extract" Apr 16 22:14:02.771762 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771475 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="016deeab-866e-45c0-ba71-db4cd5f9149b" containerName="pull" Apr 16 22:14:02.771762 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771480 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="016deeab-866e-45c0-ba71-db4cd5f9149b" containerName="pull" Apr 16 22:14:02.771762 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771486 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13dcea9f-b75a-429a-bd14-f3a952ab88c9" containerName="extract" Apr 16 22:14:02.771762 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771491 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="13dcea9f-b75a-429a-bd14-f3a952ab88c9" containerName="extract" Apr 16 22:14:02.771762 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771499 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab32208f-6560-4abb-83f2-8cb550dbed35" containerName="util" Apr 16 22:14:02.771762 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771504 
2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab32208f-6560-4abb-83f2-8cb550dbed35" containerName="util" Apr 16 22:14:02.771762 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771555 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab32208f-6560-4abb-83f2-8cb550dbed35" containerName="extract" Apr 16 22:14:02.771762 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771562 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="13dcea9f-b75a-429a-bd14-f3a952ab88c9" containerName="extract" Apr 16 22:14:02.771762 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771571 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="016deeab-866e-45c0-ba71-db4cd5f9149b" containerName="extract" Apr 16 22:14:02.771762 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.771579 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="44c36f49-c527-45ce-b977-372564bbf626" containerName="extract" Apr 16 22:14:02.778340 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.778296 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" Apr 16 22:14:02.780862 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.780841 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-kvhzx\"" Apr 16 22:14:02.785973 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.785950 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt"] Apr 16 22:14:02.826299 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.826272 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw9tf\" (UniqueName: \"kubernetes.io/projected/6564a76d-2b9b-4067-af0a-e415be3dc7ea-kube-api-access-tw9tf\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-x78wt\" (UID: \"6564a76d-2b9b-4067-af0a-e415be3dc7ea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" Apr 16 22:14:02.826451 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.826364 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6564a76d-2b9b-4067-af0a-e415be3dc7ea-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-x78wt\" (UID: \"6564a76d-2b9b-4067-af0a-e415be3dc7ea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" Apr 16 22:14:02.927328 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.927267 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6564a76d-2b9b-4067-af0a-e415be3dc7ea-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-x78wt\" (UID: \"6564a76d-2b9b-4067-af0a-e415be3dc7ea\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" Apr 16 22:14:02.927484 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.927389 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tw9tf\" (UniqueName: \"kubernetes.io/projected/6564a76d-2b9b-4067-af0a-e415be3dc7ea-kube-api-access-tw9tf\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-x78wt\" (UID: \"6564a76d-2b9b-4067-af0a-e415be3dc7ea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" Apr 16 22:14:02.927650 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.927632 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6564a76d-2b9b-4067-af0a-e415be3dc7ea-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-x78wt\" (UID: \"6564a76d-2b9b-4067-af0a-e415be3dc7ea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" Apr 16 22:14:02.940785 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:02.940752 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw9tf\" (UniqueName: \"kubernetes.io/projected/6564a76d-2b9b-4067-af0a-e415be3dc7ea-kube-api-access-tw9tf\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-x78wt\" (UID: \"6564a76d-2b9b-4067-af0a-e415be3dc7ea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" Apr 16 22:14:03.089511 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:03.089487 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" Apr 16 22:14:03.217084 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:03.217057 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt"] Apr 16 22:14:03.218867 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:14:03.218832 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6564a76d_2b9b_4067_af0a_e415be3dc7ea.slice/crio-363fcf6fe64cb2df3e6b4df5cb00a58e8fbfdbe9a92b5346342d2293bde3f3ba WatchSource:0}: Error finding container 363fcf6fe64cb2df3e6b4df5cb00a58e8fbfdbe9a92b5346342d2293bde3f3ba: Status 404 returned error can't find the container with id 363fcf6fe64cb2df3e6b4df5cb00a58e8fbfdbe9a92b5346342d2293bde3f3ba Apr 16 22:14:04.074377 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:04.074337 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" event={"ID":"6564a76d-2b9b-4067-af0a-e415be3dc7ea","Type":"ContainerStarted","Data":"363fcf6fe64cb2df3e6b4df5cb00a58e8fbfdbe9a92b5346342d2293bde3f3ba"} Apr 16 22:14:08.093277 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:08.093226 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" event={"ID":"6564a76d-2b9b-4067-af0a-e415be3dc7ea","Type":"ContainerStarted","Data":"f63e60132e25d4d5b675feabd1526ed91abfa091c6e73c2a5dc7a89615498459"} Apr 16 22:14:08.094030 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:08.093356 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" Apr 16 22:14:08.113969 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:08.113916 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" podStartSLOduration=2.068983997 podStartE2EDuration="6.11390104s" podCreationTimestamp="2026-04-16 22:14:02 +0000 UTC" firstStartedPulling="2026-04-16 22:14:03.221047772 +0000 UTC m=+547.481773856" lastFinishedPulling="2026-04-16 22:14:07.265964813 +0000 UTC m=+551.526690899" observedRunningTime="2026-04-16 22:14:08.110695582 +0000 UTC m=+552.371421696" watchObservedRunningTime="2026-04-16 22:14:08.11390104 +0000 UTC m=+552.374627144" Apr 16 22:14:10.354341 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:10.354293 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cw8r9"] Apr 16 22:14:10.357895 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:10.357873 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cw8r9" Apr 16 22:14:10.360021 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:10.359995 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 16 22:14:10.360140 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:10.360119 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-xfwsw\"" Apr 16 22:14:10.360202 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:10.360119 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 16 22:14:10.364251 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:10.364221 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cw8r9"] Apr 16 22:14:10.386077 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:10.386049 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3dce1467-c2b9-482d-baf3-6f1e2bcf237c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-cw8r9\" (UID: \"3dce1467-c2b9-482d-baf3-6f1e2bcf237c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cw8r9" Apr 16 22:14:10.386182 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:10.386099 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3dce1467-c2b9-482d-baf3-6f1e2bcf237c-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-cw8r9\" (UID: \"3dce1467-c2b9-482d-baf3-6f1e2bcf237c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cw8r9" Apr 16 22:14:10.386182 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:10.386162 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bh9g\" (UniqueName: \"kubernetes.io/projected/3dce1467-c2b9-482d-baf3-6f1e2bcf237c-kube-api-access-7bh9g\") pod \"kuadrant-console-plugin-6cb54b5c86-cw8r9\" (UID: \"3dce1467-c2b9-482d-baf3-6f1e2bcf237c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cw8r9" Apr 16 22:14:10.486716 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:10.486686 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3dce1467-c2b9-482d-baf3-6f1e2bcf237c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-cw8r9\" (UID: \"3dce1467-c2b9-482d-baf3-6f1e2bcf237c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cw8r9" Apr 16 22:14:10.486861 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:10.486730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3dce1467-c2b9-482d-baf3-6f1e2bcf237c-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-cw8r9\" (UID: \"3dce1467-c2b9-482d-baf3-6f1e2bcf237c\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cw8r9" Apr 16 22:14:10.486861 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:10.486752 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7bh9g\" (UniqueName: \"kubernetes.io/projected/3dce1467-c2b9-482d-baf3-6f1e2bcf237c-kube-api-access-7bh9g\") pod \"kuadrant-console-plugin-6cb54b5c86-cw8r9\" (UID: \"3dce1467-c2b9-482d-baf3-6f1e2bcf237c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cw8r9" Apr 16 22:14:10.486861 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:14:10.486848 2576 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 16 22:14:10.486983 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:14:10.486927 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dce1467-c2b9-482d-baf3-6f1e2bcf237c-plugin-serving-cert podName:3dce1467-c2b9-482d-baf3-6f1e2bcf237c nodeName:}" failed. No retries permitted until 2026-04-16 22:14:10.986905724 +0000 UTC m=+555.247631813 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/3dce1467-c2b9-482d-baf3-6f1e2bcf237c-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-cw8r9" (UID: "3dce1467-c2b9-482d-baf3-6f1e2bcf237c") : secret "plugin-serving-cert" not found Apr 16 22:14:10.487337 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:10.487293 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/3dce1467-c2b9-482d-baf3-6f1e2bcf237c-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-cw8r9\" (UID: \"3dce1467-c2b9-482d-baf3-6f1e2bcf237c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cw8r9" Apr 16 22:14:10.496730 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:10.496704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bh9g\" (UniqueName: \"kubernetes.io/projected/3dce1467-c2b9-482d-baf3-6f1e2bcf237c-kube-api-access-7bh9g\") pod \"kuadrant-console-plugin-6cb54b5c86-cw8r9\" (UID: \"3dce1467-c2b9-482d-baf3-6f1e2bcf237c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cw8r9" Apr 16 22:14:10.990359 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:10.990327 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3dce1467-c2b9-482d-baf3-6f1e2bcf237c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-cw8r9\" (UID: \"3dce1467-c2b9-482d-baf3-6f1e2bcf237c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cw8r9" Apr 16 22:14:10.992551 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:10.992530 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/3dce1467-c2b9-482d-baf3-6f1e2bcf237c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-cw8r9\" (UID: \"3dce1467-c2b9-482d-baf3-6f1e2bcf237c\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cw8r9" Apr 16 22:14:11.268622 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:11.268546 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cw8r9" Apr 16 22:14:11.391730 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:11.391700 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cw8r9"] Apr 16 22:14:11.395185 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:14:11.395158 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3dce1467_c2b9_482d_baf3_6f1e2bcf237c.slice/crio-081414b0caa43a5362da7732d9e8a1ae1147f722de3df4d96d5e8c5371deba46 WatchSource:0}: Error finding container 081414b0caa43a5362da7732d9e8a1ae1147f722de3df4d96d5e8c5371deba46: Status 404 returned error can't find the container with id 081414b0caa43a5362da7732d9e8a1ae1147f722de3df4d96d5e8c5371deba46 Apr 16 22:14:11.428049 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:11.428024 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-65cbp"] Apr 16 22:14:11.432431 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:11.432409 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-65cbp" Apr 16 22:14:11.434487 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:11.434469 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 22:14:11.434574 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:11.434471 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-rxv7h\"" Apr 16 22:14:11.439117 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:11.439083 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-65cbp"] Apr 16 22:14:11.496520 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:11.496487 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kxc5\" (UniqueName: \"kubernetes.io/projected/131861e7-abf2-48d6-a9d5-136cf8543227-kube-api-access-8kxc5\") pod \"dns-operator-controller-manager-648d5c98bc-65cbp\" (UID: \"131861e7-abf2-48d6-a9d5-136cf8543227\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-65cbp" Apr 16 22:14:11.597365 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:11.597299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kxc5\" (UniqueName: \"kubernetes.io/projected/131861e7-abf2-48d6-a9d5-136cf8543227-kube-api-access-8kxc5\") pod \"dns-operator-controller-manager-648d5c98bc-65cbp\" (UID: \"131861e7-abf2-48d6-a9d5-136cf8543227\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-65cbp" Apr 16 22:14:11.606900 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:11.606882 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kxc5\" (UniqueName: \"kubernetes.io/projected/131861e7-abf2-48d6-a9d5-136cf8543227-kube-api-access-8kxc5\") pod 
\"dns-operator-controller-manager-648d5c98bc-65cbp\" (UID: \"131861e7-abf2-48d6-a9d5-136cf8543227\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-65cbp" Apr 16 22:14:11.743866 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:11.743834 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-65cbp" Apr 16 22:14:11.874765 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:11.874681 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-65cbp"] Apr 16 22:14:11.877322 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:14:11.877279 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod131861e7_abf2_48d6_a9d5_136cf8543227.slice/crio-689b4b24aa79e4f75482d79609e209416f2a15e3f119249921fe708bbc5ef794 WatchSource:0}: Error finding container 689b4b24aa79e4f75482d79609e209416f2a15e3f119249921fe708bbc5ef794: Status 404 returned error can't find the container with id 689b4b24aa79e4f75482d79609e209416f2a15e3f119249921fe708bbc5ef794 Apr 16 22:14:12.117067 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:12.117017 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cw8r9" event={"ID":"3dce1467-c2b9-482d-baf3-6f1e2bcf237c","Type":"ContainerStarted","Data":"081414b0caa43a5362da7732d9e8a1ae1147f722de3df4d96d5e8c5371deba46"} Apr 16 22:14:12.118553 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:12.118524 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-65cbp" event={"ID":"131861e7-abf2-48d6-a9d5-136cf8543227","Type":"ContainerStarted","Data":"689b4b24aa79e4f75482d79609e209416f2a15e3f119249921fe708bbc5ef794"} Apr 16 22:14:16.139222 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:16.139178 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-65cbp" event={"ID":"131861e7-abf2-48d6-a9d5-136cf8543227","Type":"ContainerStarted","Data":"dce9490c8823c594fa5c8e935fd63eab234fb139b5fbb60bac9f518470e0b311"} Apr 16 22:14:16.139707 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:16.139440 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-65cbp" Apr 16 22:14:16.155752 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:16.155711 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-65cbp" podStartSLOduration=1.208879352 podStartE2EDuration="5.155696539s" podCreationTimestamp="2026-04-16 22:14:11 +0000 UTC" firstStartedPulling="2026-04-16 22:14:11.879418083 +0000 UTC m=+556.140144166" lastFinishedPulling="2026-04-16 22:14:15.826235258 +0000 UTC m=+560.086961353" observedRunningTime="2026-04-16 22:14:16.154501607 +0000 UTC m=+560.415227713" watchObservedRunningTime="2026-04-16 22:14:16.155696539 +0000 UTC m=+560.416422645" Apr 16 22:14:17.517450 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.517405 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-586fc7995b-gf69z"] Apr 16 22:14:17.524388 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.524359 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.548948 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.548924 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-586fc7995b-gf69z"] Apr 16 22:14:17.550349 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.550296 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-oauth-serving-cert\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.550472 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.550372 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-console-config\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.550472 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.550406 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-console-serving-cert\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.550572 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.550487 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8nx5\" (UniqueName: \"kubernetes.io/projected/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-kube-api-access-v8nx5\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 
16 22:14:17.550572 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.550512 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-service-ca\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.550572 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.550550 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-console-oauth-config\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.550715 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.550572 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-trusted-ca-bundle\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.651867 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.651819 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-oauth-serving-cert\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.652052 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.651879 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-console-config\") pod 
\"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.652052 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.651911 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-console-serving-cert\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.652052 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.651964 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v8nx5\" (UniqueName: \"kubernetes.io/projected/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-kube-api-access-v8nx5\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.652052 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.651994 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-service-ca\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.652052 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.652029 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-console-oauth-config\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.652052 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.652052 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-trusted-ca-bundle\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.653246 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.652749 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-oauth-serving-cert\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.653246 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.653074 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-console-config\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.653246 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.653149 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-trusted-ca-bundle\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.653246 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.653177 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-service-ca\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.654998 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.654971 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-console-oauth-config\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.655694 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.655674 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-console-serving-cert\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.659819 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.659797 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8nx5\" (UniqueName: \"kubernetes.io/projected/a6165f3d-da4e-403b-a5c6-c1506f6a34c8-kube-api-access-v8nx5\") pod \"console-586fc7995b-gf69z\" (UID: \"a6165f3d-da4e-403b-a5c6-c1506f6a34c8\") " pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.837475 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.837443 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:17.984040 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:17.984015 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-586fc7995b-gf69z"] Apr 16 22:14:17.986016 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:14:17.985987 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6165f3d_da4e_403b_a5c6_c1506f6a34c8.slice/crio-e597cab932529dfc98fcf8f7d3a6dc0ec664c0d004f2d99027b48ae5732dfd85 WatchSource:0}: Error finding container e597cab932529dfc98fcf8f7d3a6dc0ec664c0d004f2d99027b48ae5732dfd85: Status 404 returned error can't find the container with id e597cab932529dfc98fcf8f7d3a6dc0ec664c0d004f2d99027b48ae5732dfd85 Apr 16 22:14:18.147820 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:18.147732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-586fc7995b-gf69z" event={"ID":"a6165f3d-da4e-403b-a5c6-c1506f6a34c8","Type":"ContainerStarted","Data":"58236e1ec1cda640f4dbb368dbd13cf01045f82ec2bbebed27d2aec912195bf4"} Apr 16 22:14:18.147820 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:18.147776 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-586fc7995b-gf69z" event={"ID":"a6165f3d-da4e-403b-a5c6-c1506f6a34c8","Type":"ContainerStarted","Data":"e597cab932529dfc98fcf8f7d3a6dc0ec664c0d004f2d99027b48ae5732dfd85"} Apr 16 22:14:18.164623 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:18.164574 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-586fc7995b-gf69z" podStartSLOduration=1.164557149 podStartE2EDuration="1.164557149s" podCreationTimestamp="2026-04-16 22:14:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:14:18.162898769 +0000 UTC m=+562.423624890" 
watchObservedRunningTime="2026-04-16 22:14:18.164557149 +0000 UTC m=+562.425283253" Apr 16 22:14:19.099542 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:19.099514 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" Apr 16 22:14:19.959220 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:19.959185 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt"] Apr 16 22:14:19.959567 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:19.959497 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" podUID="6564a76d-2b9b-4067-af0a-e415be3dc7ea" containerName="manager" containerID="cri-o://f63e60132e25d4d5b675feabd1526ed91abfa091c6e73c2a5dc7a89615498459" gracePeriod=2 Apr 16 22:14:19.961933 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:19.961894 2576 status_manager.go:895] "Failed to get status for pod" podUID="6564a76d-2b9b-4067-af0a-e415be3dc7ea" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-x78wt\" is forbidden: User \"system:node:ip-10-0-130-26.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-26.ec2.internal' and this object" Apr 16 22:14:19.963644 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:19.963606 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt"] Apr 16 22:14:19.983727 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:19.983636 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2"] Apr 16 22:14:19.984179 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:19.984097 2576 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6564a76d-2b9b-4067-af0a-e415be3dc7ea" containerName="manager" Apr 16 22:14:19.984179 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:19.984121 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6564a76d-2b9b-4067-af0a-e415be3dc7ea" containerName="manager" Apr 16 22:14:19.984501 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:19.984224 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6564a76d-2b9b-4067-af0a-e415be3dc7ea" containerName="manager" Apr 16 22:14:19.987642 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:19.987287 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2" Apr 16 22:14:19.991544 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:19.991391 2576 status_manager.go:895] "Failed to get status for pod" podUID="6564a76d-2b9b-4067-af0a-e415be3dc7ea" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-x78wt\" is forbidden: User \"system:node:ip-10-0-130-26.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-26.ec2.internal' and this object" Apr 16 22:14:20.003111 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:20.003078 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2"] Apr 16 22:14:20.076214 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:20.076176 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wn2b\" (UniqueName: \"kubernetes.io/projected/fddaa5ef-d3b0-4194-81c5-36bf122cf797-kube-api-access-7wn2b\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-twgq2\" (UID: \"fddaa5ef-d3b0-4194-81c5-36bf122cf797\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2" Apr 16 22:14:20.076449 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:20.076344 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fddaa5ef-d3b0-4194-81c5-36bf122cf797-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-twgq2\" (UID: \"fddaa5ef-d3b0-4194-81c5-36bf122cf797\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2" Apr 16 22:14:20.158085 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:20.158049 2576 generic.go:358] "Generic (PLEG): container finished" podID="6564a76d-2b9b-4067-af0a-e415be3dc7ea" containerID="f63e60132e25d4d5b675feabd1526ed91abfa091c6e73c2a5dc7a89615498459" exitCode=0 Apr 16 22:14:20.177681 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:20.177650 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7wn2b\" (UniqueName: \"kubernetes.io/projected/fddaa5ef-d3b0-4194-81c5-36bf122cf797-kube-api-access-7wn2b\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-twgq2\" (UID: \"fddaa5ef-d3b0-4194-81c5-36bf122cf797\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2" Apr 16 22:14:20.177830 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:20.177735 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fddaa5ef-d3b0-4194-81c5-36bf122cf797-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-twgq2\" (UID: \"fddaa5ef-d3b0-4194-81c5-36bf122cf797\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2" Apr 16 22:14:20.178108 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:20.178079 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" 
(UniqueName: \"kubernetes.io/empty-dir/fddaa5ef-d3b0-4194-81c5-36bf122cf797-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-twgq2\" (UID: \"fddaa5ef-d3b0-4194-81c5-36bf122cf797\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2" Apr 16 22:14:20.191149 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:20.191096 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wn2b\" (UniqueName: \"kubernetes.io/projected/fddaa5ef-d3b0-4194-81c5-36bf122cf797-kube-api-access-7wn2b\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-twgq2\" (UID: \"fddaa5ef-d3b0-4194-81c5-36bf122cf797\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2" Apr 16 22:14:20.330736 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:20.330644 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2" Apr 16 22:14:21.174121 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:21.174094 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6"] Apr 16 22:14:21.184187 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:21.184157 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6" Apr 16 22:14:21.187823 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:21.187779 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6"] Apr 16 22:14:21.287225 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:21.287187 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/30802795-7f45-4e9c-9470-70dca27eca94-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6\" (UID: \"30802795-7f45-4e9c-9470-70dca27eca94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6" Apr 16 22:14:21.287419 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:21.287329 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9xz8\" (UniqueName: \"kubernetes.io/projected/30802795-7f45-4e9c-9470-70dca27eca94-kube-api-access-f9xz8\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6\" (UID: \"30802795-7f45-4e9c-9470-70dca27eca94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6" Apr 16 22:14:21.388489 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:21.388450 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f9xz8\" (UniqueName: \"kubernetes.io/projected/30802795-7f45-4e9c-9470-70dca27eca94-kube-api-access-f9xz8\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6\" (UID: \"30802795-7f45-4e9c-9470-70dca27eca94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6" Apr 16 22:14:21.388674 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:21.388585 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/30802795-7f45-4e9c-9470-70dca27eca94-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6\" (UID: \"30802795-7f45-4e9c-9470-70dca27eca94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6" Apr 16 22:14:21.389059 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:21.388996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/30802795-7f45-4e9c-9470-70dca27eca94-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6\" (UID: \"30802795-7f45-4e9c-9470-70dca27eca94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6" Apr 16 22:14:21.397652 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:21.397627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9xz8\" (UniqueName: \"kubernetes.io/projected/30802795-7f45-4e9c-9470-70dca27eca94-kube-api-access-f9xz8\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6\" (UID: \"30802795-7f45-4e9c-9470-70dca27eca94\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6" Apr 16 22:14:21.517823 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:21.517725 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6" Apr 16 22:14:27.144812 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:27.144766 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-65cbp" Apr 16 22:14:27.837907 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:27.837867 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:27.837907 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:27.837918 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:27.843671 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:27.843646 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:28.197862 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:28.197776 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-586fc7995b-gf69z" Apr 16 22:14:28.244254 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:28.244223 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c94b67fdf-tjtlv"] Apr 16 22:14:34.377456 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:34.377430 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" Apr 16 22:14:34.379522 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:34.379487 2576 status_manager.go:895] "Failed to get status for pod" podUID="6564a76d-2b9b-4067-af0a-e415be3dc7ea" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-x78wt\" is forbidden: User \"system:node:ip-10-0-130-26.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-26.ec2.internal' and this object" Apr 16 22:14:34.501714 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:34.501682 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6"] Apr 16 22:14:34.503408 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:14:34.503381 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30802795_7f45_4e9c_9470_70dca27eca94.slice/crio-9f226ce2326b28862e755fb5c4caf8941cca99ec03185916f12017da0ec6515f WatchSource:0}: Error finding container 9f226ce2326b28862e755fb5c4caf8941cca99ec03185916f12017da0ec6515f: Status 404 returned error can't find the container with id 9f226ce2326b28862e755fb5c4caf8941cca99ec03185916f12017da0ec6515f Apr 16 22:14:34.514437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:34.514412 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2"] Apr 16 22:14:34.517883 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:14:34.517861 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfddaa5ef_d3b0_4194_81c5_36bf122cf797.slice/crio-a204f723c3d4ab009f9f2505b7887ace4c165a181bad708c6694aa7a5a9d49f9 WatchSource:0}: Error 
finding container a204f723c3d4ab009f9f2505b7887ace4c165a181bad708c6694aa7a5a9d49f9: Status 404 returned error can't find the container with id a204f723c3d4ab009f9f2505b7887ace4c165a181bad708c6694aa7a5a9d49f9 Apr 16 22:14:34.519207 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:34.519158 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6564a76d-2b9b-4067-af0a-e415be3dc7ea-extensions-socket-volume\") pod \"6564a76d-2b9b-4067-af0a-e415be3dc7ea\" (UID: \"6564a76d-2b9b-4067-af0a-e415be3dc7ea\") " Apr 16 22:14:34.519283 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:34.519231 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw9tf\" (UniqueName: \"kubernetes.io/projected/6564a76d-2b9b-4067-af0a-e415be3dc7ea-kube-api-access-tw9tf\") pod \"6564a76d-2b9b-4067-af0a-e415be3dc7ea\" (UID: \"6564a76d-2b9b-4067-af0a-e415be3dc7ea\") " Apr 16 22:14:34.519819 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:34.519790 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6564a76d-2b9b-4067-af0a-e415be3dc7ea-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "6564a76d-2b9b-4067-af0a-e415be3dc7ea" (UID: "6564a76d-2b9b-4067-af0a-e415be3dc7ea"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:14:34.521505 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:34.521475 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6564a76d-2b9b-4067-af0a-e415be3dc7ea-kube-api-access-tw9tf" (OuterVolumeSpecName: "kube-api-access-tw9tf") pod "6564a76d-2b9b-4067-af0a-e415be3dc7ea" (UID: "6564a76d-2b9b-4067-af0a-e415be3dc7ea"). InnerVolumeSpecName "kube-api-access-tw9tf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:14:34.620554 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:34.620473 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tw9tf\" (UniqueName: \"kubernetes.io/projected/6564a76d-2b9b-4067-af0a-e415be3dc7ea-kube-api-access-tw9tf\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:14:34.620554 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:34.620499 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6564a76d-2b9b-4067-af0a-e415be3dc7ea-extensions-socket-volume\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:14:35.219823 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:35.219779 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2" event={"ID":"fddaa5ef-d3b0-4194-81c5-36bf122cf797","Type":"ContainerStarted","Data":"94a58e0eeb2ff6f719a76580d6e955d9b90e52c3ab0a6047924da60c3ac548ee"} Apr 16 22:14:35.219823 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:35.219818 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2" event={"ID":"fddaa5ef-d3b0-4194-81c5-36bf122cf797","Type":"ContainerStarted","Data":"a204f723c3d4ab009f9f2505b7887ace4c165a181bad708c6694aa7a5a9d49f9"} Apr 16 22:14:35.220083 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:35.219837 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2" Apr 16 22:14:35.220937 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:35.220920 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" Apr 16 22:14:35.221008 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:35.220924 2576 scope.go:117] "RemoveContainer" containerID="f63e60132e25d4d5b675feabd1526ed91abfa091c6e73c2a5dc7a89615498459" Apr 16 22:14:35.222025 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:35.221877 2576 status_manager.go:895] "Failed to get status for pod" podUID="6564a76d-2b9b-4067-af0a-e415be3dc7ea" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-x78wt\" is forbidden: User \"system:node:ip-10-0-130-26.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-26.ec2.internal' and this object" Apr 16 22:14:35.222225 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:35.222205 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cw8r9" event={"ID":"3dce1467-c2b9-482d-baf3-6f1e2bcf237c","Type":"ContainerStarted","Data":"ea6174a36b8df15f01305506578678c22f1eeb5761f826d31cf787ed9a24cb79"} Apr 16 22:14:35.223692 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:35.223671 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6" event={"ID":"30802795-7f45-4e9c-9470-70dca27eca94","Type":"ContainerStarted","Data":"9eff2c2c3972c11e6e2baae921d94cd8bc73a29f3c7c7e7e5f29e81cc5272d19"} Apr 16 22:14:35.223801 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:35.223697 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6" event={"ID":"30802795-7f45-4e9c-9470-70dca27eca94","Type":"ContainerStarted","Data":"9f226ce2326b28862e755fb5c4caf8941cca99ec03185916f12017da0ec6515f"} Apr 16 22:14:35.223801 ip-10-0-130-26 kubenswrapper[2576]: 
I0416 22:14:35.223786 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6" Apr 16 22:14:35.240412 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:35.240368 2576 status_manager.go:895] "Failed to get status for pod" podUID="6564a76d-2b9b-4067-af0a-e415be3dc7ea" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-x78wt\" is forbidden: User \"system:node:ip-10-0-130-26.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-26.ec2.internal' and this object" Apr 16 22:14:35.240590 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:35.240553 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2" podStartSLOduration=16.240540615 podStartE2EDuration="16.240540615s" podCreationTimestamp="2026-04-16 22:14:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:14:35.23838804 +0000 UTC m=+579.499114146" watchObservedRunningTime="2026-04-16 22:14:35.240540615 +0000 UTC m=+579.501266719" Apr 16 22:14:35.258096 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:35.258048 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-cw8r9" podStartSLOduration=2.211640704 podStartE2EDuration="25.25803376s" podCreationTimestamp="2026-04-16 22:14:10 +0000 UTC" firstStartedPulling="2026-04-16 22:14:11.396466578 +0000 UTC m=+555.657192661" lastFinishedPulling="2026-04-16 22:14:34.442859624 +0000 UTC m=+578.703585717" observedRunningTime="2026-04-16 22:14:35.255997354 +0000 UTC m=+579.516723469" watchObservedRunningTime="2026-04-16 22:14:35.25803376 +0000 UTC 
m=+579.518759864" Apr 16 22:14:35.275268 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:35.275215 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6" podStartSLOduration=14.275195471 podStartE2EDuration="14.275195471s" podCreationTimestamp="2026-04-16 22:14:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:14:35.2729039 +0000 UTC m=+579.533630005" watchObservedRunningTime="2026-04-16 22:14:35.275195471 +0000 UTC m=+579.535921576" Apr 16 22:14:36.281430 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:36.281392 2576 status_manager.go:895] "Failed to get status for pod" podUID="6564a76d-2b9b-4067-af0a-e415be3dc7ea" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-x78wt" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-x78wt\" is forbidden: User \"system:node:ip-10-0-130-26.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-130-26.ec2.internal' and this object" Apr 16 22:14:36.282067 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:36.282044 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6564a76d-2b9b-4067-af0a-e415be3dc7ea" path="/var/lib/kubelet/pods/6564a76d-2b9b-4067-af0a-e415be3dc7ea/volumes" Apr 16 22:14:46.234106 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:46.234075 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6" Apr 16 22:14:46.234559 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:46.234136 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2" Apr 16 22:14:46.300169 ip-10-0-130-26 kubenswrapper[2576]: I0416 
22:14:46.300138 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2"] Apr 16 22:14:46.300387 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:46.300366 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2" podUID="fddaa5ef-d3b0-4194-81c5-36bf122cf797" containerName="manager" containerID="cri-o://94a58e0eeb2ff6f719a76580d6e955d9b90e52c3ab0a6047924da60c3ac548ee" gracePeriod=10 Apr 16 22:14:46.552214 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:46.552189 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2" Apr 16 22:14:46.732743 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:46.732714 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fddaa5ef-d3b0-4194-81c5-36bf122cf797-extensions-socket-volume\") pod \"fddaa5ef-d3b0-4194-81c5-36bf122cf797\" (UID: \"fddaa5ef-d3b0-4194-81c5-36bf122cf797\") " Apr 16 22:14:46.732935 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:46.732782 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wn2b\" (UniqueName: \"kubernetes.io/projected/fddaa5ef-d3b0-4194-81c5-36bf122cf797-kube-api-access-7wn2b\") pod \"fddaa5ef-d3b0-4194-81c5-36bf122cf797\" (UID: \"fddaa5ef-d3b0-4194-81c5-36bf122cf797\") " Apr 16 22:14:46.733260 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:46.733199 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fddaa5ef-d3b0-4194-81c5-36bf122cf797-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "fddaa5ef-d3b0-4194-81c5-36bf122cf797" (UID: "fddaa5ef-d3b0-4194-81c5-36bf122cf797"). 
InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:14:46.735504 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:46.735475 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fddaa5ef-d3b0-4194-81c5-36bf122cf797-kube-api-access-7wn2b" (OuterVolumeSpecName: "kube-api-access-7wn2b") pod "fddaa5ef-d3b0-4194-81c5-36bf122cf797" (UID: "fddaa5ef-d3b0-4194-81c5-36bf122cf797"). InnerVolumeSpecName "kube-api-access-7wn2b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:14:46.833893 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:46.833865 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7wn2b\" (UniqueName: \"kubernetes.io/projected/fddaa5ef-d3b0-4194-81c5-36bf122cf797-kube-api-access-7wn2b\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:14:46.833893 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:46.833892 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/fddaa5ef-d3b0-4194-81c5-36bf122cf797-extensions-socket-volume\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:14:47.274164 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:47.274091 2576 generic.go:358] "Generic (PLEG): container finished" podID="fddaa5ef-d3b0-4194-81c5-36bf122cf797" containerID="94a58e0eeb2ff6f719a76580d6e955d9b90e52c3ab0a6047924da60c3ac548ee" exitCode=0 Apr 16 22:14:47.274164 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:47.274158 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2" event={"ID":"fddaa5ef-d3b0-4194-81c5-36bf122cf797","Type":"ContainerDied","Data":"94a58e0eeb2ff6f719a76580d6e955d9b90e52c3ab0a6047924da60c3ac548ee"} Apr 16 22:14:47.274655 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:47.274159 2576 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2" Apr 16 22:14:47.274655 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:47.274194 2576 scope.go:117] "RemoveContainer" containerID="94a58e0eeb2ff6f719a76580d6e955d9b90e52c3ab0a6047924da60c3ac548ee" Apr 16 22:14:47.274655 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:47.274184 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2" event={"ID":"fddaa5ef-d3b0-4194-81c5-36bf122cf797","Type":"ContainerDied","Data":"a204f723c3d4ab009f9f2505b7887ace4c165a181bad708c6694aa7a5a9d49f9"} Apr 16 22:14:47.283298 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:47.283264 2576 scope.go:117] "RemoveContainer" containerID="94a58e0eeb2ff6f719a76580d6e955d9b90e52c3ab0a6047924da60c3ac548ee" Apr 16 22:14:47.283585 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:14:47.283565 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94a58e0eeb2ff6f719a76580d6e955d9b90e52c3ab0a6047924da60c3ac548ee\": container with ID starting with 94a58e0eeb2ff6f719a76580d6e955d9b90e52c3ab0a6047924da60c3ac548ee not found: ID does not exist" containerID="94a58e0eeb2ff6f719a76580d6e955d9b90e52c3ab0a6047924da60c3ac548ee" Apr 16 22:14:47.283651 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:47.283594 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94a58e0eeb2ff6f719a76580d6e955d9b90e52c3ab0a6047924da60c3ac548ee"} err="failed to get container status \"94a58e0eeb2ff6f719a76580d6e955d9b90e52c3ab0a6047924da60c3ac548ee\": rpc error: code = NotFound desc = could not find container \"94a58e0eeb2ff6f719a76580d6e955d9b90e52c3ab0a6047924da60c3ac548ee\": container with ID starting with 94a58e0eeb2ff6f719a76580d6e955d9b90e52c3ab0a6047924da60c3ac548ee not found: ID does not exist" Apr 16 22:14:47.303230 
ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:47.303204 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2"] Apr 16 22:14:47.313931 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:47.313907 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-twgq2"] Apr 16 22:14:48.281873 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:48.281838 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fddaa5ef-d3b0-4194-81c5-36bf122cf797" path="/var/lib/kubelet/pods/fddaa5ef-d3b0-4194-81c5-36bf122cf797/volumes" Apr 16 22:14:53.272415 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.272345 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5c94b67fdf-tjtlv" podUID="1f265668-3d67-4794-ace1-23da716b45fa" containerName="console" containerID="cri-o://a74661f95f56d20067505cd17e094fcb71f6337ac0b52313600a782958e336ad" gracePeriod=15 Apr 16 22:14:53.539364 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.539342 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c94b67fdf-tjtlv_1f265668-3d67-4794-ace1-23da716b45fa/console/0.log" Apr 16 22:14:53.539488 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.539404 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:14:53.587296 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.587241 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f265668-3d67-4794-ace1-23da716b45fa-console-oauth-config\") pod \"1f265668-3d67-4794-ace1-23da716b45fa\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " Apr 16 22:14:53.587475 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.587334 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-console-config\") pod \"1f265668-3d67-4794-ace1-23da716b45fa\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " Apr 16 22:14:53.587475 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.587387 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f265668-3d67-4794-ace1-23da716b45fa-console-serving-cert\") pod \"1f265668-3d67-4794-ace1-23da716b45fa\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " Apr 16 22:14:53.587475 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.587426 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-trusted-ca-bundle\") pod \"1f265668-3d67-4794-ace1-23da716b45fa\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " Apr 16 22:14:53.588651 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.587480 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9gs68\" (UniqueName: \"kubernetes.io/projected/1f265668-3d67-4794-ace1-23da716b45fa-kube-api-access-9gs68\") pod \"1f265668-3d67-4794-ace1-23da716b45fa\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " Apr 16 22:14:53.588651 
ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.587510 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-oauth-serving-cert\") pod \"1f265668-3d67-4794-ace1-23da716b45fa\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " Apr 16 22:14:53.588651 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.587574 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-service-ca\") pod \"1f265668-3d67-4794-ace1-23da716b45fa\" (UID: \"1f265668-3d67-4794-ace1-23da716b45fa\") " Apr 16 22:14:53.588651 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.588168 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-service-ca" (OuterVolumeSpecName: "service-ca") pod "1f265668-3d67-4794-ace1-23da716b45fa" (UID: "1f265668-3d67-4794-ace1-23da716b45fa"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:14:53.588651 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.588386 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-console-config" (OuterVolumeSpecName: "console-config") pod "1f265668-3d67-4794-ace1-23da716b45fa" (UID: "1f265668-3d67-4794-ace1-23da716b45fa"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:14:53.588651 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.588395 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1f265668-3d67-4794-ace1-23da716b45fa" (UID: "1f265668-3d67-4794-ace1-23da716b45fa"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:14:53.588651 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.588508 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1f265668-3d67-4794-ace1-23da716b45fa" (UID: "1f265668-3d67-4794-ace1-23da716b45fa"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:14:53.590372 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.590331 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f265668-3d67-4794-ace1-23da716b45fa-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1f265668-3d67-4794-ace1-23da716b45fa" (UID: "1f265668-3d67-4794-ace1-23da716b45fa"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:14:53.590708 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.590676 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f265668-3d67-4794-ace1-23da716b45fa-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1f265668-3d67-4794-ace1-23da716b45fa" (UID: "1f265668-3d67-4794-ace1-23da716b45fa"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:14:53.590808 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.590772 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f265668-3d67-4794-ace1-23da716b45fa-kube-api-access-9gs68" (OuterVolumeSpecName: "kube-api-access-9gs68") pod "1f265668-3d67-4794-ace1-23da716b45fa" (UID: "1f265668-3d67-4794-ace1-23da716b45fa"). InnerVolumeSpecName "kube-api-access-9gs68". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:14:53.688639 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.688605 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9gs68\" (UniqueName: \"kubernetes.io/projected/1f265668-3d67-4794-ace1-23da716b45fa-kube-api-access-9gs68\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:14:53.688639 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.688633 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-oauth-serving-cert\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:14:53.688639 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.688645 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-service-ca\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:14:53.688840 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.688654 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1f265668-3d67-4794-ace1-23da716b45fa-console-oauth-config\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:14:53.688840 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.688664 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-console-config\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:14:53.688840 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.688672 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f265668-3d67-4794-ace1-23da716b45fa-console-serving-cert\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:14:53.688840 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:53.688680 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f265668-3d67-4794-ace1-23da716b45fa-trusted-ca-bundle\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:14:54.304504 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:54.304472 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c94b67fdf-tjtlv_1f265668-3d67-4794-ace1-23da716b45fa/console/0.log" Apr 16 22:14:54.304891 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:54.304517 2576 generic.go:358] "Generic (PLEG): container finished" podID="1f265668-3d67-4794-ace1-23da716b45fa" containerID="a74661f95f56d20067505cd17e094fcb71f6337ac0b52313600a782958e336ad" exitCode=2 Apr 16 22:14:54.304891 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:54.304579 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c94b67fdf-tjtlv" Apr 16 22:14:54.304891 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:54.304600 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c94b67fdf-tjtlv" event={"ID":"1f265668-3d67-4794-ace1-23da716b45fa","Type":"ContainerDied","Data":"a74661f95f56d20067505cd17e094fcb71f6337ac0b52313600a782958e336ad"} Apr 16 22:14:54.304891 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:54.304633 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c94b67fdf-tjtlv" event={"ID":"1f265668-3d67-4794-ace1-23da716b45fa","Type":"ContainerDied","Data":"9609529f80ef37b2d7afa06c503ab7cbae78eb6bae41cee9924f42cf8caa4f83"} Apr 16 22:14:54.304891 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:54.304649 2576 scope.go:117] "RemoveContainer" containerID="a74661f95f56d20067505cd17e094fcb71f6337ac0b52313600a782958e336ad" Apr 16 22:14:54.312748 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:54.312731 2576 scope.go:117] "RemoveContainer" containerID="a74661f95f56d20067505cd17e094fcb71f6337ac0b52313600a782958e336ad" Apr 16 22:14:54.312998 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:14:54.312981 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a74661f95f56d20067505cd17e094fcb71f6337ac0b52313600a782958e336ad\": container with ID starting with a74661f95f56d20067505cd17e094fcb71f6337ac0b52313600a782958e336ad not found: ID does not exist" containerID="a74661f95f56d20067505cd17e094fcb71f6337ac0b52313600a782958e336ad" Apr 16 22:14:54.313097 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:54.313003 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a74661f95f56d20067505cd17e094fcb71f6337ac0b52313600a782958e336ad"} err="failed to get container status \"a74661f95f56d20067505cd17e094fcb71f6337ac0b52313600a782958e336ad\": rpc error: code = 
NotFound desc = could not find container \"a74661f95f56d20067505cd17e094fcb71f6337ac0b52313600a782958e336ad\": container with ID starting with a74661f95f56d20067505cd17e094fcb71f6337ac0b52313600a782958e336ad not found: ID does not exist" Apr 16 22:14:54.322545 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:54.322524 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c94b67fdf-tjtlv"] Apr 16 22:14:54.331108 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:54.331086 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c94b67fdf-tjtlv"] Apr 16 22:14:56.189950 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:56.189921 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/2.log" Apr 16 22:14:56.190403 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:56.189989 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/2.log" Apr 16 22:14:56.282863 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:14:56.282834 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f265668-3d67-4794-ace1-23da716b45fa" path="/var/lib/kubelet/pods/1f265668-3d67-4794-ace1-23da716b45fa/volumes" Apr 16 22:15:02.504342 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.504296 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw"] Apr 16 22:15:02.504854 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.504835 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fddaa5ef-d3b0-4194-81c5-36bf122cf797" containerName="manager" Apr 16 22:15:02.504940 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.504857 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fddaa5ef-d3b0-4194-81c5-36bf122cf797" containerName="manager" Apr 16 22:15:02.504940 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.504876 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f265668-3d67-4794-ace1-23da716b45fa" containerName="console" Apr 16 22:15:02.504940 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.504885 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f265668-3d67-4794-ace1-23da716b45fa" containerName="console" Apr 16 22:15:02.505082 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.504979 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f265668-3d67-4794-ace1-23da716b45fa" containerName="console" Apr 16 22:15:02.505082 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.504994 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fddaa5ef-d3b0-4194-81c5-36bf122cf797" containerName="manager" Apr 16 22:15:02.510012 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.509986 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.512492 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.512468 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-b26fr\"" Apr 16 22:15:02.518621 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.518593 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw"] Apr 16 22:15:02.569927 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.569896 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/66b10bf8-2a76-488d-9eaa-9f7369c48145-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.570132 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.569941 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khndm\" (UniqueName: \"kubernetes.io/projected/66b10bf8-2a76-488d-9eaa-9f7369c48145-kube-api-access-khndm\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.570132 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.570010 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/66b10bf8-2a76-488d-9eaa-9f7369c48145-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.570132 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.570042 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/66b10bf8-2a76-488d-9eaa-9f7369c48145-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.570132 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.570075 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/66b10bf8-2a76-488d-9eaa-9f7369c48145-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.570132 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.570129 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/66b10bf8-2a76-488d-9eaa-9f7369c48145-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.570418 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.570181 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/66b10bf8-2a76-488d-9eaa-9f7369c48145-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" 
Apr 16 22:15:02.570418 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.570205 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/66b10bf8-2a76-488d-9eaa-9f7369c48145-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.570418 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.570231 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/66b10bf8-2a76-488d-9eaa-9f7369c48145-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.671675 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.671637 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/66b10bf8-2a76-488d-9eaa-9f7369c48145-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.671675 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.671679 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khndm\" (UniqueName: \"kubernetes.io/projected/66b10bf8-2a76-488d-9eaa-9f7369c48145-kube-api-access-khndm\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.671981 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.671708 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/66b10bf8-2a76-488d-9eaa-9f7369c48145-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.671981 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.671732 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/66b10bf8-2a76-488d-9eaa-9f7369c48145-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.671981 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.671753 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/66b10bf8-2a76-488d-9eaa-9f7369c48145-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.671981 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.671785 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/66b10bf8-2a76-488d-9eaa-9f7369c48145-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.671981 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.671957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/66b10bf8-2a76-488d-9eaa-9f7369c48145-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.672229 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.672008 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/66b10bf8-2a76-488d-9eaa-9f7369c48145-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.672229 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.672035 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/66b10bf8-2a76-488d-9eaa-9f7369c48145-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.672229 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.672095 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/66b10bf8-2a76-488d-9eaa-9f7369c48145-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.672229 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.672172 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/66b10bf8-2a76-488d-9eaa-9f7369c48145-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: 
\"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.672229 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.672211 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/66b10bf8-2a76-488d-9eaa-9f7369c48145-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.672516 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.672242 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/66b10bf8-2a76-488d-9eaa-9f7369c48145-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.672516 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.672438 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/66b10bf8-2a76-488d-9eaa-9f7369c48145-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.674429 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.674399 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/66b10bf8-2a76-488d-9eaa-9f7369c48145-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.674540 ip-10-0-130-26 kubenswrapper[2576]: 
I0416 22:15:02.674523 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/66b10bf8-2a76-488d-9eaa-9f7369c48145-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.678669 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.678649 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/66b10bf8-2a76-488d-9eaa-9f7369c48145-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.679056 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.679035 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khndm\" (UniqueName: \"kubernetes.io/projected/66b10bf8-2a76-488d-9eaa-9f7369c48145-kube-api-access-khndm\") pod \"maas-default-gateway-openshift-default-58b6f876-gb8pw\" (UID: \"66b10bf8-2a76-488d-9eaa-9f7369c48145\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.823370 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.823248 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:02.948747 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.948723 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw"] Apr 16 22:15:02.950719 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:15:02.950698 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66b10bf8_2a76_488d_9eaa_9f7369c48145.slice/crio-80b189262dfd8f8a65d1b4c1ae0e45c560a04a7e30bb2808d344ecd8bee3af45 WatchSource:0}: Error finding container 80b189262dfd8f8a65d1b4c1ae0e45c560a04a7e30bb2808d344ecd8bee3af45: Status 404 returned error can't find the container with id 80b189262dfd8f8a65d1b4c1ae0e45c560a04a7e30bb2808d344ecd8bee3af45 Apr 16 22:15:02.952956 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.952927 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 22:15:02.953063 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.952985 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 22:15:02.953063 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:02.953016 2576 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 22:15:03.339528 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:03.339490 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" 
event={"ID":"66b10bf8-2a76-488d-9eaa-9f7369c48145","Type":"ContainerStarted","Data":"06e754854e88611a1aae4aacac19044f42b38bc994debee07dd30fe1d2675f99"} Apr 16 22:15:03.339740 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:03.339535 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" event={"ID":"66b10bf8-2a76-488d-9eaa-9f7369c48145","Type":"ContainerStarted","Data":"80b189262dfd8f8a65d1b4c1ae0e45c560a04a7e30bb2808d344ecd8bee3af45"} Apr 16 22:15:03.356973 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:03.356923 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" podStartSLOduration=1.356900798 podStartE2EDuration="1.356900798s" podCreationTimestamp="2026-04-16 22:15:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:15:03.356213669 +0000 UTC m=+607.616939801" watchObservedRunningTime="2026-04-16 22:15:03.356900798 +0000 UTC m=+607.617626904" Apr 16 22:15:03.823914 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:03.823822 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:04.829112 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:04.829077 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:05.347047 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:05.347017 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:05.347906 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:05.347886 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-gb8pw" Apr 16 22:15:06.734639 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:06.734540 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-ldcpl"] Apr 16 22:15:06.738492 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:06.738466 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-ldcpl" Apr 16 22:15:06.740812 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:06.740791 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 22:15:06.747490 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:06.747460 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-ldcpl"] Apr 16 22:15:06.811860 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:06.811816 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfrdq\" (UniqueName: \"kubernetes.io/projected/5b2784ec-9d14-46dc-9233-35ffdefc1acf-kube-api-access-vfrdq\") pod \"limitador-limitador-7d549b5b-ldcpl\" (UID: \"5b2784ec-9d14-46dc-9233-35ffdefc1acf\") " pod="kuadrant-system/limitador-limitador-7d549b5b-ldcpl" Apr 16 22:15:06.812014 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:06.811967 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5b2784ec-9d14-46dc-9233-35ffdefc1acf-config-file\") pod \"limitador-limitador-7d549b5b-ldcpl\" (UID: \"5b2784ec-9d14-46dc-9233-35ffdefc1acf\") " pod="kuadrant-system/limitador-limitador-7d549b5b-ldcpl" Apr 16 22:15:06.835626 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:06.835595 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-ldcpl"] Apr 16 
22:15:06.913375 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:06.913344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfrdq\" (UniqueName: \"kubernetes.io/projected/5b2784ec-9d14-46dc-9233-35ffdefc1acf-kube-api-access-vfrdq\") pod \"limitador-limitador-7d549b5b-ldcpl\" (UID: \"5b2784ec-9d14-46dc-9233-35ffdefc1acf\") " pod="kuadrant-system/limitador-limitador-7d549b5b-ldcpl" Apr 16 22:15:06.913565 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:06.913450 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5b2784ec-9d14-46dc-9233-35ffdefc1acf-config-file\") pod \"limitador-limitador-7d549b5b-ldcpl\" (UID: \"5b2784ec-9d14-46dc-9233-35ffdefc1acf\") " pod="kuadrant-system/limitador-limitador-7d549b5b-ldcpl" Apr 16 22:15:06.914107 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:06.914088 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5b2784ec-9d14-46dc-9233-35ffdefc1acf-config-file\") pod \"limitador-limitador-7d549b5b-ldcpl\" (UID: \"5b2784ec-9d14-46dc-9233-35ffdefc1acf\") " pod="kuadrant-system/limitador-limitador-7d549b5b-ldcpl" Apr 16 22:15:06.920816 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:06.920797 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfrdq\" (UniqueName: \"kubernetes.io/projected/5b2784ec-9d14-46dc-9233-35ffdefc1acf-kube-api-access-vfrdq\") pod \"limitador-limitador-7d549b5b-ldcpl\" (UID: \"5b2784ec-9d14-46dc-9233-35ffdefc1acf\") " pod="kuadrant-system/limitador-limitador-7d549b5b-ldcpl" Apr 16 22:15:07.051236 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:07.051143 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-ldcpl" Apr 16 22:15:07.212594 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:07.212568 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-ldcpl"] Apr 16 22:15:07.214185 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:15:07.214158 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b2784ec_9d14_46dc_9233_35ffdefc1acf.slice/crio-de43947a673a76d22a586048d199d3f2a77846ddaf5dc54153614958b59aa900 WatchSource:0}: Error finding container de43947a673a76d22a586048d199d3f2a77846ddaf5dc54153614958b59aa900: Status 404 returned error can't find the container with id de43947a673a76d22a586048d199d3f2a77846ddaf5dc54153614958b59aa900 Apr 16 22:15:07.356004 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:07.355954 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-ldcpl" event={"ID":"5b2784ec-9d14-46dc-9233-35ffdefc1acf","Type":"ContainerStarted","Data":"de43947a673a76d22a586048d199d3f2a77846ddaf5dc54153614958b59aa900"} Apr 16 22:15:10.369222 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:10.369177 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-ldcpl" event={"ID":"5b2784ec-9d14-46dc-9233-35ffdefc1acf","Type":"ContainerStarted","Data":"bcb0430555908648b93443568b24086cfd11caf394d5fabe9c99d11328fcc00e"} Apr 16 22:15:10.369600 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:10.369349 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-ldcpl" Apr 16 22:15:10.385147 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:10.385091 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-ldcpl" podStartSLOduration=1.687341449 
podStartE2EDuration="4.38507274s" podCreationTimestamp="2026-04-16 22:15:06 +0000 UTC" firstStartedPulling="2026-04-16 22:15:07.215983373 +0000 UTC m=+611.476709455" lastFinishedPulling="2026-04-16 22:15:09.913714655 +0000 UTC m=+614.174440746" observedRunningTime="2026-04-16 22:15:10.385025201 +0000 UTC m=+614.645751305" watchObservedRunningTime="2026-04-16 22:15:10.38507274 +0000 UTC m=+614.645798846" Apr 16 22:15:21.207507 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:21.207458 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-ldcpl"] Apr 16 22:15:21.208071 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:21.207836 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-ldcpl" podUID="5b2784ec-9d14-46dc-9233-35ffdefc1acf" containerName="limitador" containerID="cri-o://bcb0430555908648b93443568b24086cfd11caf394d5fabe9c99d11328fcc00e" gracePeriod=30 Apr 16 22:15:21.210270 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:21.210247 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-ldcpl" Apr 16 22:15:21.759334 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:21.759298 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-ldcpl" Apr 16 22:15:21.857640 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:21.857616 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5b2784ec-9d14-46dc-9233-35ffdefc1acf-config-file\") pod \"5b2784ec-9d14-46dc-9233-35ffdefc1acf\" (UID: \"5b2784ec-9d14-46dc-9233-35ffdefc1acf\") " Apr 16 22:15:21.857792 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:21.857660 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfrdq\" (UniqueName: \"kubernetes.io/projected/5b2784ec-9d14-46dc-9233-35ffdefc1acf-kube-api-access-vfrdq\") pod \"5b2784ec-9d14-46dc-9233-35ffdefc1acf\" (UID: \"5b2784ec-9d14-46dc-9233-35ffdefc1acf\") " Apr 16 22:15:21.858001 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:21.857978 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b2784ec-9d14-46dc-9233-35ffdefc1acf-config-file" (OuterVolumeSpecName: "config-file") pod "5b2784ec-9d14-46dc-9233-35ffdefc1acf" (UID: "5b2784ec-9d14-46dc-9233-35ffdefc1acf"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 22:15:21.859762 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:21.859733 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b2784ec-9d14-46dc-9233-35ffdefc1acf-kube-api-access-vfrdq" (OuterVolumeSpecName: "kube-api-access-vfrdq") pod "5b2784ec-9d14-46dc-9233-35ffdefc1acf" (UID: "5b2784ec-9d14-46dc-9233-35ffdefc1acf"). InnerVolumeSpecName "kube-api-access-vfrdq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:15:21.958688 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:21.958657 2576 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/5b2784ec-9d14-46dc-9233-35ffdefc1acf-config-file\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:15:21.958688 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:21.958686 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vfrdq\" (UniqueName: \"kubernetes.io/projected/5b2784ec-9d14-46dc-9233-35ffdefc1acf-kube-api-access-vfrdq\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:15:22.414221 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.414191 2576 generic.go:358] "Generic (PLEG): container finished" podID="5b2784ec-9d14-46dc-9233-35ffdefc1acf" containerID="bcb0430555908648b93443568b24086cfd11caf394d5fabe9c99d11328fcc00e" exitCode=0 Apr 16 22:15:22.414677 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.414239 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-ldcpl" event={"ID":"5b2784ec-9d14-46dc-9233-35ffdefc1acf","Type":"ContainerDied","Data":"bcb0430555908648b93443568b24086cfd11caf394d5fabe9c99d11328fcc00e"} Apr 16 22:15:22.414677 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.414264 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-ldcpl" event={"ID":"5b2784ec-9d14-46dc-9233-35ffdefc1acf","Type":"ContainerDied","Data":"de43947a673a76d22a586048d199d3f2a77846ddaf5dc54153614958b59aa900"} Apr 16 22:15:22.414677 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.414260 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-ldcpl" Apr 16 22:15:22.414677 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.414276 2576 scope.go:117] "RemoveContainer" containerID="bcb0430555908648b93443568b24086cfd11caf394d5fabe9c99d11328fcc00e" Apr 16 22:15:22.422838 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.422818 2576 scope.go:117] "RemoveContainer" containerID="bcb0430555908648b93443568b24086cfd11caf394d5fabe9c99d11328fcc00e" Apr 16 22:15:22.423084 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:15:22.423066 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcb0430555908648b93443568b24086cfd11caf394d5fabe9c99d11328fcc00e\": container with ID starting with bcb0430555908648b93443568b24086cfd11caf394d5fabe9c99d11328fcc00e not found: ID does not exist" containerID="bcb0430555908648b93443568b24086cfd11caf394d5fabe9c99d11328fcc00e" Apr 16 22:15:22.423136 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.423090 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcb0430555908648b93443568b24086cfd11caf394d5fabe9c99d11328fcc00e"} err="failed to get container status \"bcb0430555908648b93443568b24086cfd11caf394d5fabe9c99d11328fcc00e\": rpc error: code = NotFound desc = could not find container \"bcb0430555908648b93443568b24086cfd11caf394d5fabe9c99d11328fcc00e\": container with ID starting with bcb0430555908648b93443568b24086cfd11caf394d5fabe9c99d11328fcc00e not found: ID does not exist" Apr 16 22:15:22.432438 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.432407 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-ldcpl"] Apr 16 22:15:22.435574 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.435553 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-ldcpl"] Apr 16 22:15:22.779928 ip-10-0-130-26 
kubenswrapper[2576]: I0416 22:15:22.779888 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-gwkmp"] Apr 16 22:15:22.780444 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.780425 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b2784ec-9d14-46dc-9233-35ffdefc1acf" containerName="limitador" Apr 16 22:15:22.780444 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.780444 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b2784ec-9d14-46dc-9233-35ffdefc1acf" containerName="limitador" Apr 16 22:15:22.780578 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.780540 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b2784ec-9d14-46dc-9233-35ffdefc1acf" containerName="limitador" Apr 16 22:15:22.784995 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.784975 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-gwkmp" Apr 16 22:15:22.787563 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.787540 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 16 22:15:22.787670 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.787568 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-8s5q7\"" Apr 16 22:15:22.790096 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.790060 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-gwkmp"] Apr 16 22:15:22.866885 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.866851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw6wz\" (UniqueName: \"kubernetes.io/projected/220c566b-3d68-44b0-9d48-2b4ef15781a1-kube-api-access-vw6wz\") pod \"postgres-868db5846d-gwkmp\" (UID: \"220c566b-3d68-44b0-9d48-2b4ef15781a1\") " 
pod="opendatahub/postgres-868db5846d-gwkmp" Apr 16 22:15:22.867046 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.866921 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/220c566b-3d68-44b0-9d48-2b4ef15781a1-data\") pod \"postgres-868db5846d-gwkmp\" (UID: \"220c566b-3d68-44b0-9d48-2b4ef15781a1\") " pod="opendatahub/postgres-868db5846d-gwkmp" Apr 16 22:15:22.967894 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.967859 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/220c566b-3d68-44b0-9d48-2b4ef15781a1-data\") pod \"postgres-868db5846d-gwkmp\" (UID: \"220c566b-3d68-44b0-9d48-2b4ef15781a1\") " pod="opendatahub/postgres-868db5846d-gwkmp" Apr 16 22:15:22.968066 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.967986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vw6wz\" (UniqueName: \"kubernetes.io/projected/220c566b-3d68-44b0-9d48-2b4ef15781a1-kube-api-access-vw6wz\") pod \"postgres-868db5846d-gwkmp\" (UID: \"220c566b-3d68-44b0-9d48-2b4ef15781a1\") " pod="opendatahub/postgres-868db5846d-gwkmp" Apr 16 22:15:22.968382 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.968358 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/220c566b-3d68-44b0-9d48-2b4ef15781a1-data\") pod \"postgres-868db5846d-gwkmp\" (UID: \"220c566b-3d68-44b0-9d48-2b4ef15781a1\") " pod="opendatahub/postgres-868db5846d-gwkmp" Apr 16 22:15:22.975198 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:22.975173 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw6wz\" (UniqueName: \"kubernetes.io/projected/220c566b-3d68-44b0-9d48-2b4ef15781a1-kube-api-access-vw6wz\") pod \"postgres-868db5846d-gwkmp\" (UID: \"220c566b-3d68-44b0-9d48-2b4ef15781a1\") " 
pod="opendatahub/postgres-868db5846d-gwkmp" Apr 16 22:15:23.098061 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:23.098031 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-gwkmp" Apr 16 22:15:23.428341 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:23.428278 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-gwkmp"] Apr 16 22:15:23.431251 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:15:23.431227 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod220c566b_3d68_44b0_9d48_2b4ef15781a1.slice/crio-7f3e0ed0b070d5a1ebd0389f5bb64bae5f37f8bab82dc971f1df8a7712902f6c WatchSource:0}: Error finding container 7f3e0ed0b070d5a1ebd0389f5bb64bae5f37f8bab82dc971f1df8a7712902f6c: Status 404 returned error can't find the container with id 7f3e0ed0b070d5a1ebd0389f5bb64bae5f37f8bab82dc971f1df8a7712902f6c Apr 16 22:15:23.432923 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:23.432907 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:15:24.283458 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:24.283418 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b2784ec-9d14-46dc-9233-35ffdefc1acf" path="/var/lib/kubelet/pods/5b2784ec-9d14-46dc-9233-35ffdefc1acf/volumes" Apr 16 22:15:24.427815 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:24.427767 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-gwkmp" event={"ID":"220c566b-3d68-44b0-9d48-2b4ef15781a1","Type":"ContainerStarted","Data":"7f3e0ed0b070d5a1ebd0389f5bb64bae5f37f8bab82dc971f1df8a7712902f6c"} Apr 16 22:15:29.448237 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:29.448202 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-gwkmp" 
event={"ID":"220c566b-3d68-44b0-9d48-2b4ef15781a1","Type":"ContainerStarted","Data":"9011bb2adee0b37db80c64a88fcf6d966b8fdda158b4d0ac85693589d9696c36"} Apr 16 22:15:29.448705 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:29.448351 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-gwkmp" Apr 16 22:15:29.464039 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:29.463680 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-gwkmp" podStartSLOduration=2.314141471 podStartE2EDuration="7.463662223s" podCreationTimestamp="2026-04-16 22:15:22 +0000 UTC" firstStartedPulling="2026-04-16 22:15:23.43303924 +0000 UTC m=+627.693765324" lastFinishedPulling="2026-04-16 22:15:28.582559972 +0000 UTC m=+632.843286076" observedRunningTime="2026-04-16 22:15:29.462439935 +0000 UTC m=+633.723166043" watchObservedRunningTime="2026-04-16 22:15:29.463662223 +0000 UTC m=+633.724388329" Apr 16 22:15:35.482475 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:35.482443 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-gwkmp" Apr 16 22:15:36.284734 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.284702 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-65b7968cfc-t8jfg"] Apr 16 22:15:36.292408 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.292372 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-65b7968cfc-t8jfg" Apr 16 22:15:36.295760 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.295731 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 16 22:15:36.295925 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.295813 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 16 22:15:36.295925 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.295835 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-mjfcm\"" Apr 16 22:15:36.300515 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.300493 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-65b7968cfc-t8jfg"] Apr 16 22:15:36.311729 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.311699 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d5f5c574-z6m9w"] Apr 16 22:15:36.316183 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.316159 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d5f5c574-z6m9w" Apr 16 22:15:36.318232 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.318208 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-rt5bg\"" Apr 16 22:15:36.323977 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.323951 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d5f5c574-z6m9w"] Apr 16 22:15:36.394475 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.394432 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8f164300-cff9-45a5-9bb1-d2ac77dced42-maas-api-tls\") pod \"maas-api-65b7968cfc-t8jfg\" (UID: \"8f164300-cff9-45a5-9bb1-d2ac77dced42\") " pod="opendatahub/maas-api-65b7968cfc-t8jfg" Apr 16 22:15:36.394475 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.394473 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x96vs\" (UniqueName: \"kubernetes.io/projected/cafdc936-a3cc-4936-8d73-4f24d1679386-kube-api-access-x96vs\") pod \"maas-controller-6d5f5c574-z6m9w\" (UID: \"cafdc936-a3cc-4936-8d73-4f24d1679386\") " pod="opendatahub/maas-controller-6d5f5c574-z6m9w" Apr 16 22:15:36.394684 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.394512 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwb7p\" (UniqueName: \"kubernetes.io/projected/8f164300-cff9-45a5-9bb1-d2ac77dced42-kube-api-access-qwb7p\") pod \"maas-api-65b7968cfc-t8jfg\" (UID: \"8f164300-cff9-45a5-9bb1-d2ac77dced42\") " pod="opendatahub/maas-api-65b7968cfc-t8jfg" Apr 16 22:15:36.495112 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.495082 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwb7p\" (UniqueName: 
\"kubernetes.io/projected/8f164300-cff9-45a5-9bb1-d2ac77dced42-kube-api-access-qwb7p\") pod \"maas-api-65b7968cfc-t8jfg\" (UID: \"8f164300-cff9-45a5-9bb1-d2ac77dced42\") " pod="opendatahub/maas-api-65b7968cfc-t8jfg" Apr 16 22:15:36.495511 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.495212 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8f164300-cff9-45a5-9bb1-d2ac77dced42-maas-api-tls\") pod \"maas-api-65b7968cfc-t8jfg\" (UID: \"8f164300-cff9-45a5-9bb1-d2ac77dced42\") " pod="opendatahub/maas-api-65b7968cfc-t8jfg" Apr 16 22:15:36.495511 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.495248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x96vs\" (UniqueName: \"kubernetes.io/projected/cafdc936-a3cc-4936-8d73-4f24d1679386-kube-api-access-x96vs\") pod \"maas-controller-6d5f5c574-z6m9w\" (UID: \"cafdc936-a3cc-4936-8d73-4f24d1679386\") " pod="opendatahub/maas-controller-6d5f5c574-z6m9w" Apr 16 22:15:36.497706 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.497679 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8f164300-cff9-45a5-9bb1-d2ac77dced42-maas-api-tls\") pod \"maas-api-65b7968cfc-t8jfg\" (UID: \"8f164300-cff9-45a5-9bb1-d2ac77dced42\") " pod="opendatahub/maas-api-65b7968cfc-t8jfg" Apr 16 22:15:36.503745 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.503724 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x96vs\" (UniqueName: \"kubernetes.io/projected/cafdc936-a3cc-4936-8d73-4f24d1679386-kube-api-access-x96vs\") pod \"maas-controller-6d5f5c574-z6m9w\" (UID: \"cafdc936-a3cc-4936-8d73-4f24d1679386\") " pod="opendatahub/maas-controller-6d5f5c574-z6m9w" Apr 16 22:15:36.503858 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.503837 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-qwb7p\" (UniqueName: \"kubernetes.io/projected/8f164300-cff9-45a5-9bb1-d2ac77dced42-kube-api-access-qwb7p\") pod \"maas-api-65b7968cfc-t8jfg\" (UID: \"8f164300-cff9-45a5-9bb1-d2ac77dced42\") " pod="opendatahub/maas-api-65b7968cfc-t8jfg" Apr 16 22:15:36.606993 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.606958 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-65b7968cfc-t8jfg" Apr 16 22:15:36.632138 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.632098 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d5f5c574-z6m9w" Apr 16 22:15:36.749151 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.749053 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-65b7968cfc-t8jfg"] Apr 16 22:15:36.752264 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:15:36.752223 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f164300_cff9_45a5_9bb1_d2ac77dced42.slice/crio-6e749291a360380fae81517ff5a36b02adc81eeb6f69dada3a9282292906eb3b WatchSource:0}: Error finding container 6e749291a360380fae81517ff5a36b02adc81eeb6f69dada3a9282292906eb3b: Status 404 returned error can't find the container with id 6e749291a360380fae81517ff5a36b02adc81eeb6f69dada3a9282292906eb3b Apr 16 22:15:36.777482 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:36.777453 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d5f5c574-z6m9w"] Apr 16 22:15:36.780027 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:15:36.779986 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcafdc936_a3cc_4936_8d73_4f24d1679386.slice/crio-df3d26afc5216b6476f877601bc9400e532f94cbe3e0a99ecba01ace8492b279 WatchSource:0}: Error finding container 
df3d26afc5216b6476f877601bc9400e532f94cbe3e0a99ecba01ace8492b279: Status 404 returned error can't find the container with id df3d26afc5216b6476f877601bc9400e532f94cbe3e0a99ecba01ace8492b279 Apr 16 22:15:37.119364 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:37.119331 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-5b8d76df79-8hw22"] Apr 16 22:15:37.124774 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:37.124754 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-5b8d76df79-8hw22" Apr 16 22:15:37.133681 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:37.133652 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5b8d76df79-8hw22"] Apr 16 22:15:37.201877 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:37.201846 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/e833b21b-3807-49ab-9889-2cf5e77ad763-maas-api-tls\") pod \"maas-api-5b8d76df79-8hw22\" (UID: \"e833b21b-3807-49ab-9889-2cf5e77ad763\") " pod="opendatahub/maas-api-5b8d76df79-8hw22" Apr 16 22:15:37.201877 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:37.201881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qfz7\" (UniqueName: \"kubernetes.io/projected/e833b21b-3807-49ab-9889-2cf5e77ad763-kube-api-access-4qfz7\") pod \"maas-api-5b8d76df79-8hw22\" (UID: \"e833b21b-3807-49ab-9889-2cf5e77ad763\") " pod="opendatahub/maas-api-5b8d76df79-8hw22" Apr 16 22:15:37.303151 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:37.303113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/e833b21b-3807-49ab-9889-2cf5e77ad763-maas-api-tls\") pod \"maas-api-5b8d76df79-8hw22\" (UID: \"e833b21b-3807-49ab-9889-2cf5e77ad763\") " pod="opendatahub/maas-api-5b8d76df79-8hw22" 
Apr 16 22:15:37.303343 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:37.303174 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qfz7\" (UniqueName: \"kubernetes.io/projected/e833b21b-3807-49ab-9889-2cf5e77ad763-kube-api-access-4qfz7\") pod \"maas-api-5b8d76df79-8hw22\" (UID: \"e833b21b-3807-49ab-9889-2cf5e77ad763\") " pod="opendatahub/maas-api-5b8d76df79-8hw22" Apr 16 22:15:37.305829 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:37.305806 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/e833b21b-3807-49ab-9889-2cf5e77ad763-maas-api-tls\") pod \"maas-api-5b8d76df79-8hw22\" (UID: \"e833b21b-3807-49ab-9889-2cf5e77ad763\") " pod="opendatahub/maas-api-5b8d76df79-8hw22" Apr 16 22:15:37.309699 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:37.309682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qfz7\" (UniqueName: \"kubernetes.io/projected/e833b21b-3807-49ab-9889-2cf5e77ad763-kube-api-access-4qfz7\") pod \"maas-api-5b8d76df79-8hw22\" (UID: \"e833b21b-3807-49ab-9889-2cf5e77ad763\") " pod="opendatahub/maas-api-5b8d76df79-8hw22" Apr 16 22:15:37.451718 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:37.451461 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-5b8d76df79-8hw22" Apr 16 22:15:37.483118 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:37.483066 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-65b7968cfc-t8jfg" event={"ID":"8f164300-cff9-45a5-9bb1-d2ac77dced42","Type":"ContainerStarted","Data":"6e749291a360380fae81517ff5a36b02adc81eeb6f69dada3a9282292906eb3b"} Apr 16 22:15:37.484441 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:37.484400 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d5f5c574-z6m9w" event={"ID":"cafdc936-a3cc-4936-8d73-4f24d1679386","Type":"ContainerStarted","Data":"df3d26afc5216b6476f877601bc9400e532f94cbe3e0a99ecba01ace8492b279"} Apr 16 22:15:37.657342 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:37.656862 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5b8d76df79-8hw22"] Apr 16 22:15:38.494548 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:38.494509 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5b8d76df79-8hw22" event={"ID":"e833b21b-3807-49ab-9889-2cf5e77ad763","Type":"ContainerStarted","Data":"e496faf3b19c48ae5decddf6c2a8071dbe2ab66c26b6138a28e1957f44042b14"} Apr 16 22:15:42.512136 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:42.512102 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-65b7968cfc-t8jfg" event={"ID":"8f164300-cff9-45a5-9bb1-d2ac77dced42","Type":"ContainerStarted","Data":"15b85f056e9da75aeb33e3374cc1a8ccc3ea128e623e732d040e02c9980925a7"} Apr 16 22:15:42.512751 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:42.512217 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-65b7968cfc-t8jfg" Apr 16 22:15:42.513557 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:42.513530 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5b8d76df79-8hw22" 
event={"ID":"e833b21b-3807-49ab-9889-2cf5e77ad763","Type":"ContainerStarted","Data":"4bd1807c8f86a41cbec861dd152d44d88b4dd4d469ebb25c070e764c86f21ba9"} Apr 16 22:15:42.513669 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:42.513594 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-5b8d76df79-8hw22" Apr 16 22:15:42.514807 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:42.514789 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d5f5c574-z6m9w" event={"ID":"cafdc936-a3cc-4936-8d73-4f24d1679386","Type":"ContainerStarted","Data":"4151107ed2d6bcbc8aacf57079c57b449e6e7d810456c95fa5220c4a0ca6f0bb"} Apr 16 22:15:42.514904 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:42.514893 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d5f5c574-z6m9w" Apr 16 22:15:42.526937 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:42.526893 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-65b7968cfc-t8jfg" podStartSLOduration=1.81956575 podStartE2EDuration="6.526880116s" podCreationTimestamp="2026-04-16 22:15:36 +0000 UTC" firstStartedPulling="2026-04-16 22:15:36.754092212 +0000 UTC m=+641.014818297" lastFinishedPulling="2026-04-16 22:15:41.461406577 +0000 UTC m=+645.722132663" observedRunningTime="2026-04-16 22:15:42.525527812 +0000 UTC m=+646.786253916" watchObservedRunningTime="2026-04-16 22:15:42.526880116 +0000 UTC m=+646.787606220" Apr 16 22:15:42.541095 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:42.541038 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-5b8d76df79-8hw22" podStartSLOduration=1.7404725829999999 podStartE2EDuration="5.541022101s" podCreationTimestamp="2026-04-16 22:15:37 +0000 UTC" firstStartedPulling="2026-04-16 22:15:37.666236545 +0000 UTC m=+641.926962634" lastFinishedPulling="2026-04-16 22:15:41.466786069 
+0000 UTC m=+645.727512152" observedRunningTime="2026-04-16 22:15:42.538384439 +0000 UTC m=+646.799110544" watchObservedRunningTime="2026-04-16 22:15:42.541022101 +0000 UTC m=+646.801748206" Apr 16 22:15:48.524942 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:48.524914 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-5b8d76df79-8hw22" Apr 16 22:15:48.525388 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:48.525354 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-65b7968cfc-t8jfg" Apr 16 22:15:48.542146 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:48.542097 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d5f5c574-z6m9w" podStartSLOduration=7.862214138 podStartE2EDuration="12.542083s" podCreationTimestamp="2026-04-16 22:15:36 +0000 UTC" firstStartedPulling="2026-04-16 22:15:36.781538628 +0000 UTC m=+641.042264725" lastFinishedPulling="2026-04-16 22:15:41.461407491 +0000 UTC m=+645.722133587" observedRunningTime="2026-04-16 22:15:42.553851695 +0000 UTC m=+646.814577800" watchObservedRunningTime="2026-04-16 22:15:48.542083 +0000 UTC m=+652.802809105" Apr 16 22:15:48.579409 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:48.579378 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-65b7968cfc-t8jfg"] Apr 16 22:15:48.579618 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:48.579596 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-65b7968cfc-t8jfg" podUID="8f164300-cff9-45a5-9bb1-d2ac77dced42" containerName="maas-api" containerID="cri-o://15b85f056e9da75aeb33e3374cc1a8ccc3ea128e623e732d040e02c9980925a7" gracePeriod=30 Apr 16 22:15:48.826575 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:48.826554 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-65b7968cfc-t8jfg" Apr 16 22:15:48.950575 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:48.950538 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwb7p\" (UniqueName: \"kubernetes.io/projected/8f164300-cff9-45a5-9bb1-d2ac77dced42-kube-api-access-qwb7p\") pod \"8f164300-cff9-45a5-9bb1-d2ac77dced42\" (UID: \"8f164300-cff9-45a5-9bb1-d2ac77dced42\") " Apr 16 22:15:48.950736 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:48.950635 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8f164300-cff9-45a5-9bb1-d2ac77dced42-maas-api-tls\") pod \"8f164300-cff9-45a5-9bb1-d2ac77dced42\" (UID: \"8f164300-cff9-45a5-9bb1-d2ac77dced42\") " Apr 16 22:15:48.952837 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:48.952800 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f164300-cff9-45a5-9bb1-d2ac77dced42-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "8f164300-cff9-45a5-9bb1-d2ac77dced42" (UID: "8f164300-cff9-45a5-9bb1-d2ac77dced42"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 22:15:48.952837 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:48.952830 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f164300-cff9-45a5-9bb1-d2ac77dced42-kube-api-access-qwb7p" (OuterVolumeSpecName: "kube-api-access-qwb7p") pod "8f164300-cff9-45a5-9bb1-d2ac77dced42" (UID: "8f164300-cff9-45a5-9bb1-d2ac77dced42"). InnerVolumeSpecName "kube-api-access-qwb7p". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:15:49.051326 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:49.051277 2576 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8f164300-cff9-45a5-9bb1-d2ac77dced42-maas-api-tls\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:15:49.051326 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:49.051328 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qwb7p\" (UniqueName: \"kubernetes.io/projected/8f164300-cff9-45a5-9bb1-d2ac77dced42-kube-api-access-qwb7p\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:15:49.540957 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:49.540920 2576 generic.go:358] "Generic (PLEG): container finished" podID="8f164300-cff9-45a5-9bb1-d2ac77dced42" containerID="15b85f056e9da75aeb33e3374cc1a8ccc3ea128e623e732d040e02c9980925a7" exitCode=0 Apr 16 22:15:49.541365 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:49.540991 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-65b7968cfc-t8jfg" Apr 16 22:15:49.541365 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:49.541010 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-65b7968cfc-t8jfg" event={"ID":"8f164300-cff9-45a5-9bb1-d2ac77dced42","Type":"ContainerDied","Data":"15b85f056e9da75aeb33e3374cc1a8ccc3ea128e623e732d040e02c9980925a7"} Apr 16 22:15:49.541365 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:49.541051 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-65b7968cfc-t8jfg" event={"ID":"8f164300-cff9-45a5-9bb1-d2ac77dced42","Type":"ContainerDied","Data":"6e749291a360380fae81517ff5a36b02adc81eeb6f69dada3a9282292906eb3b"} Apr 16 22:15:49.541365 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:49.541071 2576 scope.go:117] "RemoveContainer" containerID="15b85f056e9da75aeb33e3374cc1a8ccc3ea128e623e732d040e02c9980925a7" Apr 16 22:15:49.549818 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:49.549803 2576 scope.go:117] "RemoveContainer" containerID="15b85f056e9da75aeb33e3374cc1a8ccc3ea128e623e732d040e02c9980925a7" Apr 16 22:15:49.550054 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:15:49.550036 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15b85f056e9da75aeb33e3374cc1a8ccc3ea128e623e732d040e02c9980925a7\": container with ID starting with 15b85f056e9da75aeb33e3374cc1a8ccc3ea128e623e732d040e02c9980925a7 not found: ID does not exist" containerID="15b85f056e9da75aeb33e3374cc1a8ccc3ea128e623e732d040e02c9980925a7" Apr 16 22:15:49.550110 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:49.550061 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b85f056e9da75aeb33e3374cc1a8ccc3ea128e623e732d040e02c9980925a7"} err="failed to get container status \"15b85f056e9da75aeb33e3374cc1a8ccc3ea128e623e732d040e02c9980925a7\": rpc error: code = NotFound desc = could 
not find container \"15b85f056e9da75aeb33e3374cc1a8ccc3ea128e623e732d040e02c9980925a7\": container with ID starting with 15b85f056e9da75aeb33e3374cc1a8ccc3ea128e623e732d040e02c9980925a7 not found: ID does not exist" Apr 16 22:15:49.562041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:49.562009 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-65b7968cfc-t8jfg"] Apr 16 22:15:49.564400 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:49.564380 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-65b7968cfc-t8jfg"] Apr 16 22:15:50.281971 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:50.281935 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f164300-cff9-45a5-9bb1-d2ac77dced42" path="/var/lib/kubelet/pods/8f164300-cff9-45a5-9bb1-d2ac77dced42/volumes" Apr 16 22:15:53.524645 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:53.524615 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6d5f5c574-z6m9w" Apr 16 22:15:53.817515 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:53.817443 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-77c748cfd6-v4gc6"] Apr 16 22:15:53.817804 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:53.817792 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f164300-cff9-45a5-9bb1-d2ac77dced42" containerName="maas-api" Apr 16 22:15:53.817843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:53.817805 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f164300-cff9-45a5-9bb1-d2ac77dced42" containerName="maas-api" Apr 16 22:15:53.817878 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:53.817871 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f164300-cff9-45a5-9bb1-d2ac77dced42" containerName="maas-api" Apr 16 22:15:53.822435 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:53.822404 2576 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="opendatahub/maas-controller-77c748cfd6-v4gc6" Apr 16 22:15:53.827925 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:53.827900 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-77c748cfd6-v4gc6"] Apr 16 22:15:53.991681 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:53.991649 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r89k\" (UniqueName: \"kubernetes.io/projected/2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b-kube-api-access-2r89k\") pod \"maas-controller-77c748cfd6-v4gc6\" (UID: \"2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b\") " pod="opendatahub/maas-controller-77c748cfd6-v4gc6" Apr 16 22:15:54.092203 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:54.092180 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2r89k\" (UniqueName: \"kubernetes.io/projected/2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b-kube-api-access-2r89k\") pod \"maas-controller-77c748cfd6-v4gc6\" (UID: \"2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b\") " pod="opendatahub/maas-controller-77c748cfd6-v4gc6" Apr 16 22:15:54.099680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:54.099651 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r89k\" (UniqueName: \"kubernetes.io/projected/2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b-kube-api-access-2r89k\") pod \"maas-controller-77c748cfd6-v4gc6\" (UID: \"2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b\") " pod="opendatahub/maas-controller-77c748cfd6-v4gc6" Apr 16 22:15:54.134621 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:54.134588 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-77c748cfd6-v4gc6" Apr 16 22:15:54.459707 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:54.459683 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-77c748cfd6-v4gc6"] Apr 16 22:15:54.461997 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:15:54.461968 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d7d7b18_dc9e_4fbf_849b_ce7965c02b8b.slice/crio-a1fa322f911d1567265c8b7c2625d90cb34a363ad3a8b938a82501be1036ece6 WatchSource:0}: Error finding container a1fa322f911d1567265c8b7c2625d90cb34a363ad3a8b938a82501be1036ece6: Status 404 returned error can't find the container with id a1fa322f911d1567265c8b7c2625d90cb34a363ad3a8b938a82501be1036ece6 Apr 16 22:15:54.561397 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:54.561365 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-77c748cfd6-v4gc6" event={"ID":"2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b","Type":"ContainerStarted","Data":"a1fa322f911d1567265c8b7c2625d90cb34a363ad3a8b938a82501be1036ece6"} Apr 16 22:15:55.566329 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:55.566277 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-77c748cfd6-v4gc6" event={"ID":"2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b","Type":"ContainerStarted","Data":"514c368d474e0f561cb8b1fd09d6442687a59455c90b16405284f6c4bc35df2c"} Apr 16 22:15:55.566694 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:55.566368 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-77c748cfd6-v4gc6" Apr 16 22:15:55.580662 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:15:55.580612 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-77c748cfd6-v4gc6" podStartSLOduration=2.075477943 podStartE2EDuration="2.580599285s" 
podCreationTimestamp="2026-04-16 22:15:53 +0000 UTC" firstStartedPulling="2026-04-16 22:15:54.463335793 +0000 UTC m=+658.724061880" lastFinishedPulling="2026-04-16 22:15:54.968457127 +0000 UTC m=+659.229183222" observedRunningTime="2026-04-16 22:15:55.578945072 +0000 UTC m=+659.839671177" watchObservedRunningTime="2026-04-16 22:15:55.580599285 +0000 UTC m=+659.841325390" Apr 16 22:16:06.579154 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:06.579120 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-77c748cfd6-v4gc6" Apr 16 22:16:06.615951 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:06.615910 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d5f5c574-z6m9w"] Apr 16 22:16:06.616172 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:06.616136 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d5f5c574-z6m9w" podUID="cafdc936-a3cc-4936-8d73-4f24d1679386" containerName="manager" containerID="cri-o://4151107ed2d6bcbc8aacf57079c57b449e6e7d810456c95fa5220c4a0ca6f0bb" gracePeriod=10 Apr 16 22:16:06.862421 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:06.862401 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d5f5c574-z6m9w" Apr 16 22:16:06.900483 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:06.900457 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x96vs\" (UniqueName: \"kubernetes.io/projected/cafdc936-a3cc-4936-8d73-4f24d1679386-kube-api-access-x96vs\") pod \"cafdc936-a3cc-4936-8d73-4f24d1679386\" (UID: \"cafdc936-a3cc-4936-8d73-4f24d1679386\") " Apr 16 22:16:06.902451 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:06.902420 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cafdc936-a3cc-4936-8d73-4f24d1679386-kube-api-access-x96vs" (OuterVolumeSpecName: "kube-api-access-x96vs") pod "cafdc936-a3cc-4936-8d73-4f24d1679386" (UID: "cafdc936-a3cc-4936-8d73-4f24d1679386"). InnerVolumeSpecName "kube-api-access-x96vs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:16:07.000984 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:07.000950 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x96vs\" (UniqueName: \"kubernetes.io/projected/cafdc936-a3cc-4936-8d73-4f24d1679386-kube-api-access-x96vs\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:16:07.616807 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:07.616767 2576 generic.go:358] "Generic (PLEG): container finished" podID="cafdc936-a3cc-4936-8d73-4f24d1679386" containerID="4151107ed2d6bcbc8aacf57079c57b449e6e7d810456c95fa5220c4a0ca6f0bb" exitCode=0 Apr 16 22:16:07.617345 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:07.617299 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d5f5c574-z6m9w" event={"ID":"cafdc936-a3cc-4936-8d73-4f24d1679386","Type":"ContainerDied","Data":"4151107ed2d6bcbc8aacf57079c57b449e6e7d810456c95fa5220c4a0ca6f0bb"} Apr 16 22:16:07.617492 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:07.617470 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="opendatahub/maas-controller-6d5f5c574-z6m9w" event={"ID":"cafdc936-a3cc-4936-8d73-4f24d1679386","Type":"ContainerDied","Data":"df3d26afc5216b6476f877601bc9400e532f94cbe3e0a99ecba01ace8492b279"} Apr 16 22:16:07.617587 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:07.617506 2576 scope.go:117] "RemoveContainer" containerID="4151107ed2d6bcbc8aacf57079c57b449e6e7d810456c95fa5220c4a0ca6f0bb" Apr 16 22:16:07.617587 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:07.617556 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d5f5c574-z6m9w" Apr 16 22:16:07.626240 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:07.626224 2576 scope.go:117] "RemoveContainer" containerID="4151107ed2d6bcbc8aacf57079c57b449e6e7d810456c95fa5220c4a0ca6f0bb" Apr 16 22:16:07.626497 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:16:07.626479 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4151107ed2d6bcbc8aacf57079c57b449e6e7d810456c95fa5220c4a0ca6f0bb\": container with ID starting with 4151107ed2d6bcbc8aacf57079c57b449e6e7d810456c95fa5220c4a0ca6f0bb not found: ID does not exist" containerID="4151107ed2d6bcbc8aacf57079c57b449e6e7d810456c95fa5220c4a0ca6f0bb" Apr 16 22:16:07.626560 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:07.626503 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4151107ed2d6bcbc8aacf57079c57b449e6e7d810456c95fa5220c4a0ca6f0bb"} err="failed to get container status \"4151107ed2d6bcbc8aacf57079c57b449e6e7d810456c95fa5220c4a0ca6f0bb\": rpc error: code = NotFound desc = could not find container \"4151107ed2d6bcbc8aacf57079c57b449e6e7d810456c95fa5220c4a0ca6f0bb\": container with ID starting with 4151107ed2d6bcbc8aacf57079c57b449e6e7d810456c95fa5220c4a0ca6f0bb not found: ID does not exist" Apr 16 22:16:07.640949 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:07.640928 
2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d5f5c574-z6m9w"] Apr 16 22:16:07.643017 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:07.642999 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d5f5c574-z6m9w"] Apr 16 22:16:08.283259 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:08.283220 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cafdc936-a3cc-4936-8d73-4f24d1679386" path="/var/lib/kubelet/pods/cafdc936-a3cc-4936-8d73-4f24d1679386/volumes" Apr 16 22:16:16.757450 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.757418 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr"] Apr 16 22:16:16.757831 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.757808 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cafdc936-a3cc-4936-8d73-4f24d1679386" containerName="manager" Apr 16 22:16:16.757831 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.757824 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cafdc936-a3cc-4936-8d73-4f24d1679386" containerName="manager" Apr 16 22:16:16.757953 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.757918 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cafdc936-a3cc-4936-8d73-4f24d1679386" containerName="manager" Apr 16 22:16:16.765213 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.765193 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:16.768273 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.768249 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 16 22:16:16.768425 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.768275 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-khb59\"" Apr 16 22:16:16.768581 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.768555 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 16 22:16:16.769106 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.769080 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 16 22:16:16.771672 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.771649 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr"] Apr 16 22:16:16.784647 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.784626 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2facde92-691f-406e-a26c-a68c92e6c31c-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr\" (UID: \"2facde92-691f-406e-a26c-a68c92e6c31c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:16.784743 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.784653 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2facde92-691f-406e-a26c-a68c92e6c31c-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr\" (UID: \"2facde92-691f-406e-a26c-a68c92e6c31c\") " 
pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:16.784743 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.784676 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2facde92-691f-406e-a26c-a68c92e6c31c-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr\" (UID: \"2facde92-691f-406e-a26c-a68c92e6c31c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:16.784743 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.784725 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2facde92-691f-406e-a26c-a68c92e6c31c-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr\" (UID: \"2facde92-691f-406e-a26c-a68c92e6c31c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:16.784857 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.784747 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2facde92-691f-406e-a26c-a68c92e6c31c-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr\" (UID: \"2facde92-691f-406e-a26c-a68c92e6c31c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:16.784857 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.784770 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4lsg\" (UniqueName: \"kubernetes.io/projected/2facde92-691f-406e-a26c-a68c92e6c31c-kube-api-access-v4lsg\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr\" (UID: \"2facde92-691f-406e-a26c-a68c92e6c31c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 
22:16:16.885716 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.885689 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2facde92-691f-406e-a26c-a68c92e6c31c-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr\" (UID: \"2facde92-691f-406e-a26c-a68c92e6c31c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:16.885716 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.885718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2facde92-691f-406e-a26c-a68c92e6c31c-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr\" (UID: \"2facde92-691f-406e-a26c-a68c92e6c31c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:16.885900 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.885739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2facde92-691f-406e-a26c-a68c92e6c31c-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr\" (UID: \"2facde92-691f-406e-a26c-a68c92e6c31c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:16.885900 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.885777 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2facde92-691f-406e-a26c-a68c92e6c31c-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr\" (UID: \"2facde92-691f-406e-a26c-a68c92e6c31c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:16.885990 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.885896 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" 
(UniqueName: \"kubernetes.io/empty-dir/2facde92-691f-406e-a26c-a68c92e6c31c-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr\" (UID: \"2facde92-691f-406e-a26c-a68c92e6c31c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:16.885990 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.885946 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4lsg\" (UniqueName: \"kubernetes.io/projected/2facde92-691f-406e-a26c-a68c92e6c31c-kube-api-access-v4lsg\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr\" (UID: \"2facde92-691f-406e-a26c-a68c92e6c31c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:16.886128 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.886109 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2facde92-691f-406e-a26c-a68c92e6c31c-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr\" (UID: \"2facde92-691f-406e-a26c-a68c92e6c31c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:16.886189 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.886154 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2facde92-691f-406e-a26c-a68c92e6c31c-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr\" (UID: \"2facde92-691f-406e-a26c-a68c92e6c31c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:16.886189 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.886175 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2facde92-691f-406e-a26c-a68c92e6c31c-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr\" (UID: 
\"2facde92-691f-406e-a26c-a68c92e6c31c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:16.888068 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.888048 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2facde92-691f-406e-a26c-a68c92e6c31c-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr\" (UID: \"2facde92-691f-406e-a26c-a68c92e6c31c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:16.888251 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.888235 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2facde92-691f-406e-a26c-a68c92e6c31c-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr\" (UID: \"2facde92-691f-406e-a26c-a68c92e6c31c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:16.893079 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:16.893055 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4lsg\" (UniqueName: \"kubernetes.io/projected/2facde92-691f-406e-a26c-a68c92e6c31c-kube-api-access-v4lsg\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr\" (UID: \"2facde92-691f-406e-a26c-a68c92e6c31c\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:17.075196 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:17.075112 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:17.201227 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:17.201200 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr"] Apr 16 22:16:17.203451 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:16:17.203421 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2facde92_691f_406e_a26c_a68c92e6c31c.slice/crio-99785d1b50e3723e57747097367daef16698b237db4f77bb72d50d8320aa3008 WatchSource:0}: Error finding container 99785d1b50e3723e57747097367daef16698b237db4f77bb72d50d8320aa3008: Status 404 returned error can't find the container with id 99785d1b50e3723e57747097367daef16698b237db4f77bb72d50d8320aa3008 Apr 16 22:16:17.655916 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:17.655882 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" event={"ID":"2facde92-691f-406e-a26c-a68c92e6c31c","Type":"ContainerStarted","Data":"99785d1b50e3723e57747097367daef16698b237db4f77bb72d50d8320aa3008"} Apr 16 22:16:23.682411 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:23.682373 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" event={"ID":"2facde92-691f-406e-a26c-a68c92e6c31c","Type":"ContainerStarted","Data":"8b1095404e9fffc39662591b05b46de94fce1044199b3c43bc731875c8f8baad"} Apr 16 22:16:31.717608 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:31.717577 2576 generic.go:358] "Generic (PLEG): container finished" podID="2facde92-691f-406e-a26c-a68c92e6c31c" containerID="8b1095404e9fffc39662591b05b46de94fce1044199b3c43bc731875c8f8baad" exitCode=0 Apr 16 22:16:31.717953 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:31.717619 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" event={"ID":"2facde92-691f-406e-a26c-a68c92e6c31c","Type":"ContainerDied","Data":"8b1095404e9fffc39662591b05b46de94fce1044199b3c43bc731875c8f8baad"} Apr 16 22:16:35.744995 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:35.744878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" event={"ID":"2facde92-691f-406e-a26c-a68c92e6c31c","Type":"ContainerStarted","Data":"f10d7e2de39814c4a024cfe8989afb6da80397a3902f4c55ca728f7c77165695"} Apr 16 22:16:35.745389 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:35.745077 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:35.764209 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:35.764163 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" podStartSLOduration=1.596837643 podStartE2EDuration="19.764149903s" podCreationTimestamp="2026-04-16 22:16:16 +0000 UTC" firstStartedPulling="2026-04-16 22:16:17.20676051 +0000 UTC m=+681.467486610" lastFinishedPulling="2026-04-16 22:16:35.374072771 +0000 UTC m=+699.634798870" observedRunningTime="2026-04-16 22:16:35.761382227 +0000 UTC m=+700.022108332" watchObservedRunningTime="2026-04-16 22:16:35.764149903 +0000 UTC m=+700.024876008" Apr 16 22:16:41.565097 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.565066 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk"] Apr 16 22:16:41.624407 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.624374 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk"] Apr 16 22:16:41.624565 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.624494 2576 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:41.627417 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.627394 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 16 22:16:41.724323 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.724276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a03252c-b02c-423b-8f0d-a5f9f7a013e3-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk\" (UID: \"6a03252c-b02c-423b-8f0d-a5f9f7a013e3\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:41.724457 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.724328 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6a03252c-b02c-423b-8f0d-a5f9f7a013e3-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk\" (UID: \"6a03252c-b02c-423b-8f0d-a5f9f7a013e3\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:41.724457 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.724390 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6a03252c-b02c-423b-8f0d-a5f9f7a013e3-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk\" (UID: \"6a03252c-b02c-423b-8f0d-a5f9f7a013e3\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:41.724457 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.724429 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qtgsh\" (UniqueName: \"kubernetes.io/projected/6a03252c-b02c-423b-8f0d-a5f9f7a013e3-kube-api-access-qtgsh\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk\" (UID: \"6a03252c-b02c-423b-8f0d-a5f9f7a013e3\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:41.724457 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.724456 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a03252c-b02c-423b-8f0d-a5f9f7a013e3-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk\" (UID: \"6a03252c-b02c-423b-8f0d-a5f9f7a013e3\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:41.724596 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.724536 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6a03252c-b02c-423b-8f0d-a5f9f7a013e3-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk\" (UID: \"6a03252c-b02c-423b-8f0d-a5f9f7a013e3\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:41.824982 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.824920 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6a03252c-b02c-423b-8f0d-a5f9f7a013e3-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk\" (UID: \"6a03252c-b02c-423b-8f0d-a5f9f7a013e3\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:41.824982 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.824975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/6a03252c-b02c-423b-8f0d-a5f9f7a013e3-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk\" (UID: \"6a03252c-b02c-423b-8f0d-a5f9f7a013e3\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:41.825174 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.825010 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6a03252c-b02c-423b-8f0d-a5f9f7a013e3-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk\" (UID: \"6a03252c-b02c-423b-8f0d-a5f9f7a013e3\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:41.825174 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.825049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6a03252c-b02c-423b-8f0d-a5f9f7a013e3-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk\" (UID: \"6a03252c-b02c-423b-8f0d-a5f9f7a013e3\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:41.825174 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.825084 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtgsh\" (UniqueName: \"kubernetes.io/projected/6a03252c-b02c-423b-8f0d-a5f9f7a013e3-kube-api-access-qtgsh\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk\" (UID: \"6a03252c-b02c-423b-8f0d-a5f9f7a013e3\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:41.825174 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.825120 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a03252c-b02c-423b-8f0d-a5f9f7a013e3-model-cache\") pod 
\"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk\" (UID: \"6a03252c-b02c-423b-8f0d-a5f9f7a013e3\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:41.825424 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.825289 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6a03252c-b02c-423b-8f0d-a5f9f7a013e3-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk\" (UID: \"6a03252c-b02c-423b-8f0d-a5f9f7a013e3\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:41.825424 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.825394 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a03252c-b02c-423b-8f0d-a5f9f7a013e3-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk\" (UID: \"6a03252c-b02c-423b-8f0d-a5f9f7a013e3\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:41.825529 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.825442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6a03252c-b02c-423b-8f0d-a5f9f7a013e3-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk\" (UID: \"6a03252c-b02c-423b-8f0d-a5f9f7a013e3\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:41.827497 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.827466 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6a03252c-b02c-423b-8f0d-a5f9f7a013e3-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk\" (UID: \"6a03252c-b02c-423b-8f0d-a5f9f7a013e3\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:41.827675 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.827661 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6a03252c-b02c-423b-8f0d-a5f9f7a013e3-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk\" (UID: \"6a03252c-b02c-423b-8f0d-a5f9f7a013e3\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:41.833024 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.832998 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtgsh\" (UniqueName: \"kubernetes.io/projected/6a03252c-b02c-423b-8f0d-a5f9f7a013e3-kube-api-access-qtgsh\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk\" (UID: \"6a03252c-b02c-423b-8f0d-a5f9f7a013e3\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:41.938193 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:41.938161 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:42.066401 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:42.066377 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk"] Apr 16 22:16:42.069115 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:16:42.069088 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a03252c_b02c_423b_8f0d_a5f9f7a013e3.slice/crio-499dad3f9f60ee975645240b35d7d7ba10048adcd77dd5576ef82406636b7748 WatchSource:0}: Error finding container 499dad3f9f60ee975645240b35d7d7ba10048adcd77dd5576ef82406636b7748: Status 404 returned error can't find the container with id 499dad3f9f60ee975645240b35d7d7ba10048adcd77dd5576ef82406636b7748 Apr 16 22:16:42.774267 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:42.774227 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" event={"ID":"6a03252c-b02c-423b-8f0d-a5f9f7a013e3","Type":"ContainerStarted","Data":"aabfcdc299a07665bddf5fa1d738523dbba8d434bf3e8e0072779b682a66ec90"} Apr 16 22:16:42.774267 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:42.774266 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" event={"ID":"6a03252c-b02c-423b-8f0d-a5f9f7a013e3","Type":"ContainerStarted","Data":"499dad3f9f60ee975645240b35d7d7ba10048adcd77dd5576ef82406636b7748"} Apr 16 22:16:46.761816 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:46.761783 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr" Apr 16 22:16:47.793520 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:47.793415 2576 generic.go:358] "Generic (PLEG): container finished" podID="6a03252c-b02c-423b-8f0d-a5f9f7a013e3" 
containerID="aabfcdc299a07665bddf5fa1d738523dbba8d434bf3e8e0072779b682a66ec90" exitCode=0 Apr 16 22:16:47.793520 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:47.793484 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" event={"ID":"6a03252c-b02c-423b-8f0d-a5f9f7a013e3","Type":"ContainerDied","Data":"aabfcdc299a07665bddf5fa1d738523dbba8d434bf3e8e0072779b682a66ec90"} Apr 16 22:16:48.798786 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:48.798754 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" event={"ID":"6a03252c-b02c-423b-8f0d-a5f9f7a013e3","Type":"ContainerStarted","Data":"305cf881009269a3144774df642869a15f3332677af0ab720e0af055be2ebdbe"} Apr 16 22:16:48.799233 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:48.798977 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:16:48.816694 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:48.816651 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" podStartSLOduration=7.449223342 podStartE2EDuration="7.816638198s" podCreationTimestamp="2026-04-16 22:16:41 +0000 UTC" firstStartedPulling="2026-04-16 22:16:47.794181715 +0000 UTC m=+712.054907798" lastFinishedPulling="2026-04-16 22:16:48.161596572 +0000 UTC m=+712.422322654" observedRunningTime="2026-04-16 22:16:48.814905046 +0000 UTC m=+713.075631168" watchObservedRunningTime="2026-04-16 22:16:48.816638198 +0000 UTC m=+713.077364364" Apr 16 22:16:59.816015 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:16:59.815983 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk" Apr 16 22:19:02.045987 ip-10-0-130-26 
kubenswrapper[2576]: I0416 22:19:02.045953 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-77c748cfd6-v4gc6"] Apr 16 22:19:02.046503 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:02.046212 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-77c748cfd6-v4gc6" podUID="2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b" containerName="manager" containerID="cri-o://514c368d474e0f561cb8b1fd09d6442687a59455c90b16405284f6c4bc35df2c" gracePeriod=10 Apr 16 22:19:02.292021 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:02.291997 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-77c748cfd6-v4gc6" Apr 16 22:19:02.324405 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:02.324328 2576 generic.go:358] "Generic (PLEG): container finished" podID="2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b" containerID="514c368d474e0f561cb8b1fd09d6442687a59455c90b16405284f6c4bc35df2c" exitCode=0 Apr 16 22:19:02.324405 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:02.324393 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-77c748cfd6-v4gc6" Apr 16 22:19:02.324575 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:02.324400 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-77c748cfd6-v4gc6" event={"ID":"2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b","Type":"ContainerDied","Data":"514c368d474e0f561cb8b1fd09d6442687a59455c90b16405284f6c4bc35df2c"} Apr 16 22:19:02.324575 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:02.324445 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-77c748cfd6-v4gc6" event={"ID":"2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b","Type":"ContainerDied","Data":"a1fa322f911d1567265c8b7c2625d90cb34a363ad3a8b938a82501be1036ece6"} Apr 16 22:19:02.324575 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:02.324467 2576 scope.go:117] "RemoveContainer" containerID="514c368d474e0f561cb8b1fd09d6442687a59455c90b16405284f6c4bc35df2c" Apr 16 22:19:02.335109 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:02.335090 2576 scope.go:117] "RemoveContainer" containerID="514c368d474e0f561cb8b1fd09d6442687a59455c90b16405284f6c4bc35df2c" Apr 16 22:19:02.335383 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:19:02.335363 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"514c368d474e0f561cb8b1fd09d6442687a59455c90b16405284f6c4bc35df2c\": container with ID starting with 514c368d474e0f561cb8b1fd09d6442687a59455c90b16405284f6c4bc35df2c not found: ID does not exist" containerID="514c368d474e0f561cb8b1fd09d6442687a59455c90b16405284f6c4bc35df2c" Apr 16 22:19:02.335459 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:02.335396 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"514c368d474e0f561cb8b1fd09d6442687a59455c90b16405284f6c4bc35df2c"} err="failed to get container status \"514c368d474e0f561cb8b1fd09d6442687a59455c90b16405284f6c4bc35df2c\": rpc error: code = 
NotFound desc = could not find container \"514c368d474e0f561cb8b1fd09d6442687a59455c90b16405284f6c4bc35df2c\": container with ID starting with 514c368d474e0f561cb8b1fd09d6442687a59455c90b16405284f6c4bc35df2c not found: ID does not exist" Apr 16 22:19:02.355189 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:02.355165 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r89k\" (UniqueName: \"kubernetes.io/projected/2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b-kube-api-access-2r89k\") pod \"2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b\" (UID: \"2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b\") " Apr 16 22:19:02.357079 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:02.357054 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b-kube-api-access-2r89k" (OuterVolumeSpecName: "kube-api-access-2r89k") pod "2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b" (UID: "2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b"). InnerVolumeSpecName "kube-api-access-2r89k". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:19:02.456694 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:02.456660 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2r89k\" (UniqueName: \"kubernetes.io/projected/2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b-kube-api-access-2r89k\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:19:02.645217 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:02.645192 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-77c748cfd6-v4gc6"] Apr 16 22:19:02.648665 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:02.648635 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-77c748cfd6-v4gc6"] Apr 16 22:19:03.356875 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:03.356840 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-77c748cfd6-chdx8"] Apr 16 22:19:03.357342 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:03.357228 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b" containerName="manager" Apr 16 22:19:03.357342 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:03.357240 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b" containerName="manager" Apr 16 22:19:03.357439 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:03.357347 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b" containerName="manager" Apr 16 22:19:03.361783 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:03.361765 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-77c748cfd6-chdx8" Apr 16 22:19:03.364450 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:03.364429 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-rt5bg\"" Apr 16 22:19:03.367373 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:03.367351 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-77c748cfd6-chdx8"] Apr 16 22:19:03.465645 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:03.465618 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hqx8\" (UniqueName: \"kubernetes.io/projected/d34ab5d4-de5c-423a-b625-de87870833eb-kube-api-access-6hqx8\") pod \"maas-controller-77c748cfd6-chdx8\" (UID: \"d34ab5d4-de5c-423a-b625-de87870833eb\") " pod="opendatahub/maas-controller-77c748cfd6-chdx8" Apr 16 22:19:03.566361 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:03.566334 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6hqx8\" (UniqueName: \"kubernetes.io/projected/d34ab5d4-de5c-423a-b625-de87870833eb-kube-api-access-6hqx8\") pod \"maas-controller-77c748cfd6-chdx8\" (UID: \"d34ab5d4-de5c-423a-b625-de87870833eb\") " pod="opendatahub/maas-controller-77c748cfd6-chdx8" Apr 16 22:19:03.574633 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:03.574609 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hqx8\" (UniqueName: \"kubernetes.io/projected/d34ab5d4-de5c-423a-b625-de87870833eb-kube-api-access-6hqx8\") pod \"maas-controller-77c748cfd6-chdx8\" (UID: \"d34ab5d4-de5c-423a-b625-de87870833eb\") " pod="opendatahub/maas-controller-77c748cfd6-chdx8" Apr 16 22:19:03.673843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:03.673778 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-77c748cfd6-chdx8" Apr 16 22:19:03.796325 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:03.796279 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-77c748cfd6-chdx8"] Apr 16 22:19:03.797654 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:19:03.797629 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd34ab5d4_de5c_423a_b625_de87870833eb.slice/crio-dba9f21d6f78d9f3bfa30f02985ef2faca1958de6976889c479f609c16e9e244 WatchSource:0}: Error finding container dba9f21d6f78d9f3bfa30f02985ef2faca1958de6976889c479f609c16e9e244: Status 404 returned error can't find the container with id dba9f21d6f78d9f3bfa30f02985ef2faca1958de6976889c479f609c16e9e244 Apr 16 22:19:04.283170 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:04.283142 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b" path="/var/lib/kubelet/pods/2d7d7b18-dc9e-4fbf-849b-ce7965c02b8b/volumes" Apr 16 22:19:04.334536 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:04.334501 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-77c748cfd6-chdx8" event={"ID":"d34ab5d4-de5c-423a-b625-de87870833eb","Type":"ContainerStarted","Data":"354cd65b67c1b166da0cbafb225e3a833bab576f8857eaef1a231d9b3bac603b"} Apr 16 22:19:04.334680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:04.334544 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-77c748cfd6-chdx8" event={"ID":"d34ab5d4-de5c-423a-b625-de87870833eb","Type":"ContainerStarted","Data":"dba9f21d6f78d9f3bfa30f02985ef2faca1958de6976889c479f609c16e9e244"} Apr 16 22:19:04.334680 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:04.334580 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-77c748cfd6-chdx8" Apr 16 22:19:04.349060 
ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:04.349007 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-77c748cfd6-chdx8" podStartSLOduration=0.916671473 podStartE2EDuration="1.348988221s" podCreationTimestamp="2026-04-16 22:19:03 +0000 UTC" firstStartedPulling="2026-04-16 22:19:03.799035913 +0000 UTC m=+848.059761997" lastFinishedPulling="2026-04-16 22:19:04.231352655 +0000 UTC m=+848.492078745" observedRunningTime="2026-04-16 22:19:04.347240293 +0000 UTC m=+848.607966402" watchObservedRunningTime="2026-04-16 22:19:04.348988221 +0000 UTC m=+848.609714327" Apr 16 22:19:15.343347 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:15.343294 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-77c748cfd6-chdx8" Apr 16 22:19:56.218613 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:56.218544 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/2.log" Apr 16 22:19:56.219716 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:19:56.219692 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/2.log" Apr 16 22:24:56.250109 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:24:56.250080 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/2.log" Apr 16 22:24:56.251457 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:24:56.251436 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/2.log" Apr 16 22:29:11.365048 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:11.365009 2576 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6"] Apr 16 22:29:11.367541 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:11.365283 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6" podUID="30802795-7f45-4e9c-9470-70dca27eca94" containerName="manager" containerID="cri-o://9eff2c2c3972c11e6e2baae921d94cd8bc73a29f3c7c7e7e5f29e81cc5272d19" gracePeriod=10 Apr 16 22:29:12.013869 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:12.013848 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6" Apr 16 22:29:12.144718 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:12.144691 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/30802795-7f45-4e9c-9470-70dca27eca94-extensions-socket-volume\") pod \"30802795-7f45-4e9c-9470-70dca27eca94\" (UID: \"30802795-7f45-4e9c-9470-70dca27eca94\") " Apr 16 22:29:12.144718 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:12.144720 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9xz8\" (UniqueName: \"kubernetes.io/projected/30802795-7f45-4e9c-9470-70dca27eca94-kube-api-access-f9xz8\") pod \"30802795-7f45-4e9c-9470-70dca27eca94\" (UID: \"30802795-7f45-4e9c-9470-70dca27eca94\") " Apr 16 22:29:12.145044 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:12.145022 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30802795-7f45-4e9c-9470-70dca27eca94-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "30802795-7f45-4e9c-9470-70dca27eca94" (UID: "30802795-7f45-4e9c-9470-70dca27eca94"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 22:29:12.146844 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:12.146812 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30802795-7f45-4e9c-9470-70dca27eca94-kube-api-access-f9xz8" (OuterVolumeSpecName: "kube-api-access-f9xz8") pod "30802795-7f45-4e9c-9470-70dca27eca94" (UID: "30802795-7f45-4e9c-9470-70dca27eca94"). InnerVolumeSpecName "kube-api-access-f9xz8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 22:29:12.246485 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:12.246445 2576 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/30802795-7f45-4e9c-9470-70dca27eca94-extensions-socket-volume\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:29:12.246485 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:12.246483 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f9xz8\" (UniqueName: \"kubernetes.io/projected/30802795-7f45-4e9c-9470-70dca27eca94-kube-api-access-f9xz8\") on node \"ip-10-0-130-26.ec2.internal\" DevicePath \"\"" Apr 16 22:29:12.649358 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:12.649295 2576 generic.go:358] "Generic (PLEG): container finished" podID="30802795-7f45-4e9c-9470-70dca27eca94" containerID="9eff2c2c3972c11e6e2baae921d94cd8bc73a29f3c7c7e7e5f29e81cc5272d19" exitCode=0 Apr 16 22:29:12.649826 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:12.649393 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6" Apr 16 22:29:12.649826 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:12.649408 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6" event={"ID":"30802795-7f45-4e9c-9470-70dca27eca94","Type":"ContainerDied","Data":"9eff2c2c3972c11e6e2baae921d94cd8bc73a29f3c7c7e7e5f29e81cc5272d19"} Apr 16 22:29:12.649826 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:12.649464 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6" event={"ID":"30802795-7f45-4e9c-9470-70dca27eca94","Type":"ContainerDied","Data":"9f226ce2326b28862e755fb5c4caf8941cca99ec03185916f12017da0ec6515f"} Apr 16 22:29:12.649826 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:12.649486 2576 scope.go:117] "RemoveContainer" containerID="9eff2c2c3972c11e6e2baae921d94cd8bc73a29f3c7c7e7e5f29e81cc5272d19" Apr 16 22:29:12.659047 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:12.659023 2576 scope.go:117] "RemoveContainer" containerID="9eff2c2c3972c11e6e2baae921d94cd8bc73a29f3c7c7e7e5f29e81cc5272d19" Apr 16 22:29:12.659357 ip-10-0-130-26 kubenswrapper[2576]: E0416 22:29:12.659337 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9eff2c2c3972c11e6e2baae921d94cd8bc73a29f3c7c7e7e5f29e81cc5272d19\": container with ID starting with 9eff2c2c3972c11e6e2baae921d94cd8bc73a29f3c7c7e7e5f29e81cc5272d19 not found: ID does not exist" containerID="9eff2c2c3972c11e6e2baae921d94cd8bc73a29f3c7c7e7e5f29e81cc5272d19" Apr 16 22:29:12.659410 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:12.659368 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9eff2c2c3972c11e6e2baae921d94cd8bc73a29f3c7c7e7e5f29e81cc5272d19"} err="failed to get container status 
\"9eff2c2c3972c11e6e2baae921d94cd8bc73a29f3c7c7e7e5f29e81cc5272d19\": rpc error: code = NotFound desc = could not find container \"9eff2c2c3972c11e6e2baae921d94cd8bc73a29f3c7c7e7e5f29e81cc5272d19\": container with ID starting with 9eff2c2c3972c11e6e2baae921d94cd8bc73a29f3c7c7e7e5f29e81cc5272d19 not found: ID does not exist" Apr 16 22:29:12.666776 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:12.666749 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6"] Apr 16 22:29:12.671440 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:12.671421 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rp8q6"] Apr 16 22:29:14.283001 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:14.282962 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30802795-7f45-4e9c-9470-70dca27eca94" path="/var/lib/kubelet/pods/30802795-7f45-4e9c-9470-70dca27eca94/volumes" Apr 16 22:29:56.280177 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:56.280147 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/2.log" Apr 16 22:29:56.283178 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:29:56.283155 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/2.log" Apr 16 22:30:17.437372 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:17.437334 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt"] Apr 16 22:30:17.437850 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:17.437730 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30802795-7f45-4e9c-9470-70dca27eca94" containerName="manager" Apr 16 
22:30:17.437850 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:17.437741 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="30802795-7f45-4e9c-9470-70dca27eca94" containerName="manager" Apr 16 22:30:17.437850 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:17.437806 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="30802795-7f45-4e9c-9470-70dca27eca94" containerName="manager" Apr 16 22:30:17.440792 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:17.440776 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt" Apr 16 22:30:17.444495 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:17.444479 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-kvhzx\"" Apr 16 22:30:17.457848 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:17.457820 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt"] Apr 16 22:30:17.501988 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:17.501954 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pdmt\" (UniqueName: \"kubernetes.io/projected/6e36fad6-77e3-481f-8a6d-5f6ea4af86ea-kube-api-access-6pdmt\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt\" (UID: \"6e36fad6-77e3-481f-8a6d-5f6ea4af86ea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt" Apr 16 22:30:17.502162 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:17.502097 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6e36fad6-77e3-481f-8a6d-5f6ea4af86ea-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt\" (UID: 
\"6e36fad6-77e3-481f-8a6d-5f6ea4af86ea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt" Apr 16 22:30:17.603136 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:17.603095 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pdmt\" (UniqueName: \"kubernetes.io/projected/6e36fad6-77e3-481f-8a6d-5f6ea4af86ea-kube-api-access-6pdmt\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt\" (UID: \"6e36fad6-77e3-481f-8a6d-5f6ea4af86ea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt" Apr 16 22:30:17.603331 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:17.603199 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6e36fad6-77e3-481f-8a6d-5f6ea4af86ea-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt\" (UID: \"6e36fad6-77e3-481f-8a6d-5f6ea4af86ea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt" Apr 16 22:30:17.603614 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:17.603597 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6e36fad6-77e3-481f-8a6d-5f6ea4af86ea-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt\" (UID: \"6e36fad6-77e3-481f-8a6d-5f6ea4af86ea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt" Apr 16 22:30:17.610437 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:17.610410 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pdmt\" (UniqueName: \"kubernetes.io/projected/6e36fad6-77e3-481f-8a6d-5f6ea4af86ea-kube-api-access-6pdmt\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt\" (UID: \"6e36fad6-77e3-481f-8a6d-5f6ea4af86ea\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt" Apr 16 22:30:17.750607 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:17.750526 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt" Apr 16 22:30:17.881657 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:17.881629 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt"] Apr 16 22:30:17.884094 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:30:17.884066 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e36fad6_77e3_481f_8a6d_5f6ea4af86ea.slice/crio-d32529181f9ac48404100c79f2e1f568c466cefae47614e0d1538957bae845c1 WatchSource:0}: Error finding container d32529181f9ac48404100c79f2e1f568c466cefae47614e0d1538957bae845c1: Status 404 returned error can't find the container with id d32529181f9ac48404100c79f2e1f568c466cefae47614e0d1538957bae845c1 Apr 16 22:30:17.886430 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:17.886416 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 22:30:17.901886 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:17.901859 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt" event={"ID":"6e36fad6-77e3-481f-8a6d-5f6ea4af86ea","Type":"ContainerStarted","Data":"d32529181f9ac48404100c79f2e1f568c466cefae47614e0d1538957bae845c1"} Apr 16 22:30:18.906720 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:18.906684 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt" event={"ID":"6e36fad6-77e3-481f-8a6d-5f6ea4af86ea","Type":"ContainerStarted","Data":"829c6f2b495ed2aede71a6712bf38d3a20ebe5ac2bc8ff923d25a0ab5a8f2d5a"} Apr 16 
22:30:18.907194 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:18.906801 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt"
Apr 16 22:30:18.928575 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:18.928519 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt" podStartSLOduration=1.928501009 podStartE2EDuration="1.928501009s" podCreationTimestamp="2026-04-16 22:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:30:18.924765507 +0000 UTC m=+1523.185491612" watchObservedRunningTime="2026-04-16 22:30:18.928501009 +0000 UTC m=+1523.189227119"
Apr 16 22:30:29.912351 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:30:29.912299 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt"
Apr 16 22:34:56.311142 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:34:56.311113 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/2.log"
Apr 16 22:34:56.312994 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:34:56.312974 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/2.log"
Apr 16 22:39:56.340665 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:39:56.340638 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/2.log"
Apr 16 22:39:56.343745 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:39:56.343715 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/2.log"
Apr 16 22:39:58.602873 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:39:58.602825 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-5b8d76df79-8hw22_e833b21b-3807-49ab-9889-2cf5e77ad763/maas-api/0.log"
Apr 16 22:39:58.725706 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:39:58.725671 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-77c748cfd6-chdx8_d34ab5d4-de5c-423a-b625-de87870833eb/manager/0.log"
Apr 16 22:39:59.069924 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:39:59.069840 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-674f8cc5cf-vprm4_eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99/manager/0.log"
Apr 16 22:39:59.315819 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:39:59.315780 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-gwkmp_220c566b-3d68-44b0-9d48-2b4ef15781a1/postgres/0.log"
Apr 16 22:40:00.054136 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:00.054109 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h_44c36f49-c527-45ce-b977-372564bbf626/util/0.log"
Apr 16 22:40:00.060349 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:00.060326 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h_44c36f49-c527-45ce-b977-372564bbf626/pull/0.log"
Apr 16 22:40:00.066173 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:00.066151 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h_44c36f49-c527-45ce-b977-372564bbf626/extract/0.log"
Apr 16 22:40:00.176493 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:00.176473 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg_ab32208f-6560-4abb-83f2-8cb550dbed35/util/0.log"
Apr 16 22:40:00.182825 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:00.182804 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg_ab32208f-6560-4abb-83f2-8cb550dbed35/pull/0.log"
Apr 16 22:40:00.188793 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:00.188766 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg_ab32208f-6560-4abb-83f2-8cb550dbed35/extract/0.log"
Apr 16 22:40:00.296241 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:00.296216 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt_016deeab-866e-45c0-ba71-db4cd5f9149b/util/0.log"
Apr 16 22:40:00.302461 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:00.302436 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt_016deeab-866e-45c0-ba71-db4cd5f9149b/pull/0.log"
Apr 16 22:40:00.308631 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:00.308562 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt_016deeab-866e-45c0-ba71-db4cd5f9149b/extract/0.log"
Apr 16 22:40:00.414116 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:00.414087 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7_13dcea9f-b75a-429a-bd14-f3a952ab88c9/util/0.log"
Apr 16 22:40:00.420334 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:00.420291 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7_13dcea9f-b75a-429a-bd14-f3a952ab88c9/pull/0.log"
Apr 16 22:40:00.426684 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:00.426657 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7_13dcea9f-b75a-429a-bd14-f3a952ab88c9/extract/0.log"
Apr 16 22:40:00.771086 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:00.771060 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-65cbp_131861e7-abf2-48d6-a9d5-136cf8543227/manager/0.log"
Apr 16 22:40:00.874145 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:00.874118 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-cw8r9_3dce1467-c2b9-482d-baf3-6f1e2bcf237c/kuadrant-console-plugin/0.log"
Apr 16 22:40:00.998047 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:00.998022 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-5n2k9_5526f190-7cc2-4eb3-9a88-a2565891302d/registry-server/0.log"
Apr 16 22:40:01.118618 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:01.118587 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt_6e36fad6-77e3-481f-8a6d-5f6ea4af86ea/manager/0.log"
Apr 16 22:40:01.678299 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:01.678267 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw_37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd/istio-proxy/0.log"
Apr 16 22:40:02.155742 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:02.155711 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-gb8pw_66b10bf8-2a76-488d-9eaa-9f7369c48145/istio-proxy/0.log"
Apr 16 22:40:02.262691 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:02.262661 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-567b8b8c4d-864ph_51846908-299c-4e1f-b417-c0af1029a45f/router/0.log"
Apr 16 22:40:02.944827 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:02.944799 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk_6a03252c-b02c-423b-8f0d-a5f9f7a013e3/storage-initializer/0.log"
Apr 16 22:40:02.952046 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:02.952022 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccc49fk_6a03252c-b02c-423b-8f0d-a5f9f7a013e3/main/0.log"
Apr 16 22:40:03.184632 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:03.184605 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr_2facde92-691f-406e-a26c-a68c92e6c31c/storage-initializer/0.log"
Apr 16 22:40:03.192426 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:03.192402 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-4jxdr_2facde92-691f-406e-a26c-a68c92e6c31c/main/0.log"
Apr 16 22:40:06.895507 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:06.895475 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wq7jn/must-gather-9fqbm"]
Apr 16 22:40:06.899270 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:06.899250 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wq7jn/must-gather-9fqbm"
Apr 16 22:40:06.901609 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:06.901586 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wq7jn\"/\"kube-root-ca.crt\""
Apr 16 22:40:06.901705 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:06.901586 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wq7jn\"/\"openshift-service-ca.crt\""
Apr 16 22:40:06.902226 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:06.902208 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wq7jn\"/\"default-dockercfg-9fxhp\""
Apr 16 22:40:06.913250 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:06.913228 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wq7jn/must-gather-9fqbm"]
Apr 16 22:40:06.948687 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:06.948661 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2vmw\" (UniqueName: \"kubernetes.io/projected/5577fc49-ce32-40e2-bc49-bb0849a4ecbc-kube-api-access-n2vmw\") pod \"must-gather-9fqbm\" (UID: \"5577fc49-ce32-40e2-bc49-bb0849a4ecbc\") " pod="openshift-must-gather-wq7jn/must-gather-9fqbm"
Apr 16 22:40:06.948821 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:06.948800 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5577fc49-ce32-40e2-bc49-bb0849a4ecbc-must-gather-output\") pod \"must-gather-9fqbm\" (UID: \"5577fc49-ce32-40e2-bc49-bb0849a4ecbc\") " pod="openshift-must-gather-wq7jn/must-gather-9fqbm"
Apr 16 22:40:07.049366 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:07.049328 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2vmw\" (UniqueName: \"kubernetes.io/projected/5577fc49-ce32-40e2-bc49-bb0849a4ecbc-kube-api-access-n2vmw\") pod \"must-gather-9fqbm\" (UID: \"5577fc49-ce32-40e2-bc49-bb0849a4ecbc\") " pod="openshift-must-gather-wq7jn/must-gather-9fqbm"
Apr 16 22:40:07.049545 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:07.049465 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5577fc49-ce32-40e2-bc49-bb0849a4ecbc-must-gather-output\") pod \"must-gather-9fqbm\" (UID: \"5577fc49-ce32-40e2-bc49-bb0849a4ecbc\") " pod="openshift-must-gather-wq7jn/must-gather-9fqbm"
Apr 16 22:40:07.049838 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:07.049821 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5577fc49-ce32-40e2-bc49-bb0849a4ecbc-must-gather-output\") pod \"must-gather-9fqbm\" (UID: \"5577fc49-ce32-40e2-bc49-bb0849a4ecbc\") " pod="openshift-must-gather-wq7jn/must-gather-9fqbm"
Apr 16 22:40:07.058093 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:07.058067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2vmw\" (UniqueName: \"kubernetes.io/projected/5577fc49-ce32-40e2-bc49-bb0849a4ecbc-kube-api-access-n2vmw\") pod \"must-gather-9fqbm\" (UID: \"5577fc49-ce32-40e2-bc49-bb0849a4ecbc\") " pod="openshift-must-gather-wq7jn/must-gather-9fqbm"
Apr 16 22:40:07.209159 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:07.209078 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wq7jn/must-gather-9fqbm"
Apr 16 22:40:07.333981 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:07.333948 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wq7jn/must-gather-9fqbm"]
Apr 16 22:40:07.338741 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:40:07.338711 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5577fc49_ce32_40e2_bc49_bb0849a4ecbc.slice/crio-d3bfbf8409a202b96c2d84680ac4695c6290a471cf39384edb53e40254270ccf WatchSource:0}: Error finding container d3bfbf8409a202b96c2d84680ac4695c6290a471cf39384edb53e40254270ccf: Status 404 returned error can't find the container with id d3bfbf8409a202b96c2d84680ac4695c6290a471cf39384edb53e40254270ccf
Apr 16 22:40:07.340723 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:07.340705 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 22:40:08.150138 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:08.150106 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wq7jn/must-gather-9fqbm" event={"ID":"5577fc49-ce32-40e2-bc49-bb0849a4ecbc","Type":"ContainerStarted","Data":"d3bfbf8409a202b96c2d84680ac4695c6290a471cf39384edb53e40254270ccf"}
Apr 16 22:40:09.158926 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:09.158887 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wq7jn/must-gather-9fqbm" event={"ID":"5577fc49-ce32-40e2-bc49-bb0849a4ecbc","Type":"ContainerStarted","Data":"5df5a11e27359d8e5143ec1c7b73881a0694d3c07ecb5c2457987b054427a088"}
Apr 16 22:40:09.159443 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:09.159212 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wq7jn/must-gather-9fqbm" event={"ID":"5577fc49-ce32-40e2-bc49-bb0849a4ecbc","Type":"ContainerStarted","Data":"c284d250bfe68cd1fbe93d43b6861bc26fe884586087ce9a75f702324c3f3ab4"}
Apr 16 22:40:09.176580 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:09.176528 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wq7jn/must-gather-9fqbm" podStartSLOduration=2.3162677289999998 podStartE2EDuration="3.176512592s" podCreationTimestamp="2026-04-16 22:40:06 +0000 UTC" firstStartedPulling="2026-04-16 22:40:07.340848397 +0000 UTC m=+2111.601574480" lastFinishedPulling="2026-04-16 22:40:08.201093242 +0000 UTC m=+2112.461819343" observedRunningTime="2026-04-16 22:40:09.174467041 +0000 UTC m=+2113.435193143" watchObservedRunningTime="2026-04-16 22:40:09.176512592 +0000 UTC m=+2113.437238743"
Apr 16 22:40:09.822541 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:09.822509 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gjxcr_9d43755a-3076-45ec-8abc-1fa99470f09b/global-pull-secret-syncer/0.log"
Apr 16 22:40:09.875485 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:09.875454 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7qvcm_0982ea7f-a131-4caf-8792-d0c2c1bf4089/konnectivity-agent/0.log"
Apr 16 22:40:09.979939 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:09.979905 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-26.ec2.internal_be5bbbaa384543c3c03d64000cc8b573/haproxy/0.log"
Apr 16 22:40:13.851728 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:13.851665 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h_44c36f49-c527-45ce-b977-372564bbf626/extract/0.log"
Apr 16 22:40:13.875229 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:13.875201 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h_44c36f49-c527-45ce-b977-372564bbf626/util/0.log"
Apr 16 22:40:13.899501 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:13.899476 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b7599k42h_44c36f49-c527-45ce-b977-372564bbf626/pull/0.log"
Apr 16 22:40:13.923466 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:13.923435 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg_ab32208f-6560-4abb-83f2-8cb550dbed35/extract/0.log"
Apr 16 22:40:13.949278 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:13.949249 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg_ab32208f-6560-4abb-83f2-8cb550dbed35/util/0.log"
Apr 16 22:40:13.971040 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:13.970972 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0tbstg_ab32208f-6560-4abb-83f2-8cb550dbed35/pull/0.log"
Apr 16 22:40:14.002502 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:14.002470 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt_016deeab-866e-45c0-ba71-db4cd5f9149b/extract/0.log"
Apr 16 22:40:14.024167 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:14.024133 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt_016deeab-866e-45c0-ba71-db4cd5f9149b/util/0.log"
Apr 16 22:40:14.052456 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:14.052421 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed739v5bt_016deeab-866e-45c0-ba71-db4cd5f9149b/pull/0.log"
Apr 16 22:40:14.082137 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:14.082103 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7_13dcea9f-b75a-429a-bd14-f3a952ab88c9/extract/0.log"
Apr 16 22:40:14.107596 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:14.107531 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7_13dcea9f-b75a-429a-bd14-f3a952ab88c9/util/0.log"
Apr 16 22:40:14.129468 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:14.129441 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1nv5b7_13dcea9f-b75a-429a-bd14-f3a952ab88c9/pull/0.log"
Apr 16 22:40:14.384462 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:14.384370 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-65cbp_131861e7-abf2-48d6-a9d5-136cf8543227/manager/0.log"
Apr 16 22:40:14.415101 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:14.415070 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-cw8r9_3dce1467-c2b9-482d-baf3-6f1e2bcf237c/kuadrant-console-plugin/0.log"
Apr 16 22:40:14.457409 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:14.457381 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-5n2k9_5526f190-7cc2-4eb3-9a88-a2565891302d/registry-server/0.log"
Apr 16 22:40:14.547227 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:14.547191 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-rj4lt_6e36fad6-77e3-481f-8a6d-5f6ea4af86ea/manager/0.log"
Apr 16 22:40:16.078167 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:16.078131 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-dsspm_d7ac4c50-4d27-4e9e-8655-01b821d74833/kube-state-metrics/0.log"
Apr 16 22:40:16.101604 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:16.101583 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-dsspm_d7ac4c50-4d27-4e9e-8655-01b821d74833/kube-rbac-proxy-main/0.log"
Apr 16 22:40:16.128420 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:16.128334 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-dsspm_d7ac4c50-4d27-4e9e-8655-01b821d74833/kube-rbac-proxy-self/0.log"
Apr 16 22:40:16.196214 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:16.196164 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-zxhlm_47b14de1-d053-4a3b-9584-2febe7614435/monitoring-plugin/0.log"
Apr 16 22:40:16.442475 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:16.442440 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fmjz2_d91998a7-1fa5-477b-8779-9c8df24c3680/node-exporter/0.log"
Apr 16 22:40:16.466618 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:16.466563 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fmjz2_d91998a7-1fa5-477b-8779-9c8df24c3680/kube-rbac-proxy/0.log"
Apr 16 22:40:16.490036 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:16.490007 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-fmjz2_d91998a7-1fa5-477b-8779-9c8df24c3680/init-textfile/0.log"
Apr 16 22:40:16.799331 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:16.799240 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-t8f88_54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3/prometheus-operator/0.log"
Apr 16 22:40:16.818807 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:16.818778 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-t8f88_54ee9079-d4a2-4fab-a8d3-2b0d9cc43da3/kube-rbac-proxy/0.log"
Apr 16 22:40:18.419800 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.419765 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"]
Apr 16 22:40:18.424668 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.424646 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"
Apr 16 22:40:18.432470 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.432435 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"]
Apr 16 22:40:18.583261 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.583223 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a4c4d11a-ee42-4522-abd8-177cc0005f27-lib-modules\") pod \"perf-node-gather-daemonset-xjl25\" (UID: \"a4c4d11a-ee42-4522-abd8-177cc0005f27\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"
Apr 16 22:40:18.583472 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.583273 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a4c4d11a-ee42-4522-abd8-177cc0005f27-proc\") pod \"perf-node-gather-daemonset-xjl25\" (UID: \"a4c4d11a-ee42-4522-abd8-177cc0005f27\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"
Apr 16 22:40:18.583472 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.583349 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlwlp\" (UniqueName: \"kubernetes.io/projected/a4c4d11a-ee42-4522-abd8-177cc0005f27-kube-api-access-wlwlp\") pod \"perf-node-gather-daemonset-xjl25\" (UID: \"a4c4d11a-ee42-4522-abd8-177cc0005f27\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"
Apr 16 22:40:18.583472 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.583464 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4c4d11a-ee42-4522-abd8-177cc0005f27-sys\") pod \"perf-node-gather-daemonset-xjl25\" (UID: \"a4c4d11a-ee42-4522-abd8-177cc0005f27\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"
Apr 16 22:40:18.583651 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.583570 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a4c4d11a-ee42-4522-abd8-177cc0005f27-podres\") pod \"perf-node-gather-daemonset-xjl25\" (UID: \"a4c4d11a-ee42-4522-abd8-177cc0005f27\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"
Apr 16 22:40:18.643145 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.643099 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/2.log"
Apr 16 22:40:18.652501 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.652475 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fcfwt_ad6a094a-d7bf-42af-90ae-94731039404b/console-operator/3.log"
Apr 16 22:40:18.684838 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.684747 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlwlp\" (UniqueName: \"kubernetes.io/projected/a4c4d11a-ee42-4522-abd8-177cc0005f27-kube-api-access-wlwlp\") pod \"perf-node-gather-daemonset-xjl25\" (UID: \"a4c4d11a-ee42-4522-abd8-177cc0005f27\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"
Apr 16 22:40:18.685136 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.685118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4c4d11a-ee42-4522-abd8-177cc0005f27-sys\") pod \"perf-node-gather-daemonset-xjl25\" (UID: \"a4c4d11a-ee42-4522-abd8-177cc0005f27\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"
Apr 16 22:40:18.685399 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.685384 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4c4d11a-ee42-4522-abd8-177cc0005f27-sys\") pod \"perf-node-gather-daemonset-xjl25\" (UID: \"a4c4d11a-ee42-4522-abd8-177cc0005f27\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"
Apr 16 22:40:18.685705 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.685687 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a4c4d11a-ee42-4522-abd8-177cc0005f27-podres\") pod \"perf-node-gather-daemonset-xjl25\" (UID: \"a4c4d11a-ee42-4522-abd8-177cc0005f27\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"
Apr 16 22:40:18.685843 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.685552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a4c4d11a-ee42-4522-abd8-177cc0005f27-podres\") pod \"perf-node-gather-daemonset-xjl25\" (UID: \"a4c4d11a-ee42-4522-abd8-177cc0005f27\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"
Apr 16 22:40:18.686041 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.686023 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a4c4d11a-ee42-4522-abd8-177cc0005f27-lib-modules\") pod \"perf-node-gather-daemonset-xjl25\" (UID: \"a4c4d11a-ee42-4522-abd8-177cc0005f27\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"
Apr 16 22:40:18.686159 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.686147 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a4c4d11a-ee42-4522-abd8-177cc0005f27-proc\") pod \"perf-node-gather-daemonset-xjl25\" (UID: \"a4c4d11a-ee42-4522-abd8-177cc0005f27\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"
Apr 16 22:40:18.686444 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.686427 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a4c4d11a-ee42-4522-abd8-177cc0005f27-proc\") pod \"perf-node-gather-daemonset-xjl25\" (UID: \"a4c4d11a-ee42-4522-abd8-177cc0005f27\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"
Apr 16 22:40:18.686630 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.686581 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a4c4d11a-ee42-4522-abd8-177cc0005f27-lib-modules\") pod \"perf-node-gather-daemonset-xjl25\" (UID: \"a4c4d11a-ee42-4522-abd8-177cc0005f27\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"
Apr 16 22:40:18.693708 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.693670 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlwlp\" (UniqueName: \"kubernetes.io/projected/a4c4d11a-ee42-4522-abd8-177cc0005f27-kube-api-access-wlwlp\") pod \"perf-node-gather-daemonset-xjl25\" (UID: \"a4c4d11a-ee42-4522-abd8-177cc0005f27\") " pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"
Apr 16 22:40:18.739349 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:18.739298 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"
Apr 16 22:40:19.092111 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:19.092008 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"]
Apr 16 22:40:19.095039 ip-10-0-130-26 kubenswrapper[2576]: W0416 22:40:19.095005 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda4c4d11a_ee42_4522_abd8_177cc0005f27.slice/crio-fdd2ccc175afd3ca53af36b4c91544498b3770330eeb157521180344a9aed573 WatchSource:0}: Error finding container fdd2ccc175afd3ca53af36b4c91544498b3770330eeb157521180344a9aed573: Status 404 returned error can't find the container with id fdd2ccc175afd3ca53af36b4c91544498b3770330eeb157521180344a9aed573
Apr 16 22:40:19.134124 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:19.134102 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-586fc7995b-gf69z_a6165f3d-da4e-403b-a5c6-c1506f6a34c8/console/0.log"
Apr 16 22:40:19.214248 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:19.214209 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25" event={"ID":"a4c4d11a-ee42-4522-abd8-177cc0005f27","Type":"ContainerStarted","Data":"fdd2ccc175afd3ca53af36b4c91544498b3770330eeb157521180344a9aed573"}
Apr 16 22:40:19.666259 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:19.666226 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-5jz8x_07b94c63-bfb5-4f8c-8a11-7985e312b413/volume-data-source-validator/0.log"
Apr 16 22:40:20.219792 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:20.219756 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25" event={"ID":"a4c4d11a-ee42-4522-abd8-177cc0005f27","Type":"ContainerStarted","Data":"56644ac58f7df14660bba5b9e0e3f826e13e6d8fd0cb4e1b10bc6391da2ed866"}
Apr 16 22:40:20.219992 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:20.219846 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"
Apr 16 22:40:20.237468 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:20.237414 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25" podStartSLOduration=2.237397401 podStartE2EDuration="2.237397401s" podCreationTimestamp="2026-04-16 22:40:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 22:40:20.234371256 +0000 UTC m=+2124.495097362" watchObservedRunningTime="2026-04-16 22:40:20.237397401 +0000 UTC m=+2124.498123535"
Apr 16 22:40:20.511133 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:20.511060 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xdjs7_049ab1ab-5b48-4b74-9527-0075f2bb7467/dns/0.log"
Apr 16 22:40:20.528369 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:20.528338 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-xdjs7_049ab1ab-5b48-4b74-9527-0075f2bb7467/kube-rbac-proxy/0.log"
Apr 16 22:40:20.572604 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:20.572580 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-7pgrh_a333a392-a24f-4b6c-85d5-2cc457992bf5/dns-node-resolver/0.log"
Apr 16 22:40:21.057557 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:21.057520 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7958cfb6f-htkcm_e7e1d96e-bc71-418f-bb1f-ed1d7b4b4954/registry/0.log"
Apr 16 22:40:21.078960 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:21.078936 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4nfgz_4591d12d-774b-43ce-a862-67018bf47f0c/node-ca/0.log"
Apr 16 22:40:21.927472 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:21.927438 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf8mshw_37b7d56d-46e5-4e3f-8212-92a0cfbaa0fd/istio-proxy/0.log"
Apr 16 22:40:22.217968 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:22.217895 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-gb8pw_66b10bf8-2a76-488d-9eaa-9f7369c48145/istio-proxy/0.log"
Apr 16 22:40:22.237430 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:22.237407 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-567b8b8c4d-864ph_51846908-299c-4e1f-b417-c0af1029a45f/router/0.log"
Apr 16 22:40:22.741835 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:22.741798 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6dv5d_6d61274e-1ceb-496e-8a75-17916b110ed5/serve-healthcheck-canary/0.log"
Apr 16 22:40:23.210483 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:23.210449 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-r49p7_9b8d4070-576e-4533-a43c-28e13d203ec1/insights-operator/0.log"
Apr 16 22:40:23.213915 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:23.213885 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-r49p7_9b8d4070-576e-4533-a43c-28e13d203ec1/insights-operator/1.log"
Apr 16 22:40:23.298618 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:23.298589 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-87bvh_ab00035f-c880-4d27-99da-a870c1adb974/kube-rbac-proxy/0.log"
Apr 16 22:40:23.316716 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:23.316694 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-87bvh_ab00035f-c880-4d27-99da-a870c1adb974/exporter/0.log"
Apr 16 22:40:23.336132 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:23.336103 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-87bvh_ab00035f-c880-4d27-99da-a870c1adb974/extractor/0.log"
Apr 16 22:40:25.277166 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:25.277118 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-5b8d76df79-8hw22_e833b21b-3807-49ab-9889-2cf5e77ad763/maas-api/0.log"
Apr 16 22:40:25.360805 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:25.360771 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-77c748cfd6-chdx8_d34ab5d4-de5c-423a-b625-de87870833eb/manager/0.log"
Apr 16 22:40:25.450583 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:25.450553 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-674f8cc5cf-vprm4_eb4d92c0-6bd2-45be-9dd6-28bb8b0b5a99/manager/0.log"
Apr 16 22:40:25.512458 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:25.512426 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-gwkmp_220c566b-3d68-44b0-9d48-2b4ef15781a1/postgres/0.log"
Apr 16 22:40:26.242169 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:26.242141 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wq7jn/perf-node-gather-daemonset-xjl25"
Apr 16 22:40:26.634766 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:26.634729 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-fd7d9b88b-cdwxh_81ce82ca-5118-4f08-bf3c-381fe90dadb1/manager/0.log"
Apr 16 22:40:31.538142 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:31.538110 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-lxr9z_69e30ac7-c0e6-4cd4-ab1f-3df7d3277790/kube-storage-version-migrator-operator/1.log"
Apr 16 22:40:31.539763 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:31.539736 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-lxr9z_69e30ac7-c0e6-4cd4-ab1f-3df7d3277790/kube-storage-version-migrator-operator/0.log"
Apr 16 22:40:32.826459 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:32.826389 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mz6rs_87bd69c1-f44b-42b9-ba15-a3ed7fdad078/kube-multus-additional-cni-plugins/0.log"
Apr 16 22:40:32.852421 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:32.852390 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mz6rs_87bd69c1-f44b-42b9-ba15-a3ed7fdad078/egress-router-binary-copy/0.log"
Apr 16 22:40:32.871289 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:32.871258 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mz6rs_87bd69c1-f44b-42b9-ba15-a3ed7fdad078/cni-plugins/0.log"
Apr 16 22:40:32.891180 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:32.891156 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mz6rs_87bd69c1-f44b-42b9-ba15-a3ed7fdad078/bond-cni-plugin/0.log"
Apr 16 22:40:32.909494 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:32.909474
2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mz6rs_87bd69c1-f44b-42b9-ba15-a3ed7fdad078/routeoverride-cni/0.log" Apr 16 22:40:32.931923 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:32.931896 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mz6rs_87bd69c1-f44b-42b9-ba15-a3ed7fdad078/whereabouts-cni-bincopy/0.log" Apr 16 22:40:32.950407 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:32.950384 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-mz6rs_87bd69c1-f44b-42b9-ba15-a3ed7fdad078/whereabouts-cni/0.log" Apr 16 22:40:32.987831 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:32.987809 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ltxhs_0dc08e01-6796-4c69-9ed5-214b13ad71cd/kube-multus/0.log" Apr 16 22:40:33.099718 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:33.099692 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nrljs_2f7c8d95-90cb-497a-8866-d2c45b825b72/network-metrics-daemon/0.log" Apr 16 22:40:33.114272 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:33.114248 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nrljs_2f7c8d95-90cb-497a-8866-d2c45b825b72/kube-rbac-proxy/0.log" Apr 16 22:40:34.622398 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:34.622356 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fw6md_5c889066-e69a-44d5-b456-b37d09282234/ovn-controller/0.log" Apr 16 22:40:34.658998 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:34.658970 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fw6md_5c889066-e69a-44d5-b456-b37d09282234/ovn-acl-logging/0.log" Apr 16 22:40:34.679575 ip-10-0-130-26 kubenswrapper[2576]: 
I0416 22:40:34.679541 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fw6md_5c889066-e69a-44d5-b456-b37d09282234/kube-rbac-proxy-node/0.log" Apr 16 22:40:34.699449 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:34.699425 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fw6md_5c889066-e69a-44d5-b456-b37d09282234/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 22:40:34.713708 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:34.713687 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fw6md_5c889066-e69a-44d5-b456-b37d09282234/northd/0.log" Apr 16 22:40:34.730749 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:34.730731 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fw6md_5c889066-e69a-44d5-b456-b37d09282234/nbdb/0.log" Apr 16 22:40:34.748296 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:34.748275 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fw6md_5c889066-e69a-44d5-b456-b37d09282234/sbdb/0.log" Apr 16 22:40:34.926188 ip-10-0-130-26 kubenswrapper[2576]: I0416 22:40:34.926091 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fw6md_5c889066-e69a-44d5-b456-b37d09282234/ovnkube-controller/0.log"