Apr 20 14:52:43.882214 ip-10-0-130-249 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 14:52:43.882224 ip-10-0-130-249 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 14:52:43.882231 ip-10-0-130-249 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 14:52:43.882445 ip-10-0-130-249 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 14:52:53.990336 ip-10-0-130-249 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 14:52:53.990358 ip-10-0-130-249 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot c3f25fb6fedd466a904f3903dc74fd23 --
Apr 20 14:55:20.814388 ip-10-0-130-249 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 14:55:21.221679 ip-10-0-130-249 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:55:21.221679 ip-10-0-130-249 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 14:55:21.221679 ip-10-0-130-249 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:55:21.221679 ip-10-0-130-249 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 14:55:21.221679 ip-10-0-130-249 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:55:21.223188 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.223099    2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 14:55:21.225948 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.225933    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:55:21.225988 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.225949    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:55:21.225988 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.225953    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:55:21.225988 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.225956    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:55:21.225988 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.225960    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:55:21.225988 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.225963    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:55:21.225988 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.225965    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:55:21.225988 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.225968    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:55:21.225988 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.225971    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:55:21.225988 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.225973    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:55:21.225988 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.225976    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:55:21.225988 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.225978    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:55:21.225988 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.225981    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:55:21.225988 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.225984    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:55:21.225988 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.225986    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:55:21.225988 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.225989    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:55:21.225988 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.225992    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:55:21.225988 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.225994    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:55:21.225988 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.225997    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226007    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226010    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226013    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226015    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226018    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226021    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226023    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226026    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226029    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226031    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226034    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226037    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226040    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226042    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226045    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226047    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226050    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226053    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226055    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:55:21.226429 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226058    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226060    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226062    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226065    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226068    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226071    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226073    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226076    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226079    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226081    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226084    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226086    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226089    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226091    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226095    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226098    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226100    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226103    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226106    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226108    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:55:21.226912 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226111    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:55:21.227441 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226114    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:55:21.227441 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226117    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:55:21.227441 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226119    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:55:21.227441 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226124    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:55:21.227441 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226128    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:55:21.227441 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226130    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:55:21.227441 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226133    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:55:21.227441 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226136    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:55:21.227441 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226138    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:55:21.227441 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226141    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:55:21.227441 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226143    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:55:21.227441 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226146    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:55:21.227441 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226148    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:55:21.227441 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226150    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:55:21.227441 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226153    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:55:21.227441 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226156    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:55:21.227441 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226158    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:55:21.227441 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226161    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:55:21.227441 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226163    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:55:21.227901 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226166    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:55:21.227901 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226168    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:55:21.227901 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226172    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:55:21.227901 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226177    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:55:21.227901 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226181    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:55:21.227901 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226185    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:55:21.227901 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226190    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:55:21.227901 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226193    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:55:21.227901 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226580    2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:55:21.227901 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226586    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:55:21.227901 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226588    2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:55:21.227901 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226591    2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:55:21.227901 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226594    2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:55:21.227901 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226597    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:55:21.227901 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226599    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:55:21.227901 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226602    2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:55:21.227901 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226605    2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:55:21.227901 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226607    2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:55:21.227901 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226610    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:55:21.228349 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226612    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:55:21.228349 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226615    2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:55:21.228349 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226617    2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:55:21.228349 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226620    2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:55:21.228349 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226622    2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:55:21.228349 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226625    2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:55:21.228349 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226627    2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:55:21.228349 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226630    2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:55:21.228349 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226633    2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:55:21.228349 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226635    2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:55:21.228349 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226639    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:55:21.228349 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226642    2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:55:21.228349 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226645    2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:55:21.228349 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226647    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:55:21.228349 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226649    2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:55:21.228349 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226652    2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:55:21.228349 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226655    2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:55:21.228349 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226657    2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:55:21.228349 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226660    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:55:21.228831 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226664    2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:55:21.228831 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226668    2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:55:21.228831 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226670    2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:55:21.228831 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226673    2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:55:21.228831 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226676    2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:55:21.228831 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226678    2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:55:21.228831 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226681    2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:55:21.228831 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226683    2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:55:21.228831 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226686    2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:55:21.228831 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226689    2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:55:21.228831 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226691    2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:55:21.228831 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226693    2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:55:21.228831 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226696    2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:55:21.228831 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226698    2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:55:21.228831 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226701    2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:55:21.228831 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226703    2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:55:21.228831 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226705    2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:55:21.228831 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226708    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:55:21.228831 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226710    2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226713    2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226715    2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226717    2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226720    2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226723    2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226726    2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226729    2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226731    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226734    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226737    2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226739    2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226741    2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226744    2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226747    2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226749    2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226752    2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226755    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226757    2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226760    2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:55:21.229287 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226762    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226765    2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226770    2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226773    2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226776    2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226779    2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226782    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226785    2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226787    2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226790    2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226793    2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226796    2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226799    2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226802    2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226805    2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226808    2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226810    2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.226813    2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226882    2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226889    2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 14:55:21.229771 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226895    2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226899    2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226904    2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226907    2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226911    2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226916    2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226923    2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226927    2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226930    2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226934    2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226937    2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226940    2574 flags.go:64] FLAG: --cgroup-root=""
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226943    2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226945    2574 flags.go:64] FLAG: --client-ca-file=""
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226948    2574 flags.go:64] FLAG: --cloud-config=""
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226951    2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226954    2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226959    2574 flags.go:64] FLAG: --cluster-domain=""
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226962    2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226965    2574 flags.go:64] FLAG: --config-dir=""
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226968    2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226971    2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226975    2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226978    2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 14:55:21.230257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226981    2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226985    2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226988    2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226991    2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226994    2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.226997    2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227000    2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227004    2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227008    2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227011    2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227014    2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227017    2574 flags.go:64] FLAG: --enable-server="true"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227020    2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227024    2574 flags.go:64] FLAG: --event-burst="100"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227028    2574 flags.go:64] FLAG: --event-qps="50"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227030    2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227033    2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227036    2574 flags.go:64] FLAG: --eviction-hard=""
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227040    2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227043    2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227046    2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227049    2574 flags.go:64] FLAG: --eviction-soft=""
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227052    2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227055    2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227057    2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 14:55:21.230862 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227060    2574 flags.go:64] FLAG:
--experimental-mounter-path="" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227063 2574 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227066 2574 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227069 2574 flags.go:64] FLAG: --feature-gates="" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227073 2574 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227076 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227079 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227082 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227085 2574 flags.go:64] FLAG: --healthz-port="10248" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227088 2574 flags.go:64] FLAG: --help="false" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227091 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-130-249.ec2.internal" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227094 2574 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227097 2574 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227099 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227102 2574 flags.go:64] FLAG: 
--image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227106 2574 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227109 2574 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227111 2574 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227114 2574 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227117 2574 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227120 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227123 2574 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227126 2574 flags.go:64] FLAG: --kube-reserved="" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227129 2574 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 14:55:21.231498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227132 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227135 2574 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227140 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227142 2574 flags.go:64] FLAG: --lock-file="" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227145 2574 flags.go:64] FLAG: 
--log-cadvisor-usage="false" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227148 2574 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227151 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227156 2574 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227159 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227162 2574 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227165 2574 flags.go:64] FLAG: --logging-format="text" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227168 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227171 2574 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227174 2574 flags.go:64] FLAG: --manifest-url="" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227177 2574 flags.go:64] FLAG: --manifest-url-header="" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227181 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227184 2574 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227188 2574 flags.go:64] FLAG: --max-pods="110" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227192 2574 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: 
I0420 14:55:21.227195 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227197 2574 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227200 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227203 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227207 2574 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227210 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 14:55:21.232123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227218 2574 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227221 2574 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227224 2574 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227227 2574 flags.go:64] FLAG: --pod-cidr="" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227230 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227236 2574 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227239 2574 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227242 2574 flags.go:64] FLAG: --pods-per-core="0" Apr 20 
14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227246 2574 flags.go:64] FLAG: --port="10250" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227250 2574 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227253 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e6e2200412cd19cf" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227256 2574 flags.go:64] FLAG: --qos-reserved="" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227259 2574 flags.go:64] FLAG: --read-only-port="10255" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227262 2574 flags.go:64] FLAG: --register-node="true" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227265 2574 flags.go:64] FLAG: --register-schedulable="true" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227268 2574 flags.go:64] FLAG: --register-with-taints="" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227276 2574 flags.go:64] FLAG: --registry-burst="10" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227278 2574 flags.go:64] FLAG: --registry-qps="5" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227281 2574 flags.go:64] FLAG: --reserved-cpus="" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227284 2574 flags.go:64] FLAG: --reserved-memory="" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227288 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227291 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227294 2574 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 
14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227297 2574 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227300 2574 flags.go:64] FLAG: --runonce="false" Apr 20 14:55:21.232729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227303 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227306 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227309 2574 flags.go:64] FLAG: --seccomp-default="false" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227312 2574 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227315 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227318 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227321 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227324 2574 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227328 2574 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227331 2574 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227334 2574 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227336 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 14:55:21.233332 ip-10-0-130-249 
kubenswrapper[2574]: I0420 14:55:21.227340 2574 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227343 2574 flags.go:64] FLAG: --system-cgroups="" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227346 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227352 2574 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227359 2574 flags.go:64] FLAG: --tls-cert-file="" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227363 2574 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227396 2574 flags.go:64] FLAG: --tls-min-version="" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227400 2574 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227403 2574 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227406 2574 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227409 2574 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227413 2574 flags.go:64] FLAG: --v="2" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227417 2574 flags.go:64] FLAG: --version="false" Apr 20 14:55:21.233332 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227421 2574 flags.go:64] FLAG: --vmodule="" Apr 20 14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227426 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 
14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.227429 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227515 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227520 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227523 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227527 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227530 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227533 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227535 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227538 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227541 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227543 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227546 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227548 2574 feature_gate.go:328] 
unrecognized feature gate: GCPClusterHostedDNS Apr 20 14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227551 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227554 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227556 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227559 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227563 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 14:55:21.233971 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227566 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 14:55:21.234473 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227569 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 14:55:21.234473 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227571 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 14:55:21.234473 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227575 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 14:55:21.234473 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227578 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 14:55:21.234473 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227580 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 14:55:21.234473 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227583 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 
14:55:21.234473 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227585 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 14:55:21.234473 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227588 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 14:55:21.234473 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227590 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 14:55:21.234473 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227593 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 14:55:21.234473 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227596 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 14:55:21.234473 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227598 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 14:55:21.234473 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227601 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 14:55:21.234473 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227616 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 14:55:21.234473 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227621 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 14:55:21.234473 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227625 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 14:55:21.234473 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227628 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 14:55:21.234473 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227630 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 14:55:21.234473 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227633 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227636 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227638 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227641 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227643 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227646 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227648 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227651 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227653 2574 feature_gate.go:328] unrecognized feature gate: 
ExternalSnapshotMetadata Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227656 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227659 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227662 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227666 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227669 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227671 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227674 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227678 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227681 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227684 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227686 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 14:55:21.234950 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227689 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 14:55:21.235452 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227692 2574 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 14:55:21.235452 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227695 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 14:55:21.235452 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227697 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 14:55:21.235452 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227700 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 14:55:21.235452 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227702 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 14:55:21.235452 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227705 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 14:55:21.235452 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227707 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 14:55:21.235452 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227710 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 14:55:21.235452 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227712 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 14:55:21.235452 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227716 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 14:55:21.235452 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227720 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:55:21.235452 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227722 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:55:21.235452 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227725 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:55:21.235452 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227728 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:55:21.235452 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227731 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:55:21.235452 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227733 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:55:21.235452 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227736 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:55:21.235452 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227739 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:55:21.235452 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227741 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:55:21.235956 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227744 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:55:21.235956 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227746 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:55:21.235956 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227758 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:55:21.235956 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227762 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:55:21.235956 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227766 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:55:21.235956 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227768 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:55:21.235956 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227771 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:55:21.235956 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227774 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:55:21.235956 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227778 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:55:21.235956 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.227780 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:55:21.235956 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.228388 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 14:55:21.235956 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.234991 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 14:55:21.235956 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.235091 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 14:55:21.235956 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235136 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:55:21.235956 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235141 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:55:21.235956 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235144 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:55:21.236354 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235147 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:55:21.236354 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235150 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:55:21.236354 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235153 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:55:21.236354 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235156 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:55:21.236354 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235159 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:55:21.236354 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235161 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:55:21.236354 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235164 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:55:21.236354 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235167 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:55:21.236354 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235169 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:55:21.236354 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235172 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:55:21.236354 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235174 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:55:21.236354 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235177 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:55:21.236354 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235180 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:55:21.236354 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235183 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:55:21.236354 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235185 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:55:21.236354 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235187 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:55:21.236354 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235191 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:55:21.236354 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235193 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:55:21.236354 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235196 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235198 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235203 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235208 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235211 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235214 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235217 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235220 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235223 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235225 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235228 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235231 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235234 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235236 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235239 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235242 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235244 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235247 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235249 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235251 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:55:21.236837 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235254 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235258 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235261 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235264 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235267 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235269 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235271 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235274 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235277 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235279 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235282 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235285 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235288 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235290 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235293 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235296 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235298 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235301 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235304 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235306 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:55:21.237324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235309 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235311 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235314 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235317 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235319 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235321 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235324 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235326 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235329 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235331 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235334 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235336 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235339 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235341 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235344 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235346 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235348 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235351 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235353 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235356 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:55:21.237827 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235358 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:55:21.238305 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235361 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:55:21.238305 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235363 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:55:21.238305 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235379 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:55:21.238305 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235382 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:55:21.238305 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.235387 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 14:55:21.238305 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235480 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:55:21.238305 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235485 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:55:21.238305 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235489 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:55:21.238305 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235492 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:55:21.238305 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235495 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:55:21.238305 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235498 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:55:21.238305 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235501 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:55:21.238305 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235505 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:55:21.238305 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235509 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:55:21.238305 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235511 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:55:21.238687 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235515 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:55:21.238687 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235518 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:55:21.238687 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235520 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:55:21.238687 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235523 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:55:21.238687 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235525 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:55:21.238687 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235528 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:55:21.238687 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235531 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:55:21.238687 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235533 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:55:21.238687 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235536 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:55:21.238687 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235538 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:55:21.238687 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235541 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:55:21.238687 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235543 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:55:21.238687 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235546 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:55:21.238687 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235548 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:55:21.238687 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235550 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:55:21.238687 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235553 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:55:21.238687 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235555 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:55:21.238687 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235558 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:55:21.238687 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235561 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235563 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235566 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235568 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235571 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235573 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235576 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235579 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235581 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235584 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235586 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235589 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235591 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235594 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235596 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235599 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235602 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235604 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235607 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235609 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:55:21.239147 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235612 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235614 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235617 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235619 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235622 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235625 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235627 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235630 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235632 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235635 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235637 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235639 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235642 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235645 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235647 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235650 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235652 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235655 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235658 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235660 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:55:21.239656 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235663 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:55:21.240141 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235665 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:55:21.240141 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235668 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:55:21.240141 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235672 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:55:21.240141 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235674 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:55:21.240141 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235677 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:55:21.240141 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235679 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:55:21.240141 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235682 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:55:21.240141 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235685 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:55:21.240141 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235687 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:55:21.240141 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235690 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:55:21.240141 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235692 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:55:21.240141 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235695 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:55:21.240141 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235697 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:55:21.240141 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235700 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:55:21.240141 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235702 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:55:21.240141 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235705 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:55:21.240141 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:21.235707 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:55:21.240772 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.235712 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 14:55:21.240772 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.236361 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 14:55:21.240772 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.238168 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 14:55:21.240772 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.239031 2574 server.go:1019] "Starting client certificate rotation"
Apr 20 14:55:21.240772 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.239118 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 14:55:21.240772 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.239153 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 14:55:21.262927 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.262910 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 14:55:21.265188 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.265168 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 14:55:21.275383 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.275347 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 20 14:55:21.280509 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.280494 2574 log.go:25] "Validated CRI v1 image API"
Apr 20 14:55:21.281753 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.281731 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 14:55:21.288005 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.287983 2574 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 a7aa4e42-9b8f-4d8e-ae94-4b68c82c9e0b:/dev/nvme0n1p4 db3dc04f-af33-41e6-be26-cff9bf07e4c2:/dev/nvme0n1p3]
Apr 20 14:55:21.288071 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.288003 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 14:55:21.292680 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.292656 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 14:55:21.293470 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.293341 2574 manager.go:217] Machine: {Timestamp:2026-04-20 14:55:21.291536297 +0000 UTC m=+0.364811933 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3087432 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28c54cd21be8ee3fef04ce889dad61 SystemUUID:ec28c54c-d21b-e8ee-3fef-04ce889dad61 BootID:c3f25fb6-fedd-466a-904f-3903dc74fd23 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:fb:df:fa:f3:d1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:fb:df:fa:f3:d1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ca:fa:9e:42:fd:2f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 14:55:21.293470 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.293466 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 14:55:21.293572 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.293542 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 14:55:21.294525 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.294499 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 14:55:21.294655 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.294527 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-249.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 14:55:21.294697 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.294664 2574 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 14:55:21.294697 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.294673 2574 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 14:55:21.294697 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.294686
2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 14:55:21.295260 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.295250 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 14:55:21.296014 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.296004 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 20 14:55:21.296113 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.296104 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 14:55:21.298346 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.298337 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 20 14:55:21.298390 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.298355 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 20 14:55:21.298390 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.298378 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 14:55:21.298390 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.298388 2574 kubelet.go:397] "Adding apiserver pod source" Apr 20 14:55:21.298479 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.298400 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 14:55:21.299407 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.299393 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 14:55:21.299446 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.299421 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 14:55:21.302130 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.302115 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 14:55:21.303361 ip-10-0-130-249 
kubenswrapper[2574]: I0420 14:55:21.303349 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 14:55:21.305117 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.305103 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 14:55:21.305190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.305121 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 14:55:21.305190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.305127 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 14:55:21.305190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.305133 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 14:55:21.305190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.305139 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 14:55:21.305190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.305144 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 14:55:21.305190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.305150 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 14:55:21.305190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.305156 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 14:55:21.305190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.305170 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 14:55:21.305190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.305177 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 14:55:21.305190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.305189 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 
14:55:21.305465 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.305199 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 14:55:21.306605 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.306583 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 14:55:21.306605 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.306600 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 14:55:21.308224 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.308208 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-54dbz" Apr 20 14:55:21.309232 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.309213 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-249.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 14:55:21.309300 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:21.309273 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 14:55:21.309776 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:21.309760 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-249.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 14:55:21.310251 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.310239 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 14:55:21.310287 
ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.310272 2574 server.go:1295] "Started kubelet" Apr 20 14:55:21.310402 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.310353 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 14:55:21.310497 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.310392 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 14:55:21.310497 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.310462 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 14:55:21.311761 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.311724 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 14:55:21.312016 ip-10-0-130-249 systemd[1]: Started Kubernetes Kubelet. Apr 20 14:55:21.313636 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.313588 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 20 14:55:21.315571 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.315554 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-54dbz" Apr 20 14:55:21.321410 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:21.321389 2574 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 14:55:21.321775 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.321758 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 14:55:21.321775 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.321769 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 14:55:21.322337 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:21.322308 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-249.ec2.internal\" not found" Apr 20 14:55:21.322413 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.322348 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 14:55:21.322413 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.322358 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 14:55:21.322413 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.322380 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 14:55:21.322556 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.322457 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 20 14:55:21.322556 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.322468 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 20 14:55:21.322650 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.322572 2574 factory.go:55] Registering systemd factory Apr 20 14:55:21.322650 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.322589 2574 factory.go:223] Registration of the systemd container factory successfully Apr 20 14:55:21.322782 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.322757 2574 factory.go:153] Registering CRI-O factory Apr 20 14:55:21.322782 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.322772 2574 factory.go:223] Registration of the crio container factory successfully Apr 20 
14:55:21.322916 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.322822 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 14:55:21.322916 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.322848 2574 factory.go:103] Registering Raw factory Apr 20 14:55:21.322916 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.322863 2574 manager.go:1196] Started watching for new ooms in manager Apr 20 14:55:21.323265 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.323248 2574 manager.go:319] Starting recovery of all containers Apr 20 14:55:21.326480 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.326461 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:55:21.329197 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:21.329171 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-249.ec2.internal\" not found" node="ip-10-0-130-249.ec2.internal" Apr 20 14:55:21.331481 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.331466 2574 manager.go:324] Recovery completed Apr 20 14:55:21.335669 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.335655 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 14:55:21.338174 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.338160 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-249.ec2.internal" event="NodeHasSufficientMemory" Apr 20 14:55:21.338237 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.338184 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-249.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 14:55:21.338237 ip-10-0-130-249 kubenswrapper[2574]: I0420 
14:55:21.338196 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-249.ec2.internal" event="NodeHasSufficientPID" Apr 20 14:55:21.338653 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.338638 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 14:55:21.338653 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.338651 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 14:55:21.338731 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.338667 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 20 14:55:21.341079 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.341062 2574 policy_none.go:49] "None policy: Start" Apr 20 14:55:21.341079 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.341076 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 14:55:21.341220 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.341085 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 20 14:55:21.379937 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.379917 2574 manager.go:341] "Starting Device Plugin manager" Apr 20 14:55:21.399516 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:21.379954 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 14:55:21.399516 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.379967 2574 server.go:85] "Starting device plugin registration server" Apr 20 14:55:21.399516 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.380162 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 14:55:21.399516 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.380170 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 14:55:21.399516 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.380275 2574 plugin_watcher.go:51] "Plugin Watcher Start" 
path="/var/lib/kubelet/plugins_registry" Apr 20 14:55:21.399516 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.380360 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 14:55:21.399516 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.380387 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 14:55:21.399516 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:21.380991 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 14:55:21.399516 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:21.381020 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-249.ec2.internal\" not found" Apr 20 14:55:21.440797 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.440749 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 14:55:21.441929 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.441912 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 14:55:21.442003 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.441944 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 14:55:21.442003 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.441965 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 20 14:55:21.442003 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.441974 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 14:55:21.442141 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:21.442011 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 14:55:21.444787 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.444766 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:55:21.480542 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.480504 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 14:55:21.481562 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.481547 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-249.ec2.internal" event="NodeHasSufficientMemory" Apr 20 14:55:21.481641 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.481574 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-249.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 14:55:21.481641 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.481585 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-249.ec2.internal" event="NodeHasSufficientPID" Apr 20 14:55:21.481641 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.481606 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-249.ec2.internal" Apr 20 14:55:21.490104 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.490082 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-249.ec2.internal" Apr 20 14:55:21.490104 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:21.490103 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-249.ec2.internal\": node \"ip-10-0-130-249.ec2.internal\" not found" Apr 20 
14:55:21.506033 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:21.506009 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-249.ec2.internal\" not found" Apr 20 14:55:21.542381 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.542351 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-249.ec2.internal"] Apr 20 14:55:21.542462 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.542432 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 14:55:21.543728 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.543715 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-249.ec2.internal" event="NodeHasSufficientMemory" Apr 20 14:55:21.543811 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.543743 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-249.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 14:55:21.543811 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.543766 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-249.ec2.internal" event="NodeHasSufficientPID" Apr 20 14:55:21.545973 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.545959 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 14:55:21.546095 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.546081 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal" Apr 20 14:55:21.546138 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.546107 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 14:55:21.546626 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.546605 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-249.ec2.internal" event="NodeHasSufficientMemory" Apr 20 14:55:21.546626 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.546621 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-249.ec2.internal" event="NodeHasSufficientMemory" Apr 20 14:55:21.546746 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.546632 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-249.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 14:55:21.546746 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.546641 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-249.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 14:55:21.546746 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.546647 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-249.ec2.internal" event="NodeHasSufficientPID" Apr 20 14:55:21.546746 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.546650 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-249.ec2.internal" event="NodeHasSufficientPID" Apr 20 14:55:21.548732 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.548720 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-249.ec2.internal" Apr 20 14:55:21.548804 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.548743 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 14:55:21.549353 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.549335 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-249.ec2.internal" event="NodeHasSufficientMemory" Apr 20 14:55:21.549454 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.549363 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-249.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 14:55:21.549454 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.549387 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-249.ec2.internal" event="NodeHasSufficientPID" Apr 20 14:55:21.567877 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:21.567857 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-249.ec2.internal\" not found" node="ip-10-0-130-249.ec2.internal" Apr 20 14:55:21.572194 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:21.572178 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-249.ec2.internal\" not found" node="ip-10-0-130-249.ec2.internal" Apr 20 14:55:21.606480 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:21.606453 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-249.ec2.internal\" not found" Apr 20 14:55:21.623535 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.623512 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/75684a0b9a7080a9984ae7578d5b190b-config\") pod 
\"kube-apiserver-proxy-ip-10-0-130-249.ec2.internal\" (UID: \"75684a0b9a7080a9984ae7578d5b190b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-249.ec2.internal" Apr 20 14:55:21.706722 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:21.706705 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-249.ec2.internal\" not found" Apr 20 14:55:21.723988 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.723961 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a68c7dbaa1e3f6bab6aec0a6e079f58c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal\" (UID: \"a68c7dbaa1e3f6bab6aec0a6e079f58c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal" Apr 20 14:55:21.724050 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.724019 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/75684a0b9a7080a9984ae7578d5b190b-config\") pod \"kube-apiserver-proxy-ip-10-0-130-249.ec2.internal\" (UID: \"75684a0b9a7080a9984ae7578d5b190b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-249.ec2.internal" Apr 20 14:55:21.724050 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.724045 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a68c7dbaa1e3f6bab6aec0a6e079f58c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal\" (UID: \"a68c7dbaa1e3f6bab6aec0a6e079f58c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal" Apr 20 14:55:21.724120 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.724085 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/75684a0b9a7080a9984ae7578d5b190b-config\") pod \"kube-apiserver-proxy-ip-10-0-130-249.ec2.internal\" (UID: \"75684a0b9a7080a9984ae7578d5b190b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-249.ec2.internal" Apr 20 14:55:21.807402 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:21.807352 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-249.ec2.internal\" not found" Apr 20 14:55:21.824667 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.824643 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a68c7dbaa1e3f6bab6aec0a6e079f58c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal\" (UID: \"a68c7dbaa1e3f6bab6aec0a6e079f58c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal" Apr 20 14:55:21.824667 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.824672 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a68c7dbaa1e3f6bab6aec0a6e079f58c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal\" (UID: \"a68c7dbaa1e3f6bab6aec0a6e079f58c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal" Apr 20 14:55:21.824801 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.824704 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a68c7dbaa1e3f6bab6aec0a6e079f58c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal\" (UID: \"a68c7dbaa1e3f6bab6aec0a6e079f58c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal" Apr 20 14:55:21.824801 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.824765 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a68c7dbaa1e3f6bab6aec0a6e079f58c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal\" (UID: \"a68c7dbaa1e3f6bab6aec0a6e079f58c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal" Apr 20 14:55:21.869816 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.869795 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal" Apr 20 14:55:21.874334 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:21.874319 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-249.ec2.internal" Apr 20 14:55:21.907965 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:21.907936 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-249.ec2.internal\" not found" Apr 20 14:55:22.008474 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:22.008447 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-249.ec2.internal\" not found" Apr 20 14:55:22.108959 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:22.108897 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-249.ec2.internal\" not found" Apr 20 14:55:22.209523 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:22.209497 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-249.ec2.internal\" not found" Apr 20 14:55:22.239939 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:22.239919 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 14:55:22.240442 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:22.240049 2574 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 14:55:22.240442 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:22.240084 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 14:55:22.309749 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:22.309719 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-249.ec2.internal\" not found" Apr 20 14:55:22.318473 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:22.318445 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 14:50:21 +0000 UTC" deadline="2027-11-03 22:19:57.97912168 +0000 UTC" Apr 20 14:55:22.318473 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:22.318470 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13495h24m35.660655648s" Apr 20 14:55:22.321934 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:22.321914 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 14:55:22.335003 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:22.334978 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 14:55:22.353353 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:22.353332 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" 
csr="csr-g7fc9" Apr 20 14:55:22.361743 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:22.361699 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-g7fc9" Apr 20 14:55:22.385710 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:22.385684 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda68c7dbaa1e3f6bab6aec0a6e079f58c.slice/crio-26daf0102da01fa3f1438c616e507777fcf7069be9dad9aef5b99bee186fbef3 WatchSource:0}: Error finding container 26daf0102da01fa3f1438c616e507777fcf7069be9dad9aef5b99bee186fbef3: Status 404 returned error can't find the container with id 26daf0102da01fa3f1438c616e507777fcf7069be9dad9aef5b99bee186fbef3 Apr 20 14:55:22.386320 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:22.386305 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75684a0b9a7080a9984ae7578d5b190b.slice/crio-bac5e7fc10e313b2d5c775f61b4ad9b6cc7ec9722069b98f7acde8c6a303f0a2 WatchSource:0}: Error finding container bac5e7fc10e313b2d5c775f61b4ad9b6cc7ec9722069b98f7acde8c6a303f0a2: Status 404 returned error can't find the container with id bac5e7fc10e313b2d5c775f61b4ad9b6cc7ec9722069b98f7acde8c6a303f0a2 Apr 20 14:55:22.390705 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:22.390691 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 14:55:22.410420 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:22.410399 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-249.ec2.internal\" not found" Apr 20 14:55:22.444288 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:22.444247 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-249.ec2.internal" 
event={"ID":"75684a0b9a7080a9984ae7578d5b190b","Type":"ContainerStarted","Data":"bac5e7fc10e313b2d5c775f61b4ad9b6cc7ec9722069b98f7acde8c6a303f0a2"} Apr 20 14:55:22.445211 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:22.445191 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal" event={"ID":"a68c7dbaa1e3f6bab6aec0a6e079f58c","Type":"ContainerStarted","Data":"26daf0102da01fa3f1438c616e507777fcf7069be9dad9aef5b99bee186fbef3"} Apr 20 14:55:22.511376 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:22.511353 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-249.ec2.internal\" not found" Apr 20 14:55:22.596451 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:22.596274 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:55:22.622144 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:22.622099 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal" Apr 20 14:55:22.633263 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:22.633239 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 14:55:22.634873 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:22.634861 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-249.ec2.internal" Apr 20 14:55:22.642902 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:22.642887 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 14:55:22.835247 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:22.835098 2574 reflector.go:430] 
"Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:55:23.080256 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.080169 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:55:23.300111 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.300085 2574 apiserver.go:52] "Watching apiserver" Apr 20 14:55:23.310793 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.310765 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 14:55:23.311183 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.311144 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-2xvqt","openshift-image-registry/node-ca-qxsfj","openshift-network-operator/iptables-alerter-xm99w","openshift-ovn-kubernetes/ovnkube-node-5vmgd","kube-system/kube-apiserver-proxy-ip-10-0-130-249.ec2.internal","openshift-dns/node-resolver-mj558","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal","openshift-multus/multus-additional-cni-plugins-blx8n","openshift-multus/multus-zlwvt","openshift-multus/network-metrics-daemon-h24vc","openshift-network-diagnostics/network-check-target-j2mjp","kube-system/konnectivity-agent-plvj4","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv"] Apr 20 14:55:23.313623 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.313601 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:23.313729 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:23.313684 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h24vc" podUID="90e2aae6-6b60-4b8e-a0ba-12474f425b1d" Apr 20 14:55:23.317779 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.317756 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qxsfj" Apr 20 14:55:23.317896 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.317872 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xm99w" Apr 20 14:55:23.320061 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.320039 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.320163 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.320106 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 14:55:23.320163 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.320122 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:55:23.320891 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.320875 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 14:55:23.321106 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.321086 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 14:55:23.321181 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.321115 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 14:55:23.321181 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.321086 2574 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4fsdr\"" Apr 20 14:55:23.321535 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.321514 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6zlnz\"" Apr 20 14:55:23.321627 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.321554 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 14:55:23.322214 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.322186 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mj558" Apr 20 14:55:23.322563 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.322349 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 14:55:23.322665 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.322639 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 14:55:23.322725 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.322670 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 14:55:23.322789 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.322770 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 14:55:23.322868 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.322853 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-whf9x\"" Apr 20 14:55:23.323029 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.323006 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 14:55:23.323219 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.323203 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 14:55:23.324358 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.324340 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 14:55:23.324511 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.324493 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mx55k\"" Apr 20 14:55:23.324718 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.324699 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 14:55:23.327214 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.327196 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-blx8n" Apr 20 14:55:23.327330 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.327314 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.329636 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.329609 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.329865 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.329841 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 14:55:23.330139 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.330086 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mgzjq\"" Apr 20 14:55:23.330321 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.330273 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 14:55:23.330475 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.330455 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 14:55:23.330580 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.330571 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 14:55:23.330636 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.330604 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 14:55:23.330687 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.330648 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-bwj5r\"" Apr 20 14:55:23.330687 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.330606 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 14:55:23.331951 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.331890 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 14:55:23.331951 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.331893 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-ljhlf\"" Apr 20 14:55:23.332284 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.332143 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:55:23.332284 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.332153 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:23.332284 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:23.332241 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j2mjp" podUID="0ca0f6c2-6280-464c-8916-90374e2c88b8" Apr 20 14:55:23.333629 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.333329 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-run-systemd\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.333629 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.333609 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/772f88da-629b-4161-9ed5-8a916387c9bd-hosts-file\") pod \"node-resolver-mj558\" (UID: \"772f88da-629b-4161-9ed5-8a916387c9bd\") " pod="openshift-dns/node-resolver-mj558" Apr 20 14:55:23.333768 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.333652 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79fbt\" (UniqueName: \"kubernetes.io/projected/772f88da-629b-4161-9ed5-8a916387c9bd-kube-api-access-79fbt\") pod \"node-resolver-mj558\" (UID: \"772f88da-629b-4161-9ed5-8a916387c9bd\") " pod="openshift-dns/node-resolver-mj558" Apr 20 14:55:23.333768 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.333688 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/61708c39-4987-438d-b51f-59e8cd1a1e59-cni-binary-copy\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n" Apr 20 14:55:23.333768 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.333719 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-shnmb\" (UniqueName: \"kubernetes.io/projected/61708c39-4987-438d-b51f-59e8cd1a1e59-kube-api-access-shnmb\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n" Apr 20 14:55:23.333768 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.333750 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs\") pod \"network-metrics-daemon-h24vc\" (UID: \"90e2aae6-6b60-4b8e-a0ba-12474f425b1d\") " pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:23.333923 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.333842 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10081761-39cd-4657-8ccf-94426cfd0833-host\") pod \"node-ca-qxsfj\" (UID: \"10081761-39cd-4657-8ccf-94426cfd0833\") " pod="openshift-image-registry/node-ca-qxsfj" Apr 20 14:55:23.333923 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.333893 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/10081761-39cd-4657-8ccf-94426cfd0833-serviceca\") pod \"node-ca-qxsfj\" (UID: \"10081761-39cd-4657-8ccf-94426cfd0833\") " pod="openshift-image-registry/node-ca-qxsfj" Apr 20 14:55:23.334013 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.333944 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/61708c39-4987-438d-b51f-59e8cd1a1e59-system-cni-dir\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n" Apr 20 14:55:23.334071 ip-10-0-130-249 
kubenswrapper[2574]: I0420 14:55:23.334046 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv6vg\" (UniqueName: \"kubernetes.io/projected/10081761-39cd-4657-8ccf-94426cfd0833-kube-api-access-dv6vg\") pod \"node-ca-qxsfj\" (UID: \"10081761-39cd-4657-8ccf-94426cfd0833\") " pod="openshift-image-registry/node-ca-qxsfj" Apr 20 14:55:23.334914 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.334119 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd5jg\" (UniqueName: \"kubernetes.io/projected/ea32ceac-045d-412e-95db-ec7a62502246-kube-api-access-gd5jg\") pod \"iptables-alerter-xm99w\" (UID: \"ea32ceac-045d-412e-95db-ec7a62502246\") " pod="openshift-network-operator/iptables-alerter-xm99w" Apr 20 14:55:23.334914 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.334215 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-node-log\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.334914 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.334272 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-cni-netd\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.334914 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.334308 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-systemd-units\") pod \"ovnkube-node-5vmgd\" (UID: 
\"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.334914 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.334336 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/61708c39-4987-438d-b51f-59e8cd1a1e59-cnibin\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n" Apr 20 14:55:23.334914 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.334384 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/61708c39-4987-438d-b51f-59e8cd1a1e59-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n" Apr 20 14:55:23.334914 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.334419 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-var-lib-openvswitch\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.334914 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.334467 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-etc-openvswitch\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.334914 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.334497 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-log-socket\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.334914 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.334525 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-cni-bin\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.334914 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.334536 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-plvj4" Apr 20 14:55:23.334914 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.334555 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-run-netns\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.334914 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.334715 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7hcl\" (UniqueName: \"kubernetes.io/projected/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-kube-api-access-z7hcl\") pod \"network-metrics-daemon-h24vc\" (UID: \"90e2aae6-6b60-4b8e-a0ba-12474f425b1d\") " pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:23.334914 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.334863 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/61708c39-4987-438d-b51f-59e8cd1a1e59-os-release\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n" Apr 20 14:55:23.334914 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.334897 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/61708c39-4987-438d-b51f-59e8cd1a1e59-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n" Apr 20 14:55:23.335632 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.335010 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ea32ceac-045d-412e-95db-ec7a62502246-iptables-alerter-script\") pod \"iptables-alerter-xm99w\" (UID: \"ea32ceac-045d-412e-95db-ec7a62502246\") " pod="openshift-network-operator/iptables-alerter-xm99w" Apr 20 14:55:23.335632 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.335042 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-kubelet\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.335632 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.335088 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-slash\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.335632 
ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.335127 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-run-openvswitch\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.335632 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.335164 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.335632 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.335206 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-run-ovn\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.335632 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.335236 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e6562aeb-103a-4d96-b5d3-356a382186d6-env-overrides\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.335632 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.335270 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/772f88da-629b-4161-9ed5-8a916387c9bd-tmp-dir\") pod 
\"node-resolver-mj558\" (UID: \"772f88da-629b-4161-9ed5-8a916387c9bd\") " pod="openshift-dns/node-resolver-mj558"
Apr 20 14:55:23.335632 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.335322 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/61708c39-4987-438d-b51f-59e8cd1a1e59-tuning-conf-dir\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n"
Apr 20 14:55:23.335632 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.335355 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ea32ceac-045d-412e-95db-ec7a62502246-host-slash\") pod \"iptables-alerter-xm99w\" (UID: \"ea32ceac-045d-412e-95db-ec7a62502246\") " pod="openshift-network-operator/iptables-alerter-xm99w"
Apr 20 14:55:23.335632 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.335399 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.335632 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.335442 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e6562aeb-103a-4d96-b5d3-356a382186d6-ovnkube-config\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.335632 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.335470 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e6562aeb-103a-4d96-b5d3-356a382186d6-ovn-node-metrics-cert\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.335632 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.335497 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e6562aeb-103a-4d96-b5d3-356a382186d6-ovnkube-script-lib\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.335632 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.335522 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjrwx\" (UniqueName: \"kubernetes.io/projected/e6562aeb-103a-4d96-b5d3-356a382186d6-kube-api-access-rjrwx\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.337202 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.337170 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 14:55:23.337303 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.337214 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-lb9tw\""
Apr 20 14:55:23.337303 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.337294 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv"
Apr 20 14:55:23.337495 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.337475 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 14:55:23.339432 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.339412 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 14:55:23.339578 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.339560 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 14:55:23.339662 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.339651 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-jc2sv\""
Apr 20 14:55:23.339781 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.339712 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 14:55:23.358118 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.358100 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 14:55:23.362463 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.362435 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:50:22 +0000 UTC" deadline="2027-11-18 10:08:02.270097727 +0000 UTC"
Apr 20 14:55:23.362561 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.362463 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13843h12m38.907637564s"
Apr 20 14:55:23.423563 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.423544 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 14:55:23.436078 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436058 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjrwx\" (UniqueName: \"kubernetes.io/projected/e6562aeb-103a-4d96-b5d3-356a382186d6-kube-api-access-rjrwx\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.436177 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436088 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-host-run-k8s-cni-cncf-io\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt"
Apr 20 14:55:23.436177 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436111 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-host-run-multus-certs\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt"
Apr 20 14:55:23.436300 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436283 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e656ecc0-7223-45e9-8f4c-15a416238cc3-sys-fs\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv"
Apr 20 14:55:23.436359 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436339 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-run-systemd\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.436434 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436384 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10081761-39cd-4657-8ccf-94426cfd0833-host\") pod \"node-ca-qxsfj\" (UID: \"10081761-39cd-4657-8ccf-94426cfd0833\") " pod="openshift-image-registry/node-ca-qxsfj"
Apr 20 14:55:23.436434 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436406 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-run-systemd\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.436434 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436415 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/10081761-39cd-4657-8ccf-94426cfd0833-serviceca\") pod \"node-ca-qxsfj\" (UID: \"10081761-39cd-4657-8ccf-94426cfd0833\") " pod="openshift-image-registry/node-ca-qxsfj"
Apr 20 14:55:23.436571 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436445 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-run\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt"
Apr 20 14:55:23.436571 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436464 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-tuned\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt"
Apr 20 14:55:23.436571 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436475 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/10081761-39cd-4657-8ccf-94426cfd0833-host\") pod \"node-ca-qxsfj\" (UID: \"10081761-39cd-4657-8ccf-94426cfd0833\") " pod="openshift-image-registry/node-ca-qxsfj"
Apr 20 14:55:23.436571 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436479 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3fea6c02-a196-43fc-bb1f-edc946b98e7f-tmp\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt"
Apr 20 14:55:23.436571 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436503 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7k5q\" (UniqueName: \"kubernetes.io/projected/3fea6c02-a196-43fc-bb1f-edc946b98e7f-kube-api-access-z7k5q\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt"
Apr 20 14:55:23.436571 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436541 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6658032c-bba0-4e90-8a55-840d8cdab9e3-konnectivity-ca\") pod \"konnectivity-agent-plvj4\" (UID: \"6658032c-bba0-4e90-8a55-840d8cdab9e3\") " pod="kube-system/konnectivity-agent-plvj4"
Apr 20 14:55:23.436571 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436559 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/61708c39-4987-438d-b51f-59e8cd1a1e59-system-cni-dir\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n"
Apr 20 14:55:23.436896 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436579 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gd5jg\" (UniqueName: \"kubernetes.io/projected/ea32ceac-045d-412e-95db-ec7a62502246-kube-api-access-gd5jg\") pod \"iptables-alerter-xm99w\" (UID: \"ea32ceac-045d-412e-95db-ec7a62502246\") " pod="openshift-network-operator/iptables-alerter-xm99w"
Apr 20 14:55:23.436896 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436593 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-node-log\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.436896 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436607 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-var-lib-kubelet\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt"
Apr 20 14:55:23.436896 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436622 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-sysconfig\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt"
Apr 20 14:55:23.436896 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436636 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-sysctl-d\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt"
Apr 20 14:55:23.436896 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436665 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-node-log\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.436896 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436722 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/61708c39-4987-438d-b51f-59e8cd1a1e59-system-cni-dir\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n"
Apr 20 14:55:23.436896 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436738 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-os-release\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt"
Apr 20 14:55:23.436896 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436770 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlzk6\" (UniqueName: \"kubernetes.io/projected/e656ecc0-7223-45e9-8f4c-15a416238cc3-kube-api-access-rlzk6\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv"
Apr 20 14:55:23.436896 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436794 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/61708c39-4987-438d-b51f-59e8cd1a1e59-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n"
Apr 20 14:55:23.436896 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.436830 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/10081761-39cd-4657-8ccf-94426cfd0833-serviceca\") pod \"node-ca-qxsfj\" (UID: \"10081761-39cd-4657-8ccf-94426cfd0833\") " pod="openshift-image-registry/node-ca-qxsfj"
Apr 20 14:55:23.437249 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437225 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-cni-bin\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.437306 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437269 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-cni-bin\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.437306 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437280 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-multus-daemon-config\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt"
Apr 20 14:55:23.437425 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437313 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e656ecc0-7223-45e9-8f4c-15a416238cc3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv"
Apr 20 14:55:23.437425 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437343 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-run-netns\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.437425 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437395 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7hcl\" (UniqueName: \"kubernetes.io/projected/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-kube-api-access-z7hcl\") pod \"network-metrics-daemon-h24vc\" (UID: \"90e2aae6-6b60-4b8e-a0ba-12474f425b1d\") " pod="openshift-multus/network-metrics-daemon-h24vc"
Apr 20 14:55:23.437425 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437410 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-run-netns\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.437615 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437425 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cckvm\" (UniqueName: \"kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm\") pod \"network-check-target-j2mjp\" (UID: \"0ca0f6c2-6280-464c-8916-90374e2c88b8\") " pod="openshift-network-diagnostics/network-check-target-j2mjp"
Apr 20 14:55:23.437615 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437450 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-cni-binary-copy\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt"
Apr 20 14:55:23.437615 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437483 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-kubelet\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.437615 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437507 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-slash\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.437615 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437540 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-kubelet\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.437615 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437544 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-run-openvswitch\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.437615 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437572 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e656ecc0-7223-45e9-8f4c-15a416238cc3-etc-selinux\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv"
Apr 20 14:55:23.437615 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437570 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-slash\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.437615 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437608 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e6562aeb-103a-4d96-b5d3-356a382186d6-env-overrides\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.437615 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437614 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-run-openvswitch\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.437941 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437633 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ea32ceac-045d-412e-95db-ec7a62502246-host-slash\") pod \"iptables-alerter-xm99w\" (UID: \"ea32ceac-045d-412e-95db-ec7a62502246\") " pod="openshift-network-operator/iptables-alerter-xm99w"
Apr 20 14:55:23.437941 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437683 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ea32ceac-045d-412e-95db-ec7a62502246-host-slash\") pod \"iptables-alerter-xm99w\" (UID: \"ea32ceac-045d-412e-95db-ec7a62502246\") " pod="openshift-network-operator/iptables-alerter-xm99w"
Apr 20 14:55:23.437941 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437701 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/61708c39-4987-438d-b51f-59e8cd1a1e59-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n"
Apr 20 14:55:23.437941 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437713 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.437941 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437756 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e6562aeb-103a-4d96-b5d3-356a382186d6-ovnkube-config\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.437941 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437784 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-systemd\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt"
Apr 20 14:55:23.437941 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437765 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.437941 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437836 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e656ecc0-7223-45e9-8f4c-15a416238cc3-socket-dir\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv"
Apr 20 14:55:23.437941 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437863 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/772f88da-629b-4161-9ed5-8a916387c9bd-hosts-file\") pod \"node-resolver-mj558\" (UID: \"772f88da-629b-4161-9ed5-8a916387c9bd\") " pod="openshift-dns/node-resolver-mj558"
Apr 20 14:55:23.437941 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437893 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-79fbt\" (UniqueName: \"kubernetes.io/projected/772f88da-629b-4161-9ed5-8a916387c9bd-kube-api-access-79fbt\") pod \"node-resolver-mj558\" (UID: \"772f88da-629b-4161-9ed5-8a916387c9bd\") " pod="openshift-dns/node-resolver-mj558"
Apr 20 14:55:23.438305 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437948 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/772f88da-629b-4161-9ed5-8a916387c9bd-hosts-file\") pod \"node-resolver-mj558\" (UID: \"772f88da-629b-4161-9ed5-8a916387c9bd\") " pod="openshift-dns/node-resolver-mj558"
Apr 20 14:55:23.438305 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437975 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/61708c39-4987-438d-b51f-59e8cd1a1e59-cni-binary-copy\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n"
Apr 20 14:55:23.438305 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.437992 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shnmb\" (UniqueName: \"kubernetes.io/projected/61708c39-4987-438d-b51f-59e8cd1a1e59-kube-api-access-shnmb\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n"
Apr 20 14:55:23.438305 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.438013 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs\") pod \"network-metrics-daemon-h24vc\" (UID: \"90e2aae6-6b60-4b8e-a0ba-12474f425b1d\") " pod="openshift-multus/network-metrics-daemon-h24vc"
Apr 20 14:55:23.438305 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.438032 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dv6vg\" (UniqueName: \"kubernetes.io/projected/10081761-39cd-4657-8ccf-94426cfd0833-kube-api-access-dv6vg\") pod \"node-ca-qxsfj\" (UID: \"10081761-39cd-4657-8ccf-94426cfd0833\") " pod="openshift-image-registry/node-ca-qxsfj"
Apr 20 14:55:23.438305 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.438060 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e6562aeb-103a-4d96-b5d3-356a382186d6-env-overrides\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.438305 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.438077 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6658032c-bba0-4e90-8a55-840d8cdab9e3-agent-certs\") pod \"konnectivity-agent-plvj4\" (UID: \"6658032c-bba0-4e90-8a55-840d8cdab9e3\") " pod="kube-system/konnectivity-agent-plvj4"
Apr 20 14:55:23.438305 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.438100 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-hostroot\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt"
Apr 20 14:55:23.438305 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.438127 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-cni-netd\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.438305 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:23.438144 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:55:23.438790 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.438733 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-sys\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt"
Apr 20 14:55:23.438790 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.438759 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e6562aeb-103a-4d96-b5d3-356a382186d6-ovnkube-config\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.438890 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:23.438814 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs podName:90e2aae6-6b60-4b8e-a0ba-12474f425b1d nodeName:}" failed. No retries permitted until 2026-04-20 14:55:23.938709258 +0000 UTC m=+3.011984869 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs") pod "network-metrics-daemon-h24vc" (UID: "90e2aae6-6b60-4b8e-a0ba-12474f425b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 14:55:23.438890 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.438181 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-cni-netd\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.438890 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.438876 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-multus-socket-dir-parent\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt"
Apr 20 14:55:23.439032 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.438918 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-host-var-lib-cni-bin\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt"
Apr 20 14:55:23.439032 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.438983 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-multus-conf-dir\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt"
Apr 20 14:55:23.439123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439044 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vrl2\" (UniqueName: \"kubernetes.io/projected/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-kube-api-access-4vrl2\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt"
Apr 20 14:55:23.439123 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439045 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/61708c39-4987-438d-b51f-59e8cd1a1e59-cni-binary-copy\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n"
Apr 20 14:55:23.439211 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439141 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-systemd-units\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.439211 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439175 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-modprobe-d\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt"
Apr 20 14:55:23.439211 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439204 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-kubernetes\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt"
Apr 20 14:55:23.439344 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439232 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-sysctl-conf\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt"
Apr 20 14:55:23.439344 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439257 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-etc-kubernetes\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt"
Apr 20 14:55:23.439344 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439288 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/61708c39-4987-438d-b51f-59e8cd1a1e59-cnibin\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n"
Apr 20 14:55:23.439344 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439318 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-var-lib-openvswitch\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.439531 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439347 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-etc-openvswitch\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.439531 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439391 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-log-socket\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd"
Apr 20 14:55:23.439531 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439426 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-host\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt"
Apr 20 14:55:23.439531 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439453 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-cnibin\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt"
Apr 20 14:55:23.439531 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439476 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-host-run-netns\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt"
Apr 20 14:55:23.439531 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439503 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName:
\"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-host-var-lib-kubelet\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.439531 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439530 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-lib-modules\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.439807 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439568 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-host-var-lib-cni-multus\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.439807 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439596 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e656ecc0-7223-45e9-8f4c-15a416238cc3-registration-dir\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" Apr 20 14:55:23.439807 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439622 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/61708c39-4987-438d-b51f-59e8cd1a1e59-os-release\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n" Apr 20 14:55:23.439807 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439655 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/61708c39-4987-438d-b51f-59e8cd1a1e59-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n" Apr 20 14:55:23.439807 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439686 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ea32ceac-045d-412e-95db-ec7a62502246-iptables-alerter-script\") pod \"iptables-alerter-xm99w\" (UID: \"ea32ceac-045d-412e-95db-ec7a62502246\") " pod="openshift-network-operator/iptables-alerter-xm99w" Apr 20 14:55:23.439807 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439721 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.439807 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439759 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-multus-cni-dir\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.439807 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439787 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-run-ovn\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.440138 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439817 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-system-cni-dir\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.440138 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439846 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e656ecc0-7223-45e9-8f4c-15a416238cc3-device-dir\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" Apr 20 14:55:23.440138 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439878 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/772f88da-629b-4161-9ed5-8a916387c9bd-tmp-dir\") pod \"node-resolver-mj558\" (UID: \"772f88da-629b-4161-9ed5-8a916387c9bd\") " pod="openshift-dns/node-resolver-mj558" Apr 20 14:55:23.440138 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439907 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/61708c39-4987-438d-b51f-59e8cd1a1e59-tuning-conf-dir\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n" Apr 20 14:55:23.440138 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439941 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e6562aeb-103a-4d96-b5d3-356a382186d6-ovn-node-metrics-cert\") pod 
\"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.440138 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.439969 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e6562aeb-103a-4d96-b5d3-356a382186d6-ovnkube-script-lib\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.440938 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.440550 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e6562aeb-103a-4d96-b5d3-356a382186d6-ovnkube-script-lib\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.440938 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.440630 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-systemd-units\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.440938 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.440725 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/61708c39-4987-438d-b51f-59e8cd1a1e59-cnibin\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n" Apr 20 14:55:23.440938 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.440773 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-var-lib-openvswitch\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.440938 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.440822 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-etc-openvswitch\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.440938 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.440867 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-log-socket\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.441223 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.440999 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/61708c39-4987-438d-b51f-59e8cd1a1e59-os-release\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n" Apr 20 14:55:23.441467 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.441424 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/61708c39-4987-438d-b51f-59e8cd1a1e59-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n" Apr 20 14:55:23.441559 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.441508 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.441623 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.441566 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/61708c39-4987-438d-b51f-59e8cd1a1e59-tuning-conf-dir\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n" Apr 20 14:55:23.441669 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.441612 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e6562aeb-103a-4d96-b5d3-356a382186d6-run-ovn\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.442008 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.441992 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/772f88da-629b-4161-9ed5-8a916387c9bd-tmp-dir\") pod \"node-resolver-mj558\" (UID: \"772f88da-629b-4161-9ed5-8a916387c9bd\") " pod="openshift-dns/node-resolver-mj558" Apr 20 14:55:23.442244 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.442189 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/ea32ceac-045d-412e-95db-ec7a62502246-iptables-alerter-script\") pod \"iptables-alerter-xm99w\" (UID: \"ea32ceac-045d-412e-95db-ec7a62502246\") " pod="openshift-network-operator/iptables-alerter-xm99w" Apr 20 14:55:23.442316 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.442299 2574 swap_util.go:74] 
"error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 14:55:23.446604 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.446582 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e6562aeb-103a-4d96-b5d3-356a382186d6-ovn-node-metrics-cert\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.449778 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.449752 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjrwx\" (UniqueName: \"kubernetes.io/projected/e6562aeb-103a-4d96-b5d3-356a382186d6-kube-api-access-rjrwx\") pod \"ovnkube-node-5vmgd\" (UID: \"e6562aeb-103a-4d96-b5d3-356a382186d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.450142 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.450124 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv6vg\" (UniqueName: \"kubernetes.io/projected/10081761-39cd-4657-8ccf-94426cfd0833-kube-api-access-dv6vg\") pod \"node-ca-qxsfj\" (UID: \"10081761-39cd-4657-8ccf-94426cfd0833\") " pod="openshift-image-registry/node-ca-qxsfj" Apr 20 14:55:23.450578 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.450555 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7hcl\" (UniqueName: \"kubernetes.io/projected/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-kube-api-access-z7hcl\") pod \"network-metrics-daemon-h24vc\" (UID: \"90e2aae6-6b60-4b8e-a0ba-12474f425b1d\") " pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:23.450727 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.450710 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shnmb\" 
(UniqueName: \"kubernetes.io/projected/61708c39-4987-438d-b51f-59e8cd1a1e59-kube-api-access-shnmb\") pod \"multus-additional-cni-plugins-blx8n\" (UID: \"61708c39-4987-438d-b51f-59e8cd1a1e59\") " pod="openshift-multus/multus-additional-cni-plugins-blx8n" Apr 20 14:55:23.451984 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.451953 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-79fbt\" (UniqueName: \"kubernetes.io/projected/772f88da-629b-4161-9ed5-8a916387c9bd-kube-api-access-79fbt\") pod \"node-resolver-mj558\" (UID: \"772f88da-629b-4161-9ed5-8a916387c9bd\") " pod="openshift-dns/node-resolver-mj558" Apr 20 14:55:23.452159 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.452141 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd5jg\" (UniqueName: \"kubernetes.io/projected/ea32ceac-045d-412e-95db-ec7a62502246-kube-api-access-gd5jg\") pod \"iptables-alerter-xm99w\" (UID: \"ea32ceac-045d-412e-95db-ec7a62502246\") " pod="openshift-network-operator/iptables-alerter-xm99w" Apr 20 14:55:23.540486 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540456 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-os-release\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.540486 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540490 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlzk6\" (UniqueName: \"kubernetes.io/projected/e656ecc0-7223-45e9-8f4c-15a416238cc3-kube-api-access-rlzk6\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" Apr 20 14:55:23.540719 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540510 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-multus-daemon-config\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.540719 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540529 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e656ecc0-7223-45e9-8f4c-15a416238cc3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" Apr 20 14:55:23.540719 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540552 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cckvm\" (UniqueName: \"kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm\") pod \"network-check-target-j2mjp\" (UID: \"0ca0f6c2-6280-464c-8916-90374e2c88b8\") " pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:23.540719 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540570 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-cni-binary-copy\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.540719 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540593 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e656ecc0-7223-45e9-8f4c-15a416238cc3-etc-selinux\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" Apr 20 
14:55:23.540719 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540603 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-os-release\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.540719 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540612 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-systemd\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.540719 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540668 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e656ecc0-7223-45e9-8f4c-15a416238cc3-socket-dir\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" Apr 20 14:55:23.540719 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540677 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/e656ecc0-7223-45e9-8f4c-15a416238cc3-etc-selinux\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" Apr 20 14:55:23.540719 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540680 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e656ecc0-7223-45e9-8f4c-15a416238cc3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" Apr 20 14:55:23.540719 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540703 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-systemd\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.540719 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540719 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6658032c-bba0-4e90-8a55-840d8cdab9e3-agent-certs\") pod \"konnectivity-agent-plvj4\" (UID: \"6658032c-bba0-4e90-8a55-840d8cdab9e3\") " pod="kube-system/konnectivity-agent-plvj4" Apr 20 14:55:23.541261 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540743 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-hostroot\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.541261 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540771 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-sys\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.541261 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540784 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e656ecc0-7223-45e9-8f4c-15a416238cc3-socket-dir\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" Apr 20 14:55:23.541261 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540794 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-multus-socket-dir-parent\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.541261 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540828 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-hostroot\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.541261 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540837 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-host-var-lib-cni-bin\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.541261 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540846 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-sys\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.541261 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540864 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-multus-conf-dir\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 
14:55:23.541261 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540887 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-host-var-lib-cni-bin\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.541261 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540888 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vrl2\" (UniqueName: \"kubernetes.io/projected/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-kube-api-access-4vrl2\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.541261 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540904 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-multus-socket-dir-parent\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.541261 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540911 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-multus-conf-dir\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.541261 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540929 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-modprobe-d\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.541261 ip-10-0-130-249 
kubenswrapper[2574]: I0420 14:55:23.540948 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-kubernetes\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.541261 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.540963 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-sysctl-conf\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.541261 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541010 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-etc-kubernetes\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.541261 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541043 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-host\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.541261 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541067 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-cnibin\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.542020 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541078 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-modprobe-d\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.542020 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541090 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-host-run-netns\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.542020 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541115 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-host-var-lib-kubelet\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.542020 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541126 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-host\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.542020 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541138 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-lib-modules\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.542020 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541140 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-etc-kubernetes\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.542020 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541153 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-sysctl-conf\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.542020 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541163 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-host-run-netns\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.542020 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541175 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-host-var-lib-cni-multus\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.542020 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541206 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e656ecc0-7223-45e9-8f4c-15a416238cc3-registration-dir\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" Apr 20 14:55:23.542020 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541214 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-kubernetes\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.542020 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541219 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-lib-modules\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.542020 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541247 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-host-var-lib-kubelet\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.542020 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541254 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-host-var-lib-cni-multus\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.542020 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541271 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-cnibin\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.542020 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541305 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-multus-cni-dir\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.542020 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541314 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e656ecc0-7223-45e9-8f4c-15a416238cc3-registration-dir\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" Apr 20 14:55:23.542020 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541352 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-multus-cni-dir\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.542806 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541350 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-system-cni-dir\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.542806 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541403 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e656ecc0-7223-45e9-8f4c-15a416238cc3-device-dir\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" Apr 20 14:55:23.542806 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541418 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-system-cni-dir\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.542806 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541423 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-host-run-k8s-cni-cncf-io\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.542806 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541452 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-host-run-multus-certs\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.542806 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541446 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-host-run-k8s-cni-cncf-io\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.542806 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541477 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e656ecc0-7223-45e9-8f4c-15a416238cc3-sys-fs\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" Apr 20 14:55:23.542806 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541484 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-cni-binary-copy\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.542806 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541493 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-multus-daemon-config\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.542806 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541497 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/e656ecc0-7223-45e9-8f4c-15a416238cc3-device-dir\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" Apr 20 14:55:23.542806 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541502 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-run\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.542806 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541533 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-run\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.542806 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541545 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-tuned\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.542806 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541507 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-host-run-multus-certs\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.542806 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541570 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3fea6c02-a196-43fc-bb1f-edc946b98e7f-tmp\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.542806 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541546 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/e656ecc0-7223-45e9-8f4c-15a416238cc3-sys-fs\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" Apr 20 14:55:23.542806 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541594 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z7k5q\" (UniqueName: \"kubernetes.io/projected/3fea6c02-a196-43fc-bb1f-edc946b98e7f-kube-api-access-z7k5q\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.542806 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541620 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/6658032c-bba0-4e90-8a55-840d8cdab9e3-konnectivity-ca\") pod \"konnectivity-agent-plvj4\" (UID: \"6658032c-bba0-4e90-8a55-840d8cdab9e3\") " pod="kube-system/konnectivity-agent-plvj4" Apr 20 14:55:23.543616 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541646 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-var-lib-kubelet\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.543616 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541670 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-sysconfig\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.543616 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541691 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-sysctl-d\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.543616 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541830 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-sysctl-d\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.543616 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541881 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-var-lib-kubelet\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.543616 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.541894 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-sysconfig\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.543616 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.542190 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6658032c-bba0-4e90-8a55-840d8cdab9e3-konnectivity-ca\") pod \"konnectivity-agent-plvj4\" (UID: \"6658032c-bba0-4e90-8a55-840d8cdab9e3\") " pod="kube-system/konnectivity-agent-plvj4" Apr 20 14:55:23.543947 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.543710 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6658032c-bba0-4e90-8a55-840d8cdab9e3-agent-certs\") pod \"konnectivity-agent-plvj4\" (UID: \"6658032c-bba0-4e90-8a55-840d8cdab9e3\") " pod="kube-system/konnectivity-agent-plvj4" Apr 20 14:55:23.543947 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.543921 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3fea6c02-a196-43fc-bb1f-edc946b98e7f-tmp\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.544449 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.544426 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/3fea6c02-a196-43fc-bb1f-edc946b98e7f-etc-tuned\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.546937 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:23.546688 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:55:23.546937 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:23.546711 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:55:23.546937 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:23.546725 2574 projected.go:194] Error preparing data for projected volume kube-api-access-cckvm for pod openshift-network-diagnostics/network-check-target-j2mjp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:23.546937 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:23.546783 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm podName:0ca0f6c2-6280-464c-8916-90374e2c88b8 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:24.046763693 +0000 UTC m=+3.120039314 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cckvm" (UniqueName: "kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm") pod "network-check-target-j2mjp" (UID: "0ca0f6c2-6280-464c-8916-90374e2c88b8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:23.549092 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.549069 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7k5q\" (UniqueName: \"kubernetes.io/projected/3fea6c02-a196-43fc-bb1f-edc946b98e7f-kube-api-access-z7k5q\") pod \"tuned-2xvqt\" (UID: \"3fea6c02-a196-43fc-bb1f-edc946b98e7f\") " pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.549438 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.549415 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlzk6\" (UniqueName: \"kubernetes.io/projected/e656ecc0-7223-45e9-8f4c-15a416238cc3-kube-api-access-rlzk6\") pod \"aws-ebs-csi-driver-node-sttsv\" (UID: \"e656ecc0-7223-45e9-8f4c-15a416238cc3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" Apr 20 14:55:23.549438 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.549431 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vrl2\" (UniqueName: \"kubernetes.io/projected/09f1e2bc-4d9a-4838-b68f-01c2612ca3af-kube-api-access-4vrl2\") pod \"multus-zlwvt\" (UID: \"09f1e2bc-4d9a-4838-b68f-01c2612ca3af\") " pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.628981 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.628852 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qxsfj" Apr 20 14:55:23.639608 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.639586 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-xm99w" Apr 20 14:55:23.647331 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.647313 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:23.652861 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.652840 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mj558" Apr 20 14:55:23.660361 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.660343 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-blx8n" Apr 20 14:55:23.665871 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.665856 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-zlwvt" Apr 20 14:55:23.672431 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.672414 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" Apr 20 14:55:23.678893 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.678877 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-plvj4" Apr 20 14:55:23.683432 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.683413 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" Apr 20 14:55:23.944205 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:23.944128 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs\") pod \"network-metrics-daemon-h24vc\" (UID: \"90e2aae6-6b60-4b8e-a0ba-12474f425b1d\") " pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:23.944392 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:23.944295 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:23.944392 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:23.944385 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs podName:90e2aae6-6b60-4b8e-a0ba-12474f425b1d nodeName:}" failed. No retries permitted until 2026-04-20 14:55:24.944349163 +0000 UTC m=+4.017624770 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs") pod "network-metrics-daemon-h24vc" (UID: "90e2aae6-6b60-4b8e-a0ba-12474f425b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:24.019663 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:24.019641 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6562aeb_103a_4d96_b5d3_356a382186d6.slice/crio-07690ace81997013258a8e288fd5f287c2b7e52ae52d87c78de1796d691ff6a2 WatchSource:0}: Error finding container 07690ace81997013258a8e288fd5f287c2b7e52ae52d87c78de1796d691ff6a2: Status 404 returned error can't find the container with id 07690ace81997013258a8e288fd5f287c2b7e52ae52d87c78de1796d691ff6a2 Apr 20 14:55:24.021324 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:24.021299 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea32ceac_045d_412e_95db_ec7a62502246.slice/crio-01930f62e34cba59119db12b2d5282086ce836638d3cdc4df844305607e5c3bb WatchSource:0}: Error finding container 01930f62e34cba59119db12b2d5282086ce836638d3cdc4df844305607e5c3bb: Status 404 returned error can't find the container with id 01930f62e34cba59119db12b2d5282086ce836638d3cdc4df844305607e5c3bb Apr 20 14:55:24.022783 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:24.022756 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09f1e2bc_4d9a_4838_b68f_01c2612ca3af.slice/crio-f8c63a1b3216096ab197ea17be2e7923d2f9cf9723f6da99cf0401e89b9cedbb WatchSource:0}: Error finding container f8c63a1b3216096ab197ea17be2e7923d2f9cf9723f6da99cf0401e89b9cedbb: Status 404 returned error can't find the container with id f8c63a1b3216096ab197ea17be2e7923d2f9cf9723f6da99cf0401e89b9cedbb Apr 20 14:55:24.024971 
ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:24.024949 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61708c39_4987_438d_b51f_59e8cd1a1e59.slice/crio-b32f4603a42f1e43093aaf6b273b9a183fdc6b4535ea5965bb259e1a1d8cc2b3 WatchSource:0}: Error finding container b32f4603a42f1e43093aaf6b273b9a183fdc6b4535ea5965bb259e1a1d8cc2b3: Status 404 returned error can't find the container with id b32f4603a42f1e43093aaf6b273b9a183fdc6b4535ea5965bb259e1a1d8cc2b3 Apr 20 14:55:24.025298 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:24.025274 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10081761_39cd_4657_8ccf_94426cfd0833.slice/crio-fabd13721460f52ce88007455493a2b9566eebb0e2ec7cffe488aa1d644a7a94 WatchSource:0}: Error finding container fabd13721460f52ce88007455493a2b9566eebb0e2ec7cffe488aa1d644a7a94: Status 404 returned error can't find the container with id fabd13721460f52ce88007455493a2b9566eebb0e2ec7cffe488aa1d644a7a94 Apr 20 14:55:24.026599 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:24.026575 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode656ecc0_7223_45e9_8f4c_15a416238cc3.slice/crio-8bdcfe099da12d89e690483c762246aba3904b2d02715270d9415a3174838217 WatchSource:0}: Error finding container 8bdcfe099da12d89e690483c762246aba3904b2d02715270d9415a3174838217: Status 404 returned error can't find the container with id 8bdcfe099da12d89e690483c762246aba3904b2d02715270d9415a3174838217 Apr 20 14:55:24.027784 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:24.027706 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod772f88da_629b_4161_9ed5_8a916387c9bd.slice/crio-2222681d179c4afae7e5ca8b6e8e4491606d9d83e6998e6b4f4cb9d785f2519e WatchSource:0}: 
Error finding container 2222681d179c4afae7e5ca8b6e8e4491606d9d83e6998e6b4f4cb9d785f2519e: Status 404 returned error can't find the container with id 2222681d179c4afae7e5ca8b6e8e4491606d9d83e6998e6b4f4cb9d785f2519e Apr 20 14:55:24.028166 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:24.028141 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6658032c_bba0_4e90_8a55_840d8cdab9e3.slice/crio-ee11f531e450165f7db3972c30161c51eb219b4a9dd8d4e1ff7727c15adccd37 WatchSource:0}: Error finding container ee11f531e450165f7db3972c30161c51eb219b4a9dd8d4e1ff7727c15adccd37: Status 404 returned error can't find the container with id ee11f531e450165f7db3972c30161c51eb219b4a9dd8d4e1ff7727c15adccd37 Apr 20 14:55:24.029841 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:24.029704 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fea6c02_a196_43fc_bb1f_edc946b98e7f.slice/crio-1fbd8db37289a13f84da669869be832d3fbc3e088239ce2bda0d98c8078f0510 WatchSource:0}: Error finding container 1fbd8db37289a13f84da669869be832d3fbc3e088239ce2bda0d98c8078f0510: Status 404 returned error can't find the container with id 1fbd8db37289a13f84da669869be832d3fbc3e088239ce2bda0d98c8078f0510 Apr 20 14:55:24.145627 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:24.145604 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cckvm\" (UniqueName: \"kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm\") pod \"network-check-target-j2mjp\" (UID: \"0ca0f6c2-6280-464c-8916-90374e2c88b8\") " pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:24.145736 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:24.145726 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:55:24.145775 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:24.145739 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:55:24.145775 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:24.145747 2574 projected.go:194] Error preparing data for projected volume kube-api-access-cckvm for pod openshift-network-diagnostics/network-check-target-j2mjp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:24.145838 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:24.145787 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm podName:0ca0f6c2-6280-464c-8916-90374e2c88b8 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:25.145774368 +0000 UTC m=+4.219049971 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cckvm" (UniqueName: "kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm") pod "network-check-target-j2mjp" (UID: "0ca0f6c2-6280-464c-8916-90374e2c88b8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:24.363059 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:24.362984 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:50:22 +0000 UTC" deadline="2027-10-13 17:30:09.088812622 +0000 UTC" Apr 20 14:55:24.363059 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:24.363017 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12986h34m44.725798905s" Apr 20 14:55:24.453674 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:24.452948 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:24.453674 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:24.453106 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j2mjp" podUID="0ca0f6c2-6280-464c-8916-90374e2c88b8" Apr 20 14:55:24.464011 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:24.463859 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-249.ec2.internal" event={"ID":"75684a0b9a7080a9984ae7578d5b190b","Type":"ContainerStarted","Data":"ad49fa413abb3ecfb474e4bd7e9b6a8682cfe560564d2021fbc6ae91ea46ca9e"} Apr 20 14:55:24.468537 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:24.468508 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" event={"ID":"3fea6c02-a196-43fc-bb1f-edc946b98e7f","Type":"ContainerStarted","Data":"1fbd8db37289a13f84da669869be832d3fbc3e088239ce2bda0d98c8078f0510"} Apr 20 14:55:24.473743 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:24.473694 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-plvj4" event={"ID":"6658032c-bba0-4e90-8a55-840d8cdab9e3","Type":"ContainerStarted","Data":"ee11f531e450165f7db3972c30161c51eb219b4a9dd8d4e1ff7727c15adccd37"} Apr 20 14:55:24.476837 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:24.476806 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mj558" event={"ID":"772f88da-629b-4161-9ed5-8a916387c9bd","Type":"ContainerStarted","Data":"2222681d179c4afae7e5ca8b6e8e4491606d9d83e6998e6b4f4cb9d785f2519e"} Apr 20 14:55:24.480544 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:24.480503 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" event={"ID":"e656ecc0-7223-45e9-8f4c-15a416238cc3","Type":"ContainerStarted","Data":"8bdcfe099da12d89e690483c762246aba3904b2d02715270d9415a3174838217"} Apr 20 14:55:24.483090 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:24.483029 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-blx8n" event={"ID":"61708c39-4987-438d-b51f-59e8cd1a1e59","Type":"ContainerStarted","Data":"b32f4603a42f1e43093aaf6b273b9a183fdc6b4535ea5965bb259e1a1d8cc2b3"} Apr 20 14:55:24.491529 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:24.491484 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xm99w" event={"ID":"ea32ceac-045d-412e-95db-ec7a62502246","Type":"ContainerStarted","Data":"01930f62e34cba59119db12b2d5282086ce836638d3cdc4df844305607e5c3bb"} Apr 20 14:55:24.496306 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:24.496253 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" event={"ID":"e6562aeb-103a-4d96-b5d3-356a382186d6","Type":"ContainerStarted","Data":"07690ace81997013258a8e288fd5f287c2b7e52ae52d87c78de1796d691ff6a2"} Apr 20 14:55:24.500061 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:24.500036 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qxsfj" event={"ID":"10081761-39cd-4657-8ccf-94426cfd0833","Type":"ContainerStarted","Data":"fabd13721460f52ce88007455493a2b9566eebb0e2ec7cffe488aa1d644a7a94"} Apr 20 14:55:24.501770 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:24.501747 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zlwvt" event={"ID":"09f1e2bc-4d9a-4838-b68f-01c2612ca3af","Type":"ContainerStarted","Data":"f8c63a1b3216096ab197ea17be2e7923d2f9cf9723f6da99cf0401e89b9cedbb"} Apr 20 14:55:24.966271 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:24.966241 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs\") pod \"network-metrics-daemon-h24vc\" (UID: \"90e2aae6-6b60-4b8e-a0ba-12474f425b1d\") " pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:24.966418 
ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:24.966403 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:24.966513 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:24.966470 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs podName:90e2aae6-6b60-4b8e-a0ba-12474f425b1d nodeName:}" failed. No retries permitted until 2026-04-20 14:55:26.966450139 +0000 UTC m=+6.039725744 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs") pod "network-metrics-daemon-h24vc" (UID: "90e2aae6-6b60-4b8e-a0ba-12474f425b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:25.168560 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:25.168474 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cckvm\" (UniqueName: \"kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm\") pod \"network-check-target-j2mjp\" (UID: \"0ca0f6c2-6280-464c-8916-90374e2c88b8\") " pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:25.168818 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:25.168797 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:55:25.168915 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:25.168825 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:55:25.168915 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:25.168839 2574 projected.go:194] Error preparing data for projected volume 
kube-api-access-cckvm for pod openshift-network-diagnostics/network-check-target-j2mjp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:25.169024 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:25.168929 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm podName:0ca0f6c2-6280-464c-8916-90374e2c88b8 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:27.168878847 +0000 UTC m=+6.242154456 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cckvm" (UniqueName: "kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm") pod "network-check-target-j2mjp" (UID: "0ca0f6c2-6280-464c-8916-90374e2c88b8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:25.443438 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:25.442442 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:25.443438 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:25.442636 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h24vc" podUID="90e2aae6-6b60-4b8e-a0ba-12474f425b1d" Apr 20 14:55:25.516314 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:25.516278 2574 generic.go:358] "Generic (PLEG): container finished" podID="a68c7dbaa1e3f6bab6aec0a6e079f58c" containerID="15646d88d1613629fbafc0ac3cb70705baf7d6de4ba92ef9bb0d35f0ca53d851" exitCode=0 Apr 20 14:55:25.517405 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:25.517124 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal" event={"ID":"a68c7dbaa1e3f6bab6aec0a6e079f58c","Type":"ContainerDied","Data":"15646d88d1613629fbafc0ac3cb70705baf7d6de4ba92ef9bb0d35f0ca53d851"} Apr 20 14:55:25.532407 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:25.532342 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-249.ec2.internal" podStartSLOduration=3.532325434 podStartE2EDuration="3.532325434s" podCreationTimestamp="2026-04-20 14:55:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:55:24.479974475 +0000 UTC m=+3.553250106" watchObservedRunningTime="2026-04-20 14:55:25.532325434 +0000 UTC m=+4.605601085" Apr 20 14:55:26.443091 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:26.443058 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:26.443251 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:26.443182 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j2mjp" podUID="0ca0f6c2-6280-464c-8916-90374e2c88b8" Apr 20 14:55:26.529260 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:26.528579 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal" event={"ID":"a68c7dbaa1e3f6bab6aec0a6e079f58c","Type":"ContainerStarted","Data":"d8f8ea098c098d92c8a28c93d264e14419812172296b80cfc5aa46898bd72176"} Apr 20 14:55:26.989015 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:26.988854 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs\") pod \"network-metrics-daemon-h24vc\" (UID: \"90e2aae6-6b60-4b8e-a0ba-12474f425b1d\") " pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:26.989175 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:26.989007 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:26.989175 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:26.989086 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs podName:90e2aae6-6b60-4b8e-a0ba-12474f425b1d nodeName:}" failed. No retries permitted until 2026-04-20 14:55:30.989066796 +0000 UTC m=+10.062342411 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs") pod "network-metrics-daemon-h24vc" (UID: "90e2aae6-6b60-4b8e-a0ba-12474f425b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:27.191710 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:27.191034 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cckvm\" (UniqueName: \"kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm\") pod \"network-check-target-j2mjp\" (UID: \"0ca0f6c2-6280-464c-8916-90374e2c88b8\") " pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:27.191710 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:27.191219 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:55:27.191710 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:27.191242 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:55:27.191710 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:27.191255 2574 projected.go:194] Error preparing data for projected volume kube-api-access-cckvm for pod openshift-network-diagnostics/network-check-target-j2mjp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:27.191710 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:27.191316 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm podName:0ca0f6c2-6280-464c-8916-90374e2c88b8 nodeName:}" failed. 
No retries permitted until 2026-04-20 14:55:31.191294836 +0000 UTC m=+10.264570448 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cckvm" (UniqueName: "kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm") pod "network-check-target-j2mjp" (UID: "0ca0f6c2-6280-464c-8916-90374e2c88b8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:27.444729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:27.443123 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:27.444729 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:27.443273 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h24vc" podUID="90e2aae6-6b60-4b8e-a0ba-12474f425b1d" Apr 20 14:55:28.442470 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:28.442430 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:28.442876 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:28.442594 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j2mjp" podUID="0ca0f6c2-6280-464c-8916-90374e2c88b8" Apr 20 14:55:29.443384 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:29.443343 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:29.443800 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:29.443494 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h24vc" podUID="90e2aae6-6b60-4b8e-a0ba-12474f425b1d" Apr 20 14:55:30.442328 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:30.442266 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:30.442524 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:30.442413 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j2mjp" podUID="0ca0f6c2-6280-464c-8916-90374e2c88b8" Apr 20 14:55:31.024699 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:31.024661 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs\") pod \"network-metrics-daemon-h24vc\" (UID: \"90e2aae6-6b60-4b8e-a0ba-12474f425b1d\") " pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:31.025156 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:31.024807 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:31.025156 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:31.024877 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs podName:90e2aae6-6b60-4b8e-a0ba-12474f425b1d nodeName:}" failed. No retries permitted until 2026-04-20 14:55:39.024856096 +0000 UTC m=+18.098131704 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs") pod "network-metrics-daemon-h24vc" (UID: "90e2aae6-6b60-4b8e-a0ba-12474f425b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:31.226724 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:31.226682 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cckvm\" (UniqueName: \"kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm\") pod \"network-check-target-j2mjp\" (UID: \"0ca0f6c2-6280-464c-8916-90374e2c88b8\") " pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:31.226883 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:31.226840 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:55:31.226883 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:31.226861 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:55:31.226883 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:31.226874 2574 projected.go:194] Error preparing data for projected volume kube-api-access-cckvm for pod openshift-network-diagnostics/network-check-target-j2mjp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:31.227011 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:31.226938 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm podName:0ca0f6c2-6280-464c-8916-90374e2c88b8 nodeName:}" failed. 
No retries permitted until 2026-04-20 14:55:39.226918104 +0000 UTC m=+18.300193714 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cckvm" (UniqueName: "kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm") pod "network-check-target-j2mjp" (UID: "0ca0f6c2-6280-464c-8916-90374e2c88b8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:31.444585 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:31.444052 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:31.444585 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:31.444181 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h24vc" podUID="90e2aae6-6b60-4b8e-a0ba-12474f425b1d" Apr 20 14:55:32.443195 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:32.443160 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:32.443642 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:32.443291 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j2mjp" podUID="0ca0f6c2-6280-464c-8916-90374e2c88b8" Apr 20 14:55:33.442461 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:33.442433 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:33.442621 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:33.442533 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h24vc" podUID="90e2aae6-6b60-4b8e-a0ba-12474f425b1d" Apr 20 14:55:34.442175 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:34.442143 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:34.442572 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:34.442251 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j2mjp" podUID="0ca0f6c2-6280-464c-8916-90374e2c88b8" Apr 20 14:55:35.442800 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:35.442772 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:35.443195 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:35.442906 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h24vc" podUID="90e2aae6-6b60-4b8e-a0ba-12474f425b1d" Apr 20 14:55:36.442649 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:36.442626 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:36.442734 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:36.442719 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j2mjp" podUID="0ca0f6c2-6280-464c-8916-90374e2c88b8" Apr 20 14:55:37.442511 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:37.442469 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:37.442920 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:37.442613 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h24vc" podUID="90e2aae6-6b60-4b8e-a0ba-12474f425b1d" Apr 20 14:55:38.443231 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:38.442985 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:38.443867 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:38.443340 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j2mjp" podUID="0ca0f6c2-6280-464c-8916-90374e2c88b8" Apr 20 14:55:39.084683 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:39.084646 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs\") pod \"network-metrics-daemon-h24vc\" (UID: \"90e2aae6-6b60-4b8e-a0ba-12474f425b1d\") " pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:39.084869 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:39.084803 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:39.084936 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:39.084876 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs podName:90e2aae6-6b60-4b8e-a0ba-12474f425b1d nodeName:}" failed. No retries permitted until 2026-04-20 14:55:55.084855248 +0000 UTC m=+34.158130863 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs") pod "network-metrics-daemon-h24vc" (UID: "90e2aae6-6b60-4b8e-a0ba-12474f425b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:39.286476 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:39.286431 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cckvm\" (UniqueName: \"kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm\") pod \"network-check-target-j2mjp\" (UID: \"0ca0f6c2-6280-464c-8916-90374e2c88b8\") " pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:39.286653 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:39.286608 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:55:39.286653 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:39.286630 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:55:39.286653 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:39.286643 2574 projected.go:194] Error preparing data for projected volume kube-api-access-cckvm for pod openshift-network-diagnostics/network-check-target-j2mjp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:39.286804 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:39.286707 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm podName:0ca0f6c2-6280-464c-8916-90374e2c88b8 nodeName:}" failed. 
No retries permitted until 2026-04-20 14:55:55.286687091 +0000 UTC m=+34.359962695 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cckvm" (UniqueName: "kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm") pod "network-check-target-j2mjp" (UID: "0ca0f6c2-6280-464c-8916-90374e2c88b8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:39.442611 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:39.442531 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:39.442763 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:39.442662 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h24vc" podUID="90e2aae6-6b60-4b8e-a0ba-12474f425b1d" Apr 20 14:55:40.442643 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:40.442611 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:40.443106 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:40.442749 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j2mjp" podUID="0ca0f6c2-6280-464c-8916-90374e2c88b8" Apr 20 14:55:41.443117 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:41.442858 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:41.443940 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:41.443191 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h24vc" podUID="90e2aae6-6b60-4b8e-a0ba-12474f425b1d" Apr 20 14:55:41.553623 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:41.553596 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mj558" event={"ID":"772f88da-629b-4161-9ed5-8a916387c9bd","Type":"ContainerStarted","Data":"7bd659cdfb7864758948bda7233bec726d2b08413a28b314e463ee3c52ee4e3a"} Apr 20 14:55:41.554986 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:41.554954 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" event={"ID":"e656ecc0-7223-45e9-8f4c-15a416238cc3","Type":"ContainerStarted","Data":"8ffc79aabc795c0c8c250ff9a984eb9d0301eca3bad762cc0a4fb0920453b420"} Apr 20 14:55:41.556300 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:41.556277 2574 generic.go:358] "Generic (PLEG): container finished" podID="61708c39-4987-438d-b51f-59e8cd1a1e59" containerID="5a6c20b257b5bb0cded13dc7baabd4686700de1aa3ad3286116f14a7ff64ee8b" exitCode=0 Apr 20 14:55:41.556440 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:41.556391 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blx8n" 
event={"ID":"61708c39-4987-438d-b51f-59e8cd1a1e59","Type":"ContainerDied","Data":"5a6c20b257b5bb0cded13dc7baabd4686700de1aa3ad3286116f14a7ff64ee8b"} Apr 20 14:55:41.558015 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:41.557992 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" event={"ID":"e6562aeb-103a-4d96-b5d3-356a382186d6","Type":"ContainerStarted","Data":"3d70d5ea5af05f4fb997a2afae0f39e9fce9a06913dbe29c67a42fb8372b96b8"} Apr 20 14:55:41.559531 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:41.559508 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qxsfj" event={"ID":"10081761-39cd-4657-8ccf-94426cfd0833","Type":"ContainerStarted","Data":"854b681834256b575a134c4c6a17ce8cf4f4c8da2dfb4dde44f987e16fc3367b"} Apr 20 14:55:41.560882 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:41.560863 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-zlwvt" event={"ID":"09f1e2bc-4d9a-4838-b68f-01c2612ca3af","Type":"ContainerStarted","Data":"c65c2aa4c898f405648d9f5eb1986fc2115ef859e87c15637772d1fba5130952"} Apr 20 14:55:41.562024 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:41.562004 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" event={"ID":"3fea6c02-a196-43fc-bb1f-edc946b98e7f","Type":"ContainerStarted","Data":"75ec56c9cd9207128e17ba9482a4f232a4e4ede939dbb40affd6de581738d13f"} Apr 20 14:55:41.563731 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:41.563573 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-plvj4" event={"ID":"6658032c-bba0-4e90-8a55-840d8cdab9e3","Type":"ContainerStarted","Data":"74da2563a73fce31c67370d8d16eda6dd247aa84802d6eb12ce1f121b142e0d6"} Apr 20 14:55:41.568495 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:41.568453 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/node-resolver-mj558" podStartSLOduration=3.524002408 podStartE2EDuration="20.568439303s" podCreationTimestamp="2026-04-20 14:55:21 +0000 UTC" firstStartedPulling="2026-04-20 14:55:24.030490479 +0000 UTC m=+3.103766087" lastFinishedPulling="2026-04-20 14:55:41.074927375 +0000 UTC m=+20.148202982" observedRunningTime="2026-04-20 14:55:41.567219215 +0000 UTC m=+20.640494840" watchObservedRunningTime="2026-04-20 14:55:41.568439303 +0000 UTC m=+20.641714948" Apr 20 14:55:41.568669 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:41.568596 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-249.ec2.internal" podStartSLOduration=19.568586388 podStartE2EDuration="19.568586388s" podCreationTimestamp="2026-04-20 14:55:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:55:26.543691476 +0000 UTC m=+5.616967102" watchObservedRunningTime="2026-04-20 14:55:41.568586388 +0000 UTC m=+20.641862014" Apr 20 14:55:41.597023 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:41.596971 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-plvj4" podStartSLOduration=3.884193599 podStartE2EDuration="20.596952263s" podCreationTimestamp="2026-04-20 14:55:21 +0000 UTC" firstStartedPulling="2026-04-20 14:55:24.030454616 +0000 UTC m=+3.103730224" lastFinishedPulling="2026-04-20 14:55:40.743213275 +0000 UTC m=+19.816488888" observedRunningTime="2026-04-20 14:55:41.596297151 +0000 UTC m=+20.669572776" watchObservedRunningTime="2026-04-20 14:55:41.596952263 +0000 UTC m=+20.670227889" Apr 20 14:55:41.597181 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:41.597152 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-2xvqt" podStartSLOduration=3.885406214 
podStartE2EDuration="20.597145708s" podCreationTimestamp="2026-04-20 14:55:21 +0000 UTC" firstStartedPulling="2026-04-20 14:55:24.031465672 +0000 UTC m=+3.104741277" lastFinishedPulling="2026-04-20 14:55:40.74320515 +0000 UTC m=+19.816480771" observedRunningTime="2026-04-20 14:55:41.583434825 +0000 UTC m=+20.656710450" watchObservedRunningTime="2026-04-20 14:55:41.597145708 +0000 UTC m=+20.670421335" Apr 20 14:55:41.611836 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:41.611801 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-zlwvt" podStartSLOduration=3.55374043 podStartE2EDuration="20.61178148s" podCreationTimestamp="2026-04-20 14:55:21 +0000 UTC" firstStartedPulling="2026-04-20 14:55:24.02475867 +0000 UTC m=+3.098034287" lastFinishedPulling="2026-04-20 14:55:41.08279973 +0000 UTC m=+20.156075337" observedRunningTime="2026-04-20 14:55:41.611460105 +0000 UTC m=+20.684735729" watchObservedRunningTime="2026-04-20 14:55:41.61178148 +0000 UTC m=+20.685057149" Apr 20 14:55:41.624778 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:41.624743 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qxsfj" podStartSLOduration=3.908604942 podStartE2EDuration="20.624731131s" podCreationTimestamp="2026-04-20 14:55:21 +0000 UTC" firstStartedPulling="2026-04-20 14:55:24.027145303 +0000 UTC m=+3.100420912" lastFinishedPulling="2026-04-20 14:55:40.743271484 +0000 UTC m=+19.816547101" observedRunningTime="2026-04-20 14:55:41.624691874 +0000 UTC m=+20.697967499" watchObservedRunningTime="2026-04-20 14:55:41.624731131 +0000 UTC m=+20.698006755" Apr 20 14:55:42.340294 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:42.340139 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 14:55:42.392475 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:42.392364 2574 
reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T14:55:42.340291392Z","UUID":"172513db-5d51-4514-b637-9994e2b02abb","Handler":null,"Name":"","Endpoint":""} Apr 20 14:55:42.393886 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:42.393865 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 14:55:42.393981 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:42.393896 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 14:55:42.443040 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:42.443016 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:42.443171 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:42.443129 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j2mjp" podUID="0ca0f6c2-6280-464c-8916-90374e2c88b8" Apr 20 14:55:42.567853 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:42.567806 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" event={"ID":"e656ecc0-7223-45e9-8f4c-15a416238cc3","Type":"ContainerStarted","Data":"f8b2edcc9afd1e99ce3267c65a776e1461acb2dea4df8f8981cb6666662af13b"} Apr 20 14:55:42.572077 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:42.571979 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" event={"ID":"e6562aeb-103a-4d96-b5d3-356a382186d6","Type":"ContainerStarted","Data":"c63184f6ff9791a95c330a2dff8237fefe43d49c430bf680b36a9c5ddef240d3"} Apr 20 14:55:42.572077 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:42.572021 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" event={"ID":"e6562aeb-103a-4d96-b5d3-356a382186d6","Type":"ContainerStarted","Data":"fe326b85853f9b10ff88ebcfd018f8f176b72dff637d04aa433ec92007c50e35"} Apr 20 14:55:42.572077 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:42.572035 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" event={"ID":"e6562aeb-103a-4d96-b5d3-356a382186d6","Type":"ContainerStarted","Data":"b7dfae0fb2826e5c8186f7e36ae3d49ee4fe899bd9b9f99f33a39f0f7fe27ee5"} Apr 20 14:55:42.572077 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:42.572048 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" event={"ID":"e6562aeb-103a-4d96-b5d3-356a382186d6","Type":"ContainerStarted","Data":"7e86c79b3dcad53476bc283ab7c72020c62cf3af35c59faa93081e585f9f3650"} Apr 20 14:55:42.572077 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:42.572058 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" event={"ID":"e6562aeb-103a-4d96-b5d3-356a382186d6","Type":"ContainerStarted","Data":"1eb82ffba19eac439ee72efa71651f0694aecede8dee06e79bd2720113519ce5"} Apr 20 14:55:43.443107 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:43.443079 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:43.443242 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:43.443213 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h24vc" podUID="90e2aae6-6b60-4b8e-a0ba-12474f425b1d" Apr 20 14:55:43.576002 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:43.575972 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" event={"ID":"e656ecc0-7223-45e9-8f4c-15a416238cc3","Type":"ContainerStarted","Data":"9e01d22bc50242802f66f664ab513f39ad03d886f9dba0587af355410f47b5f3"} Apr 20 14:55:43.577315 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:43.577282 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xm99w" event={"ID":"ea32ceac-045d-412e-95db-ec7a62502246","Type":"ContainerStarted","Data":"24b056af3f73e79ef3c9ac0c0444d623abbbff7b54e9f4e4101f17a666190abb"} Apr 20 14:55:43.594796 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:43.594750 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-sttsv" podStartSLOduration=3.2227557239999998 podStartE2EDuration="22.594736869s" podCreationTimestamp="2026-04-20 14:55:21 +0000 UTC" firstStartedPulling="2026-04-20 14:55:24.028871948 +0000 
UTC m=+3.102147552" lastFinishedPulling="2026-04-20 14:55:43.400853078 +0000 UTC m=+22.474128697" observedRunningTime="2026-04-20 14:55:43.594236423 +0000 UTC m=+22.667512048" watchObservedRunningTime="2026-04-20 14:55:43.594736869 +0000 UTC m=+22.668012493" Apr 20 14:55:43.608067 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:43.608027 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xm99w" podStartSLOduration=5.554687911 podStartE2EDuration="22.608018067s" podCreationTimestamp="2026-04-20 14:55:21 +0000 UTC" firstStartedPulling="2026-04-20 14:55:24.023467961 +0000 UTC m=+3.096743567" lastFinishedPulling="2026-04-20 14:55:41.076798105 +0000 UTC m=+20.150073723" observedRunningTime="2026-04-20 14:55:43.607751784 +0000 UTC m=+22.681027405" watchObservedRunningTime="2026-04-20 14:55:43.608018067 +0000 UTC m=+22.681293694" Apr 20 14:55:44.443148 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:44.443114 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:44.443326 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:44.443216 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j2mjp" podUID="0ca0f6c2-6280-464c-8916-90374e2c88b8" Apr 20 14:55:44.582311 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:44.582271 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" event={"ID":"e6562aeb-103a-4d96-b5d3-356a382186d6","Type":"ContainerStarted","Data":"38c9b59e58a840046a1be42947a4eae9b1515e317f9773c165e52c651dbdc0af"} Apr 20 14:55:45.443083 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:45.443048 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:45.443241 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:45.443199 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h24vc" podUID="90e2aae6-6b60-4b8e-a0ba-12474f425b1d" Apr 20 14:55:45.917230 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:45.917207 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-plvj4" Apr 20 14:55:45.917986 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:45.917969 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-plvj4" Apr 20 14:55:46.442537 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:46.442473 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:46.442658 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:46.442555 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j2mjp" podUID="0ca0f6c2-6280-464c-8916-90374e2c88b8" Apr 20 14:55:46.587069 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:46.587037 2574 generic.go:358] "Generic (PLEG): container finished" podID="61708c39-4987-438d-b51f-59e8cd1a1e59" containerID="0de1dcce616b799553b3da6954bc71660991f8dab3664e83191189f281b6acbc" exitCode=0 Apr 20 14:55:46.587231 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:46.587109 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blx8n" event={"ID":"61708c39-4987-438d-b51f-59e8cd1a1e59","Type":"ContainerDied","Data":"0de1dcce616b799553b3da6954bc71660991f8dab3664e83191189f281b6acbc"} Apr 20 14:55:46.592543 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:46.592123 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" event={"ID":"e6562aeb-103a-4d96-b5d3-356a382186d6","Type":"ContainerStarted","Data":"70780b020613adfc1ddc0945b2389862460bd267d7814ea8a3c5b56261bf098d"} Apr 20 14:55:46.592543 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:46.592159 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-plvj4" Apr 20 14:55:46.592543 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:46.592176 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:46.592543 ip-10-0-130-249 
kubenswrapper[2574]: I0420 14:55:46.592188 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:46.592543 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:46.592303 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-plvj4" Apr 20 14:55:46.592543 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:46.592329 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:46.607452 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:46.607433 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:46.607525 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:46.607500 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:55:46.648172 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:46.648135 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" podStartSLOduration=8.275813479 podStartE2EDuration="25.648123775s" podCreationTimestamp="2026-04-20 14:55:21 +0000 UTC" firstStartedPulling="2026-04-20 14:55:24.021709425 +0000 UTC m=+3.094985030" lastFinishedPulling="2026-04-20 14:55:41.394019711 +0000 UTC m=+20.467295326" observedRunningTime="2026-04-20 14:55:46.647830195 +0000 UTC m=+25.721105821" watchObservedRunningTime="2026-04-20 14:55:46.648123775 +0000 UTC m=+25.721399398" Apr 20 14:55:47.442355 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:47.442320 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:47.442787 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:47.442440 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h24vc" podUID="90e2aae6-6b60-4b8e-a0ba-12474f425b1d" Apr 20 14:55:47.594883 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:47.594643 2574 generic.go:358] "Generic (PLEG): container finished" podID="61708c39-4987-438d-b51f-59e8cd1a1e59" containerID="736b148e45a624d9156fc78e7b8c9f48c5361896dbce4d7f1dedcb6c160618f2" exitCode=0 Apr 20 14:55:47.594883 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:47.594718 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blx8n" event={"ID":"61708c39-4987-438d-b51f-59e8cd1a1e59","Type":"ContainerDied","Data":"736b148e45a624d9156fc78e7b8c9f48c5361896dbce4d7f1dedcb6c160618f2"} Apr 20 14:55:47.898567 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:47.898491 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h24vc"] Apr 20 14:55:47.898735 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:47.898608 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:47.898797 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:47.898729 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h24vc" podUID="90e2aae6-6b60-4b8e-a0ba-12474f425b1d" Apr 20 14:55:47.901511 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:47.901485 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-j2mjp"] Apr 20 14:55:47.901611 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:47.901600 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:47.901720 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:47.901695 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j2mjp" podUID="0ca0f6c2-6280-464c-8916-90374e2c88b8" Apr 20 14:55:48.598774 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:48.598691 2574 generic.go:358] "Generic (PLEG): container finished" podID="61708c39-4987-438d-b51f-59e8cd1a1e59" containerID="cca9caf9e484175a901149ef0426a77a04ba09fa43948141c5f33160d1fb68f7" exitCode=0 Apr 20 14:55:48.599216 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:48.598775 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blx8n" event={"ID":"61708c39-4987-438d-b51f-59e8cd1a1e59","Type":"ContainerDied","Data":"cca9caf9e484175a901149ef0426a77a04ba09fa43948141c5f33160d1fb68f7"} Apr 20 14:55:49.442974 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:49.442944 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:49.442974 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:49.442968 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:49.443222 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:49.443060 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-j2mjp" podUID="0ca0f6c2-6280-464c-8916-90374e2c88b8" Apr 20 14:55:49.443222 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:49.443195 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h24vc" podUID="90e2aae6-6b60-4b8e-a0ba-12474f425b1d" Apr 20 14:55:51.443601 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:51.443569 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:51.444266 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:51.443657 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:51.444266 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:51.443687 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j2mjp" podUID="0ca0f6c2-6280-464c-8916-90374e2c88b8" Apr 20 14:55:51.444266 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:51.443740 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h24vc" podUID="90e2aae6-6b60-4b8e-a0ba-12474f425b1d" Apr 20 14:55:53.442241 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.442207 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:53.442712 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.442226 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:53.442712 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:53.442347 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h24vc" podUID="90e2aae6-6b60-4b8e-a0ba-12474f425b1d" Apr 20 14:55:53.442712 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:53.442391 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-j2mjp" podUID="0ca0f6c2-6280-464c-8916-90374e2c88b8" Apr 20 14:55:53.796175 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.796086 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-249.ec2.internal" event="NodeReady" Apr 20 14:55:53.796344 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.796226 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 14:55:53.838300 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.838269 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pq8qx"] Apr 20 14:55:53.867990 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.867958 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-d8wts"] Apr 20 14:55:53.868139 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.868097 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pq8qx" Apr 20 14:55:53.870558 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.870529 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tr5vz\"" Apr 20 14:55:53.870696 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.870534 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 14:55:53.870696 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.870604 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 14:55:53.880944 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.880914 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pq8qx"] Apr 20 14:55:53.880944 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.880939 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-canary/ingress-canary-d8wts"] Apr 20 14:55:53.881108 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.881031 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-d8wts" Apr 20 14:55:53.883528 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.883500 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 14:55:53.883650 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.883635 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xg95b\"" Apr 20 14:55:53.883723 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.883650 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 14:55:53.883723 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.883701 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 14:55:53.898169 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.898146 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a001809b-d266-4d12-b9a2-d400942f2755-cert\") pod \"ingress-canary-d8wts\" (UID: \"a001809b-d266-4d12-b9a2-d400942f2755\") " pod="openshift-ingress-canary/ingress-canary-d8wts" Apr 20 14:55:53.898273 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.898212 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/057f0667-15cc-4883-a91d-c360de54e58f-tmp-dir\") pod \"dns-default-pq8qx\" (UID: \"057f0667-15cc-4883-a91d-c360de54e58f\") " pod="openshift-dns/dns-default-pq8qx" Apr 20 14:55:53.898273 ip-10-0-130-249 kubenswrapper[2574]: I0420 
14:55:53.898232 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/057f0667-15cc-4883-a91d-c360de54e58f-metrics-tls\") pod \"dns-default-pq8qx\" (UID: \"057f0667-15cc-4883-a91d-c360de54e58f\") " pod="openshift-dns/dns-default-pq8qx" Apr 20 14:55:53.898359 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.898290 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99fz7\" (UniqueName: \"kubernetes.io/projected/a001809b-d266-4d12-b9a2-d400942f2755-kube-api-access-99fz7\") pod \"ingress-canary-d8wts\" (UID: \"a001809b-d266-4d12-b9a2-d400942f2755\") " pod="openshift-ingress-canary/ingress-canary-d8wts" Apr 20 14:55:53.898359 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.898332 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/057f0667-15cc-4883-a91d-c360de54e58f-config-volume\") pod \"dns-default-pq8qx\" (UID: \"057f0667-15cc-4883-a91d-c360de54e58f\") " pod="openshift-dns/dns-default-pq8qx" Apr 20 14:55:53.898359 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.898351 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz2cq\" (UniqueName: \"kubernetes.io/projected/057f0667-15cc-4883-a91d-c360de54e58f-kube-api-access-tz2cq\") pod \"dns-default-pq8qx\" (UID: \"057f0667-15cc-4883-a91d-c360de54e58f\") " pod="openshift-dns/dns-default-pq8qx" Apr 20 14:55:53.999544 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.999511 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99fz7\" (UniqueName: \"kubernetes.io/projected/a001809b-d266-4d12-b9a2-d400942f2755-kube-api-access-99fz7\") pod \"ingress-canary-d8wts\" (UID: \"a001809b-d266-4d12-b9a2-d400942f2755\") " 
pod="openshift-ingress-canary/ingress-canary-d8wts" Apr 20 14:55:53.999689 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.999559 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/057f0667-15cc-4883-a91d-c360de54e58f-config-volume\") pod \"dns-default-pq8qx\" (UID: \"057f0667-15cc-4883-a91d-c360de54e58f\") " pod="openshift-dns/dns-default-pq8qx" Apr 20 14:55:53.999689 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.999577 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tz2cq\" (UniqueName: \"kubernetes.io/projected/057f0667-15cc-4883-a91d-c360de54e58f-kube-api-access-tz2cq\") pod \"dns-default-pq8qx\" (UID: \"057f0667-15cc-4883-a91d-c360de54e58f\") " pod="openshift-dns/dns-default-pq8qx" Apr 20 14:55:53.999689 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.999633 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a001809b-d266-4d12-b9a2-d400942f2755-cert\") pod \"ingress-canary-d8wts\" (UID: \"a001809b-d266-4d12-b9a2-d400942f2755\") " pod="openshift-ingress-canary/ingress-canary-d8wts" Apr 20 14:55:53.999885 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.999705 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/057f0667-15cc-4883-a91d-c360de54e58f-tmp-dir\") pod \"dns-default-pq8qx\" (UID: \"057f0667-15cc-4883-a91d-c360de54e58f\") " pod="openshift-dns/dns-default-pq8qx" Apr 20 14:55:53.999885 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:53.999737 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/057f0667-15cc-4883-a91d-c360de54e58f-metrics-tls\") pod \"dns-default-pq8qx\" (UID: \"057f0667-15cc-4883-a91d-c360de54e58f\") " pod="openshift-dns/dns-default-pq8qx" Apr 20 
14:55:53.999885 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:53.999842 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:55:54.000030 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:53.999927 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a001809b-d266-4d12-b9a2-d400942f2755-cert podName:a001809b-d266-4d12-b9a2-d400942f2755 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:54.499891895 +0000 UTC m=+33.573167502 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a001809b-d266-4d12-b9a2-d400942f2755-cert") pod "ingress-canary-d8wts" (UID: "a001809b-d266-4d12-b9a2-d400942f2755") : secret "canary-serving-cert" not found Apr 20 14:55:54.000030 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:53.999843 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:55:54.000030 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:53.999982 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057f0667-15cc-4883-a91d-c360de54e58f-metrics-tls podName:057f0667-15cc-4883-a91d-c360de54e58f nodeName:}" failed. No retries permitted until 2026-04-20 14:55:54.499970496 +0000 UTC m=+33.573246098 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/057f0667-15cc-4883-a91d-c360de54e58f-metrics-tls") pod "dns-default-pq8qx" (UID: "057f0667-15cc-4883-a91d-c360de54e58f") : secret "dns-default-metrics-tls" not found Apr 20 14:55:54.000179 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:54.000040 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/057f0667-15cc-4883-a91d-c360de54e58f-tmp-dir\") pod \"dns-default-pq8qx\" (UID: \"057f0667-15cc-4883-a91d-c360de54e58f\") " pod="openshift-dns/dns-default-pq8qx" Apr 20 14:55:54.000179 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:54.000143 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/057f0667-15cc-4883-a91d-c360de54e58f-config-volume\") pod \"dns-default-pq8qx\" (UID: \"057f0667-15cc-4883-a91d-c360de54e58f\") " pod="openshift-dns/dns-default-pq8qx" Apr 20 14:55:54.009897 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:54.009873 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz2cq\" (UniqueName: \"kubernetes.io/projected/057f0667-15cc-4883-a91d-c360de54e58f-kube-api-access-tz2cq\") pod \"dns-default-pq8qx\" (UID: \"057f0667-15cc-4883-a91d-c360de54e58f\") " pod="openshift-dns/dns-default-pq8qx" Apr 20 14:55:54.010021 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:54.009968 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99fz7\" (UniqueName: \"kubernetes.io/projected/a001809b-d266-4d12-b9a2-d400942f2755-kube-api-access-99fz7\") pod \"ingress-canary-d8wts\" (UID: \"a001809b-d266-4d12-b9a2-d400942f2755\") " pod="openshift-ingress-canary/ingress-canary-d8wts" Apr 20 14:55:54.504120 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:54.504098 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/057f0667-15cc-4883-a91d-c360de54e58f-metrics-tls\") pod \"dns-default-pq8qx\" (UID: \"057f0667-15cc-4883-a91d-c360de54e58f\") " pod="openshift-dns/dns-default-pq8qx" Apr 20 14:55:54.504458 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:54.504156 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a001809b-d266-4d12-b9a2-d400942f2755-cert\") pod \"ingress-canary-d8wts\" (UID: \"a001809b-d266-4d12-b9a2-d400942f2755\") " pod="openshift-ingress-canary/ingress-canary-d8wts" Apr 20 14:55:54.504458 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:54.504259 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:55:54.504458 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:54.504273 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:55:54.504458 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:54.504313 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057f0667-15cc-4883-a91d-c360de54e58f-metrics-tls podName:057f0667-15cc-4883-a91d-c360de54e58f nodeName:}" failed. No retries permitted until 2026-04-20 14:55:55.504298414 +0000 UTC m=+34.577574016 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/057f0667-15cc-4883-a91d-c360de54e58f-metrics-tls") pod "dns-default-pq8qx" (UID: "057f0667-15cc-4883-a91d-c360de54e58f") : secret "dns-default-metrics-tls" not found Apr 20 14:55:54.504458 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:54.504326 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a001809b-d266-4d12-b9a2-d400942f2755-cert podName:a001809b-d266-4d12-b9a2-d400942f2755 nodeName:}" failed. 
No retries permitted until 2026-04-20 14:55:55.504320671 +0000 UTC m=+34.577596273 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a001809b-d266-4d12-b9a2-d400942f2755-cert") pod "ingress-canary-d8wts" (UID: "a001809b-d266-4d12-b9a2-d400942f2755") : secret "canary-serving-cert" not found Apr 20 14:55:54.611948 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:54.611915 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blx8n" event={"ID":"61708c39-4987-438d-b51f-59e8cd1a1e59","Type":"ContainerStarted","Data":"651fa68c6891d0b3da5f4b88f75a8624f9ba5552a140866502156f26e2392499"} Apr 20 14:55:55.107948 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:55.107917 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs\") pod \"network-metrics-daemon-h24vc\" (UID: \"90e2aae6-6b60-4b8e-a0ba-12474f425b1d\") " pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:55.108256 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:55.108077 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:55.108256 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:55.108167 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs podName:90e2aae6-6b60-4b8e-a0ba-12474f425b1d nodeName:}" failed. No retries permitted until 2026-04-20 14:56:27.108146355 +0000 UTC m=+66.181421977 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs") pod "network-metrics-daemon-h24vc" (UID: "90e2aae6-6b60-4b8e-a0ba-12474f425b1d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:55.309435 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:55.309407 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cckvm\" (UniqueName: \"kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm\") pod \"network-check-target-j2mjp\" (UID: \"0ca0f6c2-6280-464c-8916-90374e2c88b8\") " pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:55.309575 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:55.309561 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:55:55.309616 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:55.309583 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:55:55.309616 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:55.309593 2574 projected.go:194] Error preparing data for projected volume kube-api-access-cckvm for pod openshift-network-diagnostics/network-check-target-j2mjp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:55.309678 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:55.309641 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm podName:0ca0f6c2-6280-464c-8916-90374e2c88b8 nodeName:}" failed. 
No retries permitted until 2026-04-20 14:56:27.309626442 +0000 UTC m=+66.382902045 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cckvm" (UniqueName: "kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm") pod "network-check-target-j2mjp" (UID: "0ca0f6c2-6280-464c-8916-90374e2c88b8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:55.442517 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:55.442449 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:55:55.442637 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:55.442450 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:55:55.446682 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:55.446554 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6rqtq\"" Apr 20 14:55:55.446840 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:55.446822 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 14:55:55.448127 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:55.447523 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 14:55:55.448127 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:55.447580 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 14:55:55.448127 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:55.447891 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6x78v\"" Apr 20 14:55:55.511323 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:55.511297 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/057f0667-15cc-4883-a91d-c360de54e58f-metrics-tls\") pod \"dns-default-pq8qx\" (UID: \"057f0667-15cc-4883-a91d-c360de54e58f\") " pod="openshift-dns/dns-default-pq8qx" Apr 20 14:55:55.511596 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:55.511354 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a001809b-d266-4d12-b9a2-d400942f2755-cert\") pod \"ingress-canary-d8wts\" (UID: \"a001809b-d266-4d12-b9a2-d400942f2755\") " pod="openshift-ingress-canary/ingress-canary-d8wts" Apr 20 14:55:55.511596 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:55.511453 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:55:55.511596 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:55.511506 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a001809b-d266-4d12-b9a2-d400942f2755-cert podName:a001809b-d266-4d12-b9a2-d400942f2755 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:57.511489922 +0000 UTC m=+36.584765528 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a001809b-d266-4d12-b9a2-d400942f2755-cert") pod "ingress-canary-d8wts" (UID: "a001809b-d266-4d12-b9a2-d400942f2755") : secret "canary-serving-cert" not found Apr 20 14:55:55.511596 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:55.511509 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:55:55.511596 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:55.511555 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057f0667-15cc-4883-a91d-c360de54e58f-metrics-tls podName:057f0667-15cc-4883-a91d-c360de54e58f nodeName:}" failed. No retries permitted until 2026-04-20 14:55:57.511539105 +0000 UTC m=+36.584814710 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/057f0667-15cc-4883-a91d-c360de54e58f-metrics-tls") pod "dns-default-pq8qx" (UID: "057f0667-15cc-4883-a91d-c360de54e58f") : secret "dns-default-metrics-tls" not found Apr 20 14:55:55.615997 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:55.615971 2574 generic.go:358] "Generic (PLEG): container finished" podID="61708c39-4987-438d-b51f-59e8cd1a1e59" containerID="651fa68c6891d0b3da5f4b88f75a8624f9ba5552a140866502156f26e2392499" exitCode=0 Apr 20 14:55:55.616079 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:55.616028 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blx8n" event={"ID":"61708c39-4987-438d-b51f-59e8cd1a1e59","Type":"ContainerDied","Data":"651fa68c6891d0b3da5f4b88f75a8624f9ba5552a140866502156f26e2392499"} Apr 20 14:55:56.620476 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.620444 2574 generic.go:358] "Generic (PLEG): container finished" podID="61708c39-4987-438d-b51f-59e8cd1a1e59" containerID="00c8faa6cedd688abacad6aea8b64f84b60370c4ec8dcb2cefecd1948032baaa" exitCode=0 Apr 20 
14:55:56.620880 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.620512 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blx8n" event={"ID":"61708c39-4987-438d-b51f-59e8cd1a1e59","Type":"ContainerDied","Data":"00c8faa6cedd688abacad6aea8b64f84b60370c4ec8dcb2cefecd1948032baaa"} Apr 20 14:55:56.928238 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.928169 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4h869"] Apr 20 14:55:56.947218 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.946642 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4z848"] Apr 20 14:55:56.947218 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.946921 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4h869" Apr 20 14:55:56.949483 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.949460 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-2scz9\"" Apr 20 14:55:56.962496 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.962471 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv"] Apr 20 14:55:56.962633 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.962607 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-4z848" Apr 20 14:55:56.965255 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.965235 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 20 14:55:56.965627 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.965471 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 20 14:55:56.965627 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.965477 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 20 14:55:56.965627 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.965509 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:55:56.965627 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.965570 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-rc282\"" Apr 20 14:55:56.971494 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.971473 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 20 14:55:56.974952 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.974934 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-xc2s2"] Apr 20 14:55:56.975090 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.975076 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv" Apr 20 14:55:56.977478 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.977458 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 20 14:55:56.977568 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.977479 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 20 14:55:56.977568 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.977522 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:55:56.977568 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.977551 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-fn54p\"" Apr 20 14:55:56.987627 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.987608 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4h869"] Apr 20 14:55:56.987627 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.987617 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:56.987761 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.987640 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-4z848"] Apr 20 14:55:56.987761 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.987656 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv"] Apr 20 14:55:56.987761 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.987669 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-xc2s2"] Apr 20 14:55:56.990085 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.990067 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 14:55:56.990085 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.990081 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 20 14:55:56.990245 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.990134 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 20 14:55:56.990245 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.990147 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 14:55:56.990245 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.990137 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-b49p7\"" Apr 20 14:55:56.994640 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:56.994617 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 20 
14:55:57.022081 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.022058 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8d0f9644-dc24-4c95-b57e-562d7aec32f9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-88xfv\" (UID: \"8d0f9644-dc24-4c95-b57e-562d7aec32f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv" Apr 20 14:55:57.022190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.022092 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54d05c4-b074-4189-b1dd-7ff476b824ec-config\") pod \"console-operator-9d4b6777b-4z848\" (UID: \"d54d05c4-b074-4189-b1dd-7ff476b824ec\") " pod="openshift-console-operator/console-operator-9d4b6777b-4z848" Apr 20 14:55:57.022190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.022115 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d54d05c4-b074-4189-b1dd-7ff476b824ec-trusted-ca\") pod \"console-operator-9d4b6777b-4z848\" (UID: \"d54d05c4-b074-4189-b1dd-7ff476b824ec\") " pod="openshift-console-operator/console-operator-9d4b6777b-4z848" Apr 20 14:55:57.022190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.022138 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rtnv\" (UniqueName: \"kubernetes.io/projected/d54d05c4-b074-4189-b1dd-7ff476b824ec-kube-api-access-9rtnv\") pod \"console-operator-9d4b6777b-4z848\" (UID: \"d54d05c4-b074-4189-b1dd-7ff476b824ec\") " pod="openshift-console-operator/console-operator-9d4b6777b-4z848" Apr 20 14:55:57.022292 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.022191 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-5rb2b\" (UniqueName: \"kubernetes.io/projected/1b93bf46-c126-4ef5-9add-d72c0cbb7dae-kube-api-access-5rb2b\") pod \"insights-operator-585dfdc468-xc2s2\" (UID: \"1b93bf46-c126-4ef5-9add-d72c0cbb7dae\") " pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:57.022292 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.022248 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b93bf46-c126-4ef5-9add-d72c0cbb7dae-service-ca-bundle\") pod \"insights-operator-585dfdc468-xc2s2\" (UID: \"1b93bf46-c126-4ef5-9add-d72c0cbb7dae\") " pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:57.022354 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.022296 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b93bf46-c126-4ef5-9add-d72c0cbb7dae-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-xc2s2\" (UID: \"1b93bf46-c126-4ef5-9add-d72c0cbb7dae\") " pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:57.022354 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.022320 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1b93bf46-c126-4ef5-9add-d72c0cbb7dae-snapshots\") pod \"insights-operator-585dfdc468-xc2s2\" (UID: \"1b93bf46-c126-4ef5-9add-d72c0cbb7dae\") " pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:57.022439 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.022361 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b93bf46-c126-4ef5-9add-d72c0cbb7dae-tmp\") pod \"insights-operator-585dfdc468-xc2s2\" (UID: 
\"1b93bf46-c126-4ef5-9add-d72c0cbb7dae\") " pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:57.022439 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.022405 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d54d05c4-b074-4189-b1dd-7ff476b824ec-serving-cert\") pod \"console-operator-9d4b6777b-4z848\" (UID: \"d54d05c4-b074-4189-b1dd-7ff476b824ec\") " pod="openshift-console-operator/console-operator-9d4b6777b-4z848" Apr 20 14:55:57.022498 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.022435 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp97b\" (UniqueName: \"kubernetes.io/projected/0ac42d0e-8ffe-4cc9-866d-cb7075ee1fde-kube-api-access-sp97b\") pod \"network-check-source-8894fc9bd-4h869\" (UID: \"0ac42d0e-8ffe-4cc9-866d-cb7075ee1fde\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4h869" Apr 20 14:55:57.022533 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.022495 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzkwm\" (UniqueName: \"kubernetes.io/projected/8d0f9644-dc24-4c95-b57e-562d7aec32f9-kube-api-access-rzkwm\") pod \"cluster-samples-operator-6dc5bdb6b4-88xfv\" (UID: \"8d0f9644-dc24-4c95-b57e-562d7aec32f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv" Apr 20 14:55:57.022565 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.022522 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b93bf46-c126-4ef5-9add-d72c0cbb7dae-serving-cert\") pod \"insights-operator-585dfdc468-xc2s2\" (UID: \"1b93bf46-c126-4ef5-9add-d72c0cbb7dae\") " pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:57.030267 ip-10-0-130-249 
kubenswrapper[2574]: I0420 14:55:57.030247 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr"] Apr 20 14:55:57.043683 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.043631 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mv44d"] Apr 20 14:55:57.043836 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.043815 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr" Apr 20 14:55:57.046224 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.046114 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 20 14:55:57.046506 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.046454 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 14:55:57.047458 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.047438 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-nsgjj\"" Apr 20 14:55:57.047458 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.047452 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 20 14:55:57.047598 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.047439 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 14:55:57.051649 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.051632 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr"] Apr 20 14:55:57.051750 ip-10-0-130-249 
kubenswrapper[2574]: I0420 14:55:57.051658 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mv44d"] Apr 20 14:55:57.051816 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.051761 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mv44d" Apr 20 14:55:57.054398 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.054357 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 20 14:55:57.054491 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.054418 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 20 14:55:57.054491 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.054431 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 20 14:55:57.054714 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.054699 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:55:57.054794 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.054746 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-hg6vr\"" Apr 20 14:55:57.123166 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.123141 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rb2b\" (UniqueName: \"kubernetes.io/projected/1b93bf46-c126-4ef5-9add-d72c0cbb7dae-kube-api-access-5rb2b\") pod 
\"insights-operator-585dfdc468-xc2s2\" (UID: \"1b93bf46-c126-4ef5-9add-d72c0cbb7dae\") " pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:57.123283 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.123183 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b93bf46-c126-4ef5-9add-d72c0cbb7dae-service-ca-bundle\") pod \"insights-operator-585dfdc468-xc2s2\" (UID: \"1b93bf46-c126-4ef5-9add-d72c0cbb7dae\") " pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:57.123283 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.123212 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22585\" (UniqueName: \"kubernetes.io/projected/332fde80-3942-477a-918e-84086221c09b-kube-api-access-22585\") pod \"kube-storage-version-migrator-operator-6769c5d45-mv44d\" (UID: \"332fde80-3942-477a-918e-84086221c09b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mv44d" Apr 20 14:55:57.123283 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.123242 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b93bf46-c126-4ef5-9add-d72c0cbb7dae-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-xc2s2\" (UID: \"1b93bf46-c126-4ef5-9add-d72c0cbb7dae\") " pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:57.123283 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.123267 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-2whnr\" (UID: \"27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr" Apr 20 14:55:57.123526 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.123295 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d99f\" (UniqueName: \"kubernetes.io/projected/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-kube-api-access-7d99f\") pod \"cluster-monitoring-operator-75587bd455-2whnr\" (UID: \"27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr" Apr 20 14:55:57.123526 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.123322 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1b93bf46-c126-4ef5-9add-d72c0cbb7dae-snapshots\") pod \"insights-operator-585dfdc468-xc2s2\" (UID: \"1b93bf46-c126-4ef5-9add-d72c0cbb7dae\") " pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:57.123526 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.123401 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b93bf46-c126-4ef5-9add-d72c0cbb7dae-tmp\") pod \"insights-operator-585dfdc468-xc2s2\" (UID: \"1b93bf46-c126-4ef5-9add-d72c0cbb7dae\") " pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:57.123526 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.123438 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d54d05c4-b074-4189-b1dd-7ff476b824ec-serving-cert\") pod \"console-operator-9d4b6777b-4z848\" (UID: \"d54d05c4-b074-4189-b1dd-7ff476b824ec\") " pod="openshift-console-operator/console-operator-9d4b6777b-4z848" Apr 20 14:55:57.123526 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.123479 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sp97b\" 
(UniqueName: \"kubernetes.io/projected/0ac42d0e-8ffe-4cc9-866d-cb7075ee1fde-kube-api-access-sp97b\") pod \"network-check-source-8894fc9bd-4h869\" (UID: \"0ac42d0e-8ffe-4cc9-866d-cb7075ee1fde\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4h869" Apr 20 14:55:57.123526 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.123508 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/332fde80-3942-477a-918e-84086221c09b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-mv44d\" (UID: \"332fde80-3942-477a-918e-84086221c09b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mv44d" Apr 20 14:55:57.123785 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.123536 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2whnr\" (UID: \"27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr" Apr 20 14:55:57.123785 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.123578 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzkwm\" (UniqueName: \"kubernetes.io/projected/8d0f9644-dc24-4c95-b57e-562d7aec32f9-kube-api-access-rzkwm\") pod \"cluster-samples-operator-6dc5bdb6b4-88xfv\" (UID: \"8d0f9644-dc24-4c95-b57e-562d7aec32f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv" Apr 20 14:55:57.123785 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.123716 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1b93bf46-c126-4ef5-9add-d72c0cbb7dae-serving-cert\") pod \"insights-operator-585dfdc468-xc2s2\" (UID: \"1b93bf46-c126-4ef5-9add-d72c0cbb7dae\") " pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:57.123785 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.123761 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/332fde80-3942-477a-918e-84086221c09b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-mv44d\" (UID: \"332fde80-3942-477a-918e-84086221c09b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mv44d" Apr 20 14:55:57.123965 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.123803 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8d0f9644-dc24-4c95-b57e-562d7aec32f9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-88xfv\" (UID: \"8d0f9644-dc24-4c95-b57e-562d7aec32f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv" Apr 20 14:55:57.123965 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.123800 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b93bf46-c126-4ef5-9add-d72c0cbb7dae-service-ca-bundle\") pod \"insights-operator-585dfdc468-xc2s2\" (UID: \"1b93bf46-c126-4ef5-9add-d72c0cbb7dae\") " pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:57.123965 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:57.123899 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 14:55:57.123965 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.123933 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54d05c4-b074-4189-b1dd-7ff476b824ec-config\") pod \"console-operator-9d4b6777b-4z848\" (UID: \"d54d05c4-b074-4189-b1dd-7ff476b824ec\") " pod="openshift-console-operator/console-operator-9d4b6777b-4z848" Apr 20 14:55:57.123965 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:57.123955 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d0f9644-dc24-4c95-b57e-562d7aec32f9-samples-operator-tls podName:8d0f9644-dc24-4c95-b57e-562d7aec32f9 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:57.623937826 +0000 UTC m=+36.697213430 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8d0f9644-dc24-4c95-b57e-562d7aec32f9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-88xfv" (UID: "8d0f9644-dc24-4c95-b57e-562d7aec32f9") : secret "samples-operator-tls" not found Apr 20 14:55:57.123965 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.123961 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1b93bf46-c126-4ef5-9add-d72c0cbb7dae-tmp\") pod \"insights-operator-585dfdc468-xc2s2\" (UID: \"1b93bf46-c126-4ef5-9add-d72c0cbb7dae\") " pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:57.124259 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.124012 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d54d05c4-b074-4189-b1dd-7ff476b824ec-trusted-ca\") pod \"console-operator-9d4b6777b-4z848\" (UID: \"d54d05c4-b074-4189-b1dd-7ff476b824ec\") " pod="openshift-console-operator/console-operator-9d4b6777b-4z848" Apr 20 14:55:57.124259 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.124041 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rtnv\" (UniqueName: 
\"kubernetes.io/projected/d54d05c4-b074-4189-b1dd-7ff476b824ec-kube-api-access-9rtnv\") pod \"console-operator-9d4b6777b-4z848\" (UID: \"d54d05c4-b074-4189-b1dd-7ff476b824ec\") " pod="openshift-console-operator/console-operator-9d4b6777b-4z848" Apr 20 14:55:57.124259 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.124062 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1b93bf46-c126-4ef5-9add-d72c0cbb7dae-snapshots\") pod \"insights-operator-585dfdc468-xc2s2\" (UID: \"1b93bf46-c126-4ef5-9add-d72c0cbb7dae\") " pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:57.124259 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.124236 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b93bf46-c126-4ef5-9add-d72c0cbb7dae-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-xc2s2\" (UID: \"1b93bf46-c126-4ef5-9add-d72c0cbb7dae\") " pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:57.124523 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.124504 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d54d05c4-b074-4189-b1dd-7ff476b824ec-config\") pod \"console-operator-9d4b6777b-4z848\" (UID: \"d54d05c4-b074-4189-b1dd-7ff476b824ec\") " pod="openshift-console-operator/console-operator-9d4b6777b-4z848" Apr 20 14:55:57.124829 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.124806 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d54d05c4-b074-4189-b1dd-7ff476b824ec-trusted-ca\") pod \"console-operator-9d4b6777b-4z848\" (UID: \"d54d05c4-b074-4189-b1dd-7ff476b824ec\") " pod="openshift-console-operator/console-operator-9d4b6777b-4z848" Apr 20 14:55:57.126740 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.126723 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d54d05c4-b074-4189-b1dd-7ff476b824ec-serving-cert\") pod \"console-operator-9d4b6777b-4z848\" (UID: \"d54d05c4-b074-4189-b1dd-7ff476b824ec\") " pod="openshift-console-operator/console-operator-9d4b6777b-4z848" Apr 20 14:55:57.126818 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.126791 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b93bf46-c126-4ef5-9add-d72c0cbb7dae-serving-cert\") pod \"insights-operator-585dfdc468-xc2s2\" (UID: \"1b93bf46-c126-4ef5-9add-d72c0cbb7dae\") " pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:57.131970 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.131859 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-l7992"] Apr 20 14:55:57.138500 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.138474 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp97b\" (UniqueName: \"kubernetes.io/projected/0ac42d0e-8ffe-4cc9-866d-cb7075ee1fde-kube-api-access-sp97b\") pod \"network-check-source-8894fc9bd-4h869\" (UID: \"0ac42d0e-8ffe-4cc9-866d-cb7075ee1fde\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4h869" Apr 20 14:55:57.138628 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.138610 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzkwm\" (UniqueName: \"kubernetes.io/projected/8d0f9644-dc24-4c95-b57e-562d7aec32f9-kube-api-access-rzkwm\") pod \"cluster-samples-operator-6dc5bdb6b4-88xfv\" (UID: \"8d0f9644-dc24-4c95-b57e-562d7aec32f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv" Apr 20 14:55:57.139066 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.139043 2574 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9rtnv\" (UniqueName: \"kubernetes.io/projected/d54d05c4-b074-4189-b1dd-7ff476b824ec-kube-api-access-9rtnv\") pod \"console-operator-9d4b6777b-4z848\" (UID: \"d54d05c4-b074-4189-b1dd-7ff476b824ec\") " pod="openshift-console-operator/console-operator-9d4b6777b-4z848" Apr 20 14:55:57.145077 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.145056 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rb2b\" (UniqueName: \"kubernetes.io/projected/1b93bf46-c126-4ef5-9add-d72c0cbb7dae-kube-api-access-5rb2b\") pod \"insights-operator-585dfdc468-xc2s2\" (UID: \"1b93bf46-c126-4ef5-9add-d72c0cbb7dae\") " pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:57.154882 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.154860 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-l7992"] Apr 20 14:55:57.154959 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.154949 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-l7992" Apr 20 14:55:57.157462 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.157424 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 20 14:55:57.157554 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.157471 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-2ljbw\"" Apr 20 14:55:57.157554 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.157477 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 20 14:55:57.225143 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.224960 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-22585\" (UniqueName: \"kubernetes.io/projected/332fde80-3942-477a-918e-84086221c09b-kube-api-access-22585\") pod \"kube-storage-version-migrator-operator-6769c5d45-mv44d\" (UID: \"332fde80-3942-477a-918e-84086221c09b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mv44d" Apr 20 14:55:57.225143 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.225125 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-2whnr\" (UID: \"27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr" Apr 20 14:55:57.225280 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.225145 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7d99f\" (UniqueName: 
\"kubernetes.io/projected/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-kube-api-access-7d99f\") pod \"cluster-monitoring-operator-75587bd455-2whnr\" (UID: \"27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr" Apr 20 14:55:57.225280 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.225199 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/332fde80-3942-477a-918e-84086221c09b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-mv44d\" (UID: \"332fde80-3942-477a-918e-84086221c09b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mv44d" Apr 20 14:55:57.225280 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.225216 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2whnr\" (UID: \"27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr" Apr 20 14:55:57.225280 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.225246 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/332fde80-3942-477a-918e-84086221c09b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-mv44d\" (UID: \"332fde80-3942-477a-918e-84086221c09b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mv44d" Apr 20 14:55:57.225531 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.225294 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/9444ff6f-3ede-40a2-a63c-97c92b90d755-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-l7992\" (UID: \"9444ff6f-3ede-40a2-a63c-97c92b90d755\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l7992" Apr 20 14:55:57.225531 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.225322 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-l7992\" (UID: \"9444ff6f-3ede-40a2-a63c-97c92b90d755\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l7992" Apr 20 14:55:57.225531 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:57.225328 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 14:55:57.225531 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:57.225415 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls podName:27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:57.725398089 +0000 UTC m=+36.798673699 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-2whnr" (UID: "27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8") : secret "cluster-monitoring-operator-tls" not found Apr 20 14:55:57.225785 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.225762 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/332fde80-3942-477a-918e-84086221c09b-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-mv44d\" (UID: \"332fde80-3942-477a-918e-84086221c09b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mv44d" Apr 20 14:55:57.225873 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.225856 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-2whnr\" (UID: \"27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr" Apr 20 14:55:57.227396 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.227380 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/332fde80-3942-477a-918e-84086221c09b-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-mv44d\" (UID: \"332fde80-3942-477a-918e-84086221c09b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mv44d" Apr 20 14:55:57.234660 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.234638 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d99f\" (UniqueName: 
\"kubernetes.io/projected/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-kube-api-access-7d99f\") pod \"cluster-monitoring-operator-75587bd455-2whnr\" (UID: \"27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr" Apr 20 14:55:57.234969 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.234944 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-22585\" (UniqueName: \"kubernetes.io/projected/332fde80-3942-477a-918e-84086221c09b-kube-api-access-22585\") pod \"kube-storage-version-migrator-operator-6769c5d45-mv44d\" (UID: \"332fde80-3942-477a-918e-84086221c09b\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mv44d" Apr 20 14:55:57.256882 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.256865 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4h869" Apr 20 14:55:57.275159 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.275136 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-4z848" Apr 20 14:55:57.296074 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.296042 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-xc2s2" Apr 20 14:55:57.326267 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.326239 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9444ff6f-3ede-40a2-a63c-97c92b90d755-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-l7992\" (UID: \"9444ff6f-3ede-40a2-a63c-97c92b90d755\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l7992" Apr 20 14:55:57.326456 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.326273 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-l7992\" (UID: \"9444ff6f-3ede-40a2-a63c-97c92b90d755\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l7992" Apr 20 14:55:57.326456 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:57.326396 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 14:55:57.326576 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:57.326467 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert podName:9444ff6f-3ede-40a2-a63c-97c92b90d755 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:57.826449731 +0000 UTC m=+36.899725335 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-l7992" (UID: "9444ff6f-3ede-40a2-a63c-97c92b90d755") : secret "networking-console-plugin-cert" not found Apr 20 14:55:57.326926 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.326904 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9444ff6f-3ede-40a2-a63c-97c92b90d755-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-l7992\" (UID: \"9444ff6f-3ede-40a2-a63c-97c92b90d755\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l7992" Apr 20 14:55:57.362948 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.362521 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mv44d" Apr 20 14:55:57.451313 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.451266 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4h869"] Apr 20 14:55:57.456777 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:57.456744 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ac42d0e_8ffe_4cc9_866d_cb7075ee1fde.slice/crio-4d7807d18cd31fab29d73321ce970c05c780326ee1788f7f99e2ac42580b0dbc WatchSource:0}: Error finding container 4d7807d18cd31fab29d73321ce970c05c780326ee1788f7f99e2ac42580b0dbc: Status 404 returned error can't find the container with id 4d7807d18cd31fab29d73321ce970c05c780326ee1788f7f99e2ac42580b0dbc Apr 20 14:55:57.463306 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.463284 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console-operator/console-operator-9d4b6777b-4z848"] Apr 20 14:55:57.466885 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:57.466859 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd54d05c4_b074_4189_b1dd_7ff476b824ec.slice/crio-0c176028f458c2b90b89e7f3696c43047ff05c639db3ae7010ec00944aabe494 WatchSource:0}: Error finding container 0c176028f458c2b90b89e7f3696c43047ff05c639db3ae7010ec00944aabe494: Status 404 returned error can't find the container with id 0c176028f458c2b90b89e7f3696c43047ff05c639db3ae7010ec00944aabe494 Apr 20 14:55:57.470513 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.470491 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-xc2s2"] Apr 20 14:55:57.472939 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:57.472906 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b93bf46_c126_4ef5_9add_d72c0cbb7dae.slice/crio-6f329d01a009a5ed62de9f9c9ed7c6ecbd7d501a15bad86b2ed352ffca29caf4 WatchSource:0}: Error finding container 6f329d01a009a5ed62de9f9c9ed7c6ecbd7d501a15bad86b2ed352ffca29caf4: Status 404 returned error can't find the container with id 6f329d01a009a5ed62de9f9c9ed7c6ecbd7d501a15bad86b2ed352ffca29caf4 Apr 20 14:55:57.514749 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.514721 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mv44d"] Apr 20 14:55:57.517464 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:55:57.517440 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod332fde80_3942_477a_918e_84086221c09b.slice/crio-86c3a91c4e6c5e8f821be2fe3194af2a3f7d9e8b39bc4e30a99f36d0d15e823b WatchSource:0}: Error finding container 
86c3a91c4e6c5e8f821be2fe3194af2a3f7d9e8b39bc4e30a99f36d0d15e823b: Status 404 returned error can't find the container with id 86c3a91c4e6c5e8f821be2fe3194af2a3f7d9e8b39bc4e30a99f36d0d15e823b Apr 20 14:55:57.528517 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.528495 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a001809b-d266-4d12-b9a2-d400942f2755-cert\") pod \"ingress-canary-d8wts\" (UID: \"a001809b-d266-4d12-b9a2-d400942f2755\") " pod="openshift-ingress-canary/ingress-canary-d8wts" Apr 20 14:55:57.528620 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.528563 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/057f0667-15cc-4883-a91d-c360de54e58f-metrics-tls\") pod \"dns-default-pq8qx\" (UID: \"057f0667-15cc-4883-a91d-c360de54e58f\") " pod="openshift-dns/dns-default-pq8qx" Apr 20 14:55:57.528678 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:57.528632 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:55:57.528719 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:57.528683 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:55:57.528719 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:57.528687 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a001809b-d266-4d12-b9a2-d400942f2755-cert podName:a001809b-d266-4d12-b9a2-d400942f2755 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:01.528671549 +0000 UTC m=+40.601947154 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a001809b-d266-4d12-b9a2-d400942f2755-cert") pod "ingress-canary-d8wts" (UID: "a001809b-d266-4d12-b9a2-d400942f2755") : secret "canary-serving-cert" not found Apr 20 14:55:57.528793 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:57.528732 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057f0667-15cc-4883-a91d-c360de54e58f-metrics-tls podName:057f0667-15cc-4883-a91d-c360de54e58f nodeName:}" failed. No retries permitted until 2026-04-20 14:56:01.528718639 +0000 UTC m=+40.601994242 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/057f0667-15cc-4883-a91d-c360de54e58f-metrics-tls") pod "dns-default-pq8qx" (UID: "057f0667-15cc-4883-a91d-c360de54e58f") : secret "dns-default-metrics-tls" not found Apr 20 14:55:57.623957 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.623917 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xc2s2" event={"ID":"1b93bf46-c126-4ef5-9add-d72c0cbb7dae","Type":"ContainerStarted","Data":"6f329d01a009a5ed62de9f9c9ed7c6ecbd7d501a15bad86b2ed352ffca29caf4"} Apr 20 14:55:57.624886 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.624868 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mv44d" event={"ID":"332fde80-3942-477a-918e-84086221c09b","Type":"ContainerStarted","Data":"86c3a91c4e6c5e8f821be2fe3194af2a3f7d9e8b39bc4e30a99f36d0d15e823b"} Apr 20 14:55:57.625834 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.625819 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4z848" event={"ID":"d54d05c4-b074-4189-b1dd-7ff476b824ec","Type":"ContainerStarted","Data":"0c176028f458c2b90b89e7f3696c43047ff05c639db3ae7010ec00944aabe494"} Apr 
20 14:55:57.628717 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.628693 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-blx8n" event={"ID":"61708c39-4987-438d-b51f-59e8cd1a1e59","Type":"ContainerStarted","Data":"9b8ad1ca9161e4760c5d5193c06817e356817ccf2f149d81d5b12c566422b5d4"}
Apr 20 14:55:57.628963 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.628939 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8d0f9644-dc24-4c95-b57e-562d7aec32f9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-88xfv\" (UID: \"8d0f9644-dc24-4c95-b57e-562d7aec32f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv"
Apr 20 14:55:57.629095 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:57.629037 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 14:55:57.629143 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:57.629118 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d0f9644-dc24-4c95-b57e-562d7aec32f9-samples-operator-tls podName:8d0f9644-dc24-4c95-b57e-562d7aec32f9 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:58.629097232 +0000 UTC m=+37.702372848 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8d0f9644-dc24-4c95-b57e-562d7aec32f9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-88xfv" (UID: "8d0f9644-dc24-4c95-b57e-562d7aec32f9") : secret "samples-operator-tls" not found
Apr 20 14:55:57.629689 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.629672 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4h869" event={"ID":"0ac42d0e-8ffe-4cc9-866d-cb7075ee1fde","Type":"ContainerStarted","Data":"4d7807d18cd31fab29d73321ce970c05c780326ee1788f7f99e2ac42580b0dbc"}
Apr 20 14:55:57.649268 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.649230 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-blx8n" podStartSLOduration=6.307853202 podStartE2EDuration="36.649218547s" podCreationTimestamp="2026-04-20 14:55:21 +0000 UTC" firstStartedPulling="2026-04-20 14:55:24.026413023 +0000 UTC m=+3.099688633" lastFinishedPulling="2026-04-20 14:55:54.367778371 +0000 UTC m=+33.441053978" observedRunningTime="2026-04-20 14:55:57.647437739 +0000 UTC m=+36.720713365" watchObservedRunningTime="2026-04-20 14:55:57.649218547 +0000 UTC m=+36.722494171"
Apr 20 14:55:57.730264 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.730185 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2whnr\" (UID: \"27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr"
Apr 20 14:55:57.730469 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:57.730449 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 14:55:57.730536 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:57.730512 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls podName:27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:58.730493376 +0000 UTC m=+37.803768979 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-2whnr" (UID: "27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8") : secret "cluster-monitoring-operator-tls" not found
Apr 20 14:55:57.831602 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:57.831557 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-l7992\" (UID: \"9444ff6f-3ede-40a2-a63c-97c92b90d755\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l7992"
Apr 20 14:55:57.831738 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:57.831702 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 14:55:57.831776 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:57.831768 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert podName:9444ff6f-3ede-40a2-a63c-97c92b90d755 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:58.831752307 +0000 UTC m=+37.905027910 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-l7992" (UID: "9444ff6f-3ede-40a2-a63c-97c92b90d755") : secret "networking-console-plugin-cert" not found
Apr 20 14:55:58.638033 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:58.637859 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8d0f9644-dc24-4c95-b57e-562d7aec32f9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-88xfv\" (UID: \"8d0f9644-dc24-4c95-b57e-562d7aec32f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv"
Apr 20 14:55:58.639016 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:58.638514 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 14:55:58.639016 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:58.638627 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d0f9644-dc24-4c95-b57e-562d7aec32f9-samples-operator-tls podName:8d0f9644-dc24-4c95-b57e-562d7aec32f9 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:00.638572976 +0000 UTC m=+39.711848584 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8d0f9644-dc24-4c95-b57e-562d7aec32f9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-88xfv" (UID: "8d0f9644-dc24-4c95-b57e-562d7aec32f9") : secret "samples-operator-tls" not found
Apr 20 14:55:58.740433 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:58.738932 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2whnr\" (UID: \"27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr"
Apr 20 14:55:58.740433 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:58.739087 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 14:55:58.740433 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:58.739152 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls podName:27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:00.739133329 +0000 UTC m=+39.812408938 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-2whnr" (UID: "27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8") : secret "cluster-monitoring-operator-tls" not found
Apr 20 14:55:58.840649 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:55:58.840134 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-l7992\" (UID: \"9444ff6f-3ede-40a2-a63c-97c92b90d755\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l7992"
Apr 20 14:55:58.840649 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:58.840384 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 14:55:58.840649 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:55:58.840451 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert podName:9444ff6f-3ede-40a2-a63c-97c92b90d755 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:00.840430402 +0000 UTC m=+39.913706014 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-l7992" (UID: "9444ff6f-3ede-40a2-a63c-97c92b90d755") : secret "networking-console-plugin-cert" not found
Apr 20 14:56:00.656270 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:00.656035 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8d0f9644-dc24-4c95-b57e-562d7aec32f9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-88xfv\" (UID: \"8d0f9644-dc24-4c95-b57e-562d7aec32f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv"
Apr 20 14:56:00.656737 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:00.656213 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 14:56:00.656737 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:00.656413 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d0f9644-dc24-4c95-b57e-562d7aec32f9-samples-operator-tls podName:8d0f9644-dc24-4c95-b57e-562d7aec32f9 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:04.656392756 +0000 UTC m=+43.729668359 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8d0f9644-dc24-4c95-b57e-562d7aec32f9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-88xfv" (UID: "8d0f9644-dc24-4c95-b57e-562d7aec32f9") : secret "samples-operator-tls" not found
Apr 20 14:56:00.757457 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:00.757405 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2whnr\" (UID: \"27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr"
Apr 20 14:56:00.757644 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:00.757570 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 14:56:00.757713 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:00.757653 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls podName:27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:04.757631606 +0000 UTC m=+43.830907216 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-2whnr" (UID: "27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8") : secret "cluster-monitoring-operator-tls" not found
Apr 20 14:56:00.858689 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:00.858656 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-l7992\" (UID: \"9444ff6f-3ede-40a2-a63c-97c92b90d755\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l7992"
Apr 20 14:56:00.858854 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:00.858818 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 14:56:00.858933 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:00.858897 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert podName:9444ff6f-3ede-40a2-a63c-97c92b90d755 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:04.858875886 +0000 UTC m=+43.932151494 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-l7992" (UID: "9444ff6f-3ede-40a2-a63c-97c92b90d755") : secret "networking-console-plugin-cert" not found
Apr 20 14:56:01.564346 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:01.564317 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a001809b-d266-4d12-b9a2-d400942f2755-cert\") pod \"ingress-canary-d8wts\" (UID: \"a001809b-d266-4d12-b9a2-d400942f2755\") " pod="openshift-ingress-canary/ingress-canary-d8wts"
Apr 20 14:56:01.564513 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:01.564379 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/057f0667-15cc-4883-a91d-c360de54e58f-metrics-tls\") pod \"dns-default-pq8qx\" (UID: \"057f0667-15cc-4883-a91d-c360de54e58f\") " pod="openshift-dns/dns-default-pq8qx"
Apr 20 14:56:01.564513 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:01.564466 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:56:01.564513 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:01.564506 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:56:01.564613 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:01.564522 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a001809b-d266-4d12-b9a2-d400942f2755-cert podName:a001809b-d266-4d12-b9a2-d400942f2755 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:09.564507859 +0000 UTC m=+48.637783461 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a001809b-d266-4d12-b9a2-d400942f2755-cert") pod "ingress-canary-d8wts" (UID: "a001809b-d266-4d12-b9a2-d400942f2755") : secret "canary-serving-cert" not found
Apr 20 14:56:01.564613 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:01.564540 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057f0667-15cc-4883-a91d-c360de54e58f-metrics-tls podName:057f0667-15cc-4883-a91d-c360de54e58f nodeName:}" failed. No retries permitted until 2026-04-20 14:56:09.564529656 +0000 UTC m=+48.637805258 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/057f0667-15cc-4883-a91d-c360de54e58f-metrics-tls") pod "dns-default-pq8qx" (UID: "057f0667-15cc-4883-a91d-c360de54e58f") : secret "dns-default-metrics-tls" not found
Apr 20 14:56:03.643640 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:03.643602 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xc2s2" event={"ID":"1b93bf46-c126-4ef5-9add-d72c0cbb7dae","Type":"ContainerStarted","Data":"60f85cde85c2af646080cfbe88b07f8163e29a110e2f38ecdeb6d396dcc00969"}
Apr 20 14:56:03.645114 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:03.645085 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mv44d" event={"ID":"332fde80-3942-477a-918e-84086221c09b","Type":"ContainerStarted","Data":"ecb7d6ffa9c5a6ab9ec8fcab7950d8d28559598291a7cb9552a1908ccbfe9d98"}
Apr 20 14:56:03.646643 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:03.646622 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4z848_d54d05c4-b074-4189-b1dd-7ff476b824ec/console-operator/0.log"
Apr 20 14:56:03.646760 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:03.646662 2574 generic.go:358] "Generic (PLEG): container finished" podID="d54d05c4-b074-4189-b1dd-7ff476b824ec" containerID="fa7fcf71ea92b4913bd382e70f94775483a76a61c0da188f177f922fc11c5f0b" exitCode=255
Apr 20 14:56:03.646760 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:03.646748 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4z848" event={"ID":"d54d05c4-b074-4189-b1dd-7ff476b824ec","Type":"ContainerDied","Data":"fa7fcf71ea92b4913bd382e70f94775483a76a61c0da188f177f922fc11c5f0b"}
Apr 20 14:56:03.646911 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:03.646894 2574 scope.go:117] "RemoveContainer" containerID="fa7fcf71ea92b4913bd382e70f94775483a76a61c0da188f177f922fc11c5f0b"
Apr 20 14:56:03.648130 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:03.648106 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4h869" event={"ID":"0ac42d0e-8ffe-4cc9-866d-cb7075ee1fde","Type":"ContainerStarted","Data":"88a7179ca407a988929ee19759532b8c22382d31b169c9f558766dcfbcd69f50"}
Apr 20 14:56:03.665114 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:03.665073 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-xc2s2" podStartSLOduration=2.39710205 podStartE2EDuration="7.665059149s" podCreationTimestamp="2026-04-20 14:55:56 +0000 UTC" firstStartedPulling="2026-04-20 14:55:57.475427433 +0000 UTC m=+36.548703053" lastFinishedPulling="2026-04-20 14:56:02.743384534 +0000 UTC m=+41.816660152" observedRunningTime="2026-04-20 14:56:03.664228489 +0000 UTC m=+42.737504137" watchObservedRunningTime="2026-04-20 14:56:03.665059149 +0000 UTC m=+42.738334776"
Apr 20 14:56:03.685673 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:03.685630 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4h869" podStartSLOduration=2.399807859 podStartE2EDuration="7.685615632s" podCreationTimestamp="2026-04-20 14:55:56 +0000 UTC" firstStartedPulling="2026-04-20 14:55:57.45957573 +0000 UTC m=+36.532851349" lastFinishedPulling="2026-04-20 14:56:02.745383517 +0000 UTC m=+41.818659122" observedRunningTime="2026-04-20 14:56:03.684915442 +0000 UTC m=+42.758191066" watchObservedRunningTime="2026-04-20 14:56:03.685615632 +0000 UTC m=+42.758891258"
Apr 20 14:56:03.703261 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:03.703218 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mv44d" podStartSLOduration=1.474724894 podStartE2EDuration="6.703203781s" podCreationTimestamp="2026-04-20 14:55:57 +0000 UTC" firstStartedPulling="2026-04-20 14:55:57.519234873 +0000 UTC m=+36.592510476" lastFinishedPulling="2026-04-20 14:56:02.747713757 +0000 UTC m=+41.820989363" observedRunningTime="2026-04-20 14:56:03.702113487 +0000 UTC m=+42.775389113" watchObservedRunningTime="2026-04-20 14:56:03.703203781 +0000 UTC m=+42.776479404"
Apr 20 14:56:04.651294 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:04.651266 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4z848_d54d05c4-b074-4189-b1dd-7ff476b824ec/console-operator/1.log"
Apr 20 14:56:04.651722 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:04.651700 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4z848_d54d05c4-b074-4189-b1dd-7ff476b824ec/console-operator/0.log"
Apr 20 14:56:04.651764 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:04.651733 2574 generic.go:358] "Generic (PLEG): container finished" podID="d54d05c4-b074-4189-b1dd-7ff476b824ec" containerID="d495cc663ae6ebc18c5d223bb587bcf0df5f1c40659230bddd3b3350652fb1df" exitCode=255
Apr 20 14:56:04.651858 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:04.651837 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4z848" event={"ID":"d54d05c4-b074-4189-b1dd-7ff476b824ec","Type":"ContainerDied","Data":"d495cc663ae6ebc18c5d223bb587bcf0df5f1c40659230bddd3b3350652fb1df"}
Apr 20 14:56:04.651908 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:04.651875 2574 scope.go:117] "RemoveContainer" containerID="fa7fcf71ea92b4913bd382e70f94775483a76a61c0da188f177f922fc11c5f0b"
Apr 20 14:56:04.652041 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:04.652025 2574 scope.go:117] "RemoveContainer" containerID="d495cc663ae6ebc18c5d223bb587bcf0df5f1c40659230bddd3b3350652fb1df"
Apr 20 14:56:04.652259 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:04.652232 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-4z848_openshift-console-operator(d54d05c4-b074-4189-b1dd-7ff476b824ec)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4z848" podUID="d54d05c4-b074-4189-b1dd-7ff476b824ec"
Apr 20 14:56:04.687458 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:04.687420 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8d0f9644-dc24-4c95-b57e-562d7aec32f9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-88xfv\" (UID: \"8d0f9644-dc24-4c95-b57e-562d7aec32f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv"
Apr 20 14:56:04.688268 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:04.688160 2574 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 14:56:04.688268 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:04.688243 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d0f9644-dc24-4c95-b57e-562d7aec32f9-samples-operator-tls podName:8d0f9644-dc24-4c95-b57e-562d7aec32f9 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:12.688220551 +0000 UTC m=+51.761496162 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/8d0f9644-dc24-4c95-b57e-562d7aec32f9-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-88xfv" (UID: "8d0f9644-dc24-4c95-b57e-562d7aec32f9") : secret "samples-operator-tls" not found
Apr 20 14:56:04.788523 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:04.788490 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2whnr\" (UID: \"27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr"
Apr 20 14:56:04.788655 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:04.788630 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 14:56:04.788708 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:04.788692 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls podName:27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:12.788677603 +0000 UTC m=+51.861953206 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-2whnr" (UID: "27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8") : secret "cluster-monitoring-operator-tls" not found Apr 20 14:56:04.888936 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:04.888904 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-l7992\" (UID: \"9444ff6f-3ede-40a2-a63c-97c92b90d755\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l7992" Apr 20 14:56:04.889093 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:04.889028 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 14:56:04.889133 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:04.889094 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert podName:9444ff6f-3ede-40a2-a63c-97c92b90d755 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:12.889077779 +0000 UTC m=+51.962353383 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-l7992" (UID: "9444ff6f-3ede-40a2-a63c-97c92b90d755") : secret "networking-console-plugin-cert" not found Apr 20 14:56:05.655168 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:05.655141 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4z848_d54d05c4-b074-4189-b1dd-7ff476b824ec/console-operator/1.log" Apr 20 14:56:05.655558 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:05.655470 2574 scope.go:117] "RemoveContainer" containerID="d495cc663ae6ebc18c5d223bb587bcf0df5f1c40659230bddd3b3350652fb1df" Apr 20 14:56:05.655632 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:05.655616 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-4z848_openshift-console-operator(d54d05c4-b074-4189-b1dd-7ff476b824ec)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4z848" podUID="d54d05c4-b074-4189-b1dd-7ff476b824ec" Apr 20 14:56:07.275390 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:07.275340 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-4z848" Apr 20 14:56:07.275390 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:07.275395 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-4z848" Apr 20 14:56:07.276014 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:07.275782 2574 scope.go:117] "RemoveContainer" containerID="d495cc663ae6ebc18c5d223bb587bcf0df5f1c40659230bddd3b3350652fb1df" Apr 20 14:56:07.276155 ip-10-0-130-249 kubenswrapper[2574]: 
E0420 14:56:07.276004 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-4z848_openshift-console-operator(d54d05c4-b074-4189-b1dd-7ff476b824ec)\"" pod="openshift-console-operator/console-operator-9d4b6777b-4z848" podUID="d54d05c4-b074-4189-b1dd-7ff476b824ec" Apr 20 14:56:07.495735 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:07.495706 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mj558_772f88da-629b-4161-9ed5-8a916387c9bd/dns-node-resolver/0.log" Apr 20 14:56:08.695724 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:08.695694 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qxsfj_10081761-39cd-4657-8ccf-94426cfd0833/node-ca/0.log" Apr 20 14:56:09.626575 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:09.626547 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/057f0667-15cc-4883-a91d-c360de54e58f-metrics-tls\") pod \"dns-default-pq8qx\" (UID: \"057f0667-15cc-4883-a91d-c360de54e58f\") " pod="openshift-dns/dns-default-pq8qx" Apr 20 14:56:09.626767 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:09.626647 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a001809b-d266-4d12-b9a2-d400942f2755-cert\") pod \"ingress-canary-d8wts\" (UID: \"a001809b-d266-4d12-b9a2-d400942f2755\") " pod="openshift-ingress-canary/ingress-canary-d8wts" Apr 20 14:56:09.629591 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:09.629561 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/057f0667-15cc-4883-a91d-c360de54e58f-metrics-tls\") pod \"dns-default-pq8qx\" (UID: 
\"057f0667-15cc-4883-a91d-c360de54e58f\") " pod="openshift-dns/dns-default-pq8qx" Apr 20 14:56:09.629752 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:09.629730 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a001809b-d266-4d12-b9a2-d400942f2755-cert\") pod \"ingress-canary-d8wts\" (UID: \"a001809b-d266-4d12-b9a2-d400942f2755\") " pod="openshift-ingress-canary/ingress-canary-d8wts" Apr 20 14:56:09.698294 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:09.698264 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-mv44d_332fde80-3942-477a-918e-84086221c09b/kube-storage-version-migrator-operator/0.log" Apr 20 14:56:09.779851 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:09.779824 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pq8qx" Apr 20 14:56:09.790469 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:09.790439 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-d8wts"
Apr 20 14:56:09.906031 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:09.905966 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pq8qx"]
Apr 20 14:56:09.919027 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:09.918999 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-d8wts"]
Apr 20 14:56:09.922271 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:56:09.922245 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda001809b_d266_4d12_b9a2_d400942f2755.slice/crio-02fe84af1765fead6e98f267d55febbe10fb65576a13fd8b16ca79accf2076c3 WatchSource:0}: Error finding container 02fe84af1765fead6e98f267d55febbe10fb65576a13fd8b16ca79accf2076c3: Status 404 returned error can't find the container with id 02fe84af1765fead6e98f267d55febbe10fb65576a13fd8b16ca79accf2076c3
Apr 20 14:56:10.666023 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:10.665974 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-d8wts" event={"ID":"a001809b-d266-4d12-b9a2-d400942f2755","Type":"ContainerStarted","Data":"02fe84af1765fead6e98f267d55febbe10fb65576a13fd8b16ca79accf2076c3"}
Apr 20 14:56:10.667188 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:10.667159 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pq8qx" event={"ID":"057f0667-15cc-4883-a91d-c360de54e58f","Type":"ContainerStarted","Data":"86546278e6e89769f7a39db10502720b57323a3f6921a237147e575a5667f17f"}
Apr 20 14:56:12.675970 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:12.675934 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-d8wts" event={"ID":"a001809b-d266-4d12-b9a2-d400942f2755","Type":"ContainerStarted","Data":"ffc63bf04cbe398507a399edcd29a95f92db4047ed89a5270c35a89c474e4798"}
Apr 20 14:56:12.677460 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:12.677439 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pq8qx" event={"ID":"057f0667-15cc-4883-a91d-c360de54e58f","Type":"ContainerStarted","Data":"2f114858026eacb13f4cebc7378499370a1412b6f7fad9f145a2c603599730dc"}
Apr 20 14:56:12.677632 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:12.677464 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pq8qx" event={"ID":"057f0667-15cc-4883-a91d-c360de54e58f","Type":"ContainerStarted","Data":"a1c7f4312011c3dec9f73a911080677f2b4812354fcdf6d5c1de51c80008915a"}
Apr 20 14:56:12.677632 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:12.677607 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-pq8qx"
Apr 20 14:56:12.692211 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:12.692158 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-d8wts" podStartSLOduration=17.648458588 podStartE2EDuration="19.69214709s" podCreationTimestamp="2026-04-20 14:55:53 +0000 UTC" firstStartedPulling="2026-04-20 14:56:09.924608474 +0000 UTC m=+48.997884093" lastFinishedPulling="2026-04-20 14:56:11.968296992 +0000 UTC m=+51.041572595" observedRunningTime="2026-04-20 14:56:12.691243339 +0000 UTC m=+51.764518975" watchObservedRunningTime="2026-04-20 14:56:12.69214709 +0000 UTC m=+51.765422744"
Apr 20 14:56:12.710250 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:12.710210 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pq8qx" podStartSLOduration=17.656496218 podStartE2EDuration="19.710200293s" podCreationTimestamp="2026-04-20 14:55:53 +0000 UTC" firstStartedPulling="2026-04-20 14:56:09.911216153 +0000 UTC m=+48.984491760" lastFinishedPulling="2026-04-20 14:56:11.964920232 +0000 UTC m=+51.038195835" observedRunningTime="2026-04-20 14:56:12.710017801 +0000 UTC m=+51.783293425" watchObservedRunningTime="2026-04-20 14:56:12.710200293 +0000 UTC m=+51.783475918"
Apr 20 14:56:12.752404 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:12.752363 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8d0f9644-dc24-4c95-b57e-562d7aec32f9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-88xfv\" (UID: \"8d0f9644-dc24-4c95-b57e-562d7aec32f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv"
Apr 20 14:56:12.754916 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:12.754889 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/8d0f9644-dc24-4c95-b57e-562d7aec32f9-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-88xfv\" (UID: \"8d0f9644-dc24-4c95-b57e-562d7aec32f9\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv"
Apr 20 14:56:12.853138 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:12.853101 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2whnr\" (UID: \"27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr"
Apr 20 14:56:12.853290 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:12.853210 2574 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 14:56:12.853290 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:12.853263 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls podName:27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:28.853250552 +0000 UTC m=+67.926526155 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-2whnr" (UID: "27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8") : secret "cluster-monitoring-operator-tls" not found
Apr 20 14:56:12.882883 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:12.882846 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv"
Apr 20 14:56:12.954412 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:12.954358 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-l7992\" (UID: \"9444ff6f-3ede-40a2-a63c-97c92b90d755\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l7992"
Apr 20 14:56:12.954555 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:12.954491 2574 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 14:56:12.954555 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:12.954550 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert podName:9444ff6f-3ede-40a2-a63c-97c92b90d755 nodeName:}" failed.
No retries permitted until 2026-04-20 14:56:28.954533814 +0000 UTC m=+68.027809423 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-l7992" (UID: "9444ff6f-3ede-40a2-a63c-97c92b90d755") : secret "networking-console-plugin-cert" not found Apr 20 14:56:12.997921 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:12.997898 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv"] Apr 20 14:56:13.682527 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:13.682490 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv" event={"ID":"8d0f9644-dc24-4c95-b57e-562d7aec32f9","Type":"ContainerStarted","Data":"85fc52e3a57810fcb2c4dbebadb9a73d0694a9aa295e2ebcfe3e8bd2c59aa7e0"} Apr 20 14:56:14.686025 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:14.685951 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv" event={"ID":"8d0f9644-dc24-4c95-b57e-562d7aec32f9","Type":"ContainerStarted","Data":"c81d6ba27e29b34ff88f1109e0e9346fdd070e58c2c822163f8ae5379f389e0c"} Apr 20 14:56:14.686025 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:14.685985 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv" event={"ID":"8d0f9644-dc24-4c95-b57e-562d7aec32f9","Type":"ContainerStarted","Data":"752a10839d9129c04111c5246221e8318206ffacc19f09c184dfc052d5d1fbaf"} Apr 20 14:56:14.701738 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:14.701695 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-88xfv" podStartSLOduration=17.421648905 podStartE2EDuration="18.701682947s" podCreationTimestamp="2026-04-20 14:55:56 +0000 UTC" firstStartedPulling="2026-04-20 14:56:13.040805615 +0000 UTC m=+52.114081222" lastFinishedPulling="2026-04-20 14:56:14.320839661 +0000 UTC m=+53.394115264" observedRunningTime="2026-04-20 14:56:14.70113468 +0000 UTC m=+53.774410300" watchObservedRunningTime="2026-04-20 14:56:14.701682947 +0000 UTC m=+53.774958621" Apr 20 14:56:18.443036 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:18.443006 2574 scope.go:117] "RemoveContainer" containerID="d495cc663ae6ebc18c5d223bb587bcf0df5f1c40659230bddd3b3350652fb1df" Apr 20 14:56:18.608507 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:18.608482 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5vmgd" Apr 20 14:56:18.698301 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:18.698224 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4z848_d54d05c4-b074-4189-b1dd-7ff476b824ec/console-operator/1.log" Apr 20 14:56:18.698301 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:18.698293 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-4z848" event={"ID":"d54d05c4-b074-4189-b1dd-7ff476b824ec","Type":"ContainerStarted","Data":"ca84326614451fb8215fb3e00962f8741c3bab7042bf7b3aa17a9320b9d24420"} Apr 20 14:56:18.698787 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:18.698751 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-4z848" Apr 20 14:56:18.714459 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:18.714416 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-4z848" 
podStartSLOduration=17.439721642 podStartE2EDuration="22.714404924s" podCreationTimestamp="2026-04-20 14:55:56 +0000 UTC" firstStartedPulling="2026-04-20 14:55:57.468701956 +0000 UTC m=+36.541977560" lastFinishedPulling="2026-04-20 14:56:02.743385233 +0000 UTC m=+41.816660842" observedRunningTime="2026-04-20 14:56:18.713364747 +0000 UTC m=+57.786640371" watchObservedRunningTime="2026-04-20 14:56:18.714404924 +0000 UTC m=+57.787680545" Apr 20 14:56:19.472751 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:19.472726 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-4z848" Apr 20 14:56:22.398755 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.398723 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wqkv4"] Apr 20 14:56:22.403335 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.403313 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-qsjv2"] Apr 20 14:56:22.403513 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.403495 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wqkv4" Apr 20 14:56:22.406270 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.406254 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-p9nl9\"" Apr 20 14:56:22.406666 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.406650 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 20 14:56:22.406977 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.406961 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qsjv2" Apr 20 14:56:22.408215 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.408201 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 20 14:56:22.409407 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.409390 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-n4j2j\"" Apr 20 14:56:22.409503 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.409487 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 14:56:22.409719 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.409707 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 14:56:22.423714 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.423690 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wqkv4"] Apr 20 14:56:22.425721 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.425693 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qsjv2"] Apr 20 14:56:22.432802 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.432781 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-79bd4b4857-ktbsk"] Apr 20 14:56:22.435875 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.435860 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.439103 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.439079 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 14:56:22.439265 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.439244 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 14:56:22.439697 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.439674 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 14:56:22.440988 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.440969 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fgmdb\"" Apr 20 14:56:22.446756 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.446739 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 14:56:22.453364 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.453334 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-79bd4b4857-ktbsk"] Apr 20 14:56:22.525156 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.525123 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cfb26a94-539f-4773-8a3d-e8350d9e2367-crio-socket\") pod \"insights-runtime-extractor-qsjv2\" (UID: \"cfb26a94-539f-4773-8a3d-e8350d9e2367\") " pod="openshift-insights/insights-runtime-extractor-qsjv2" Apr 20 14:56:22.525309 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.525162 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/022f5820-b316-4891-a62b-9cdbcd1b964e-bound-sa-token\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.525309 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.525237 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/022f5820-b316-4891-a62b-9cdbcd1b964e-registry-tls\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.525309 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.525264 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/022f5820-b316-4891-a62b-9cdbcd1b964e-ca-trust-extracted\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.525467 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.525305 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5bmt\" (UniqueName: \"kubernetes.io/projected/045ed647-e291-4700-83d9-8516e6788286-kube-api-access-j5bmt\") pod \"migrator-74bb7799d9-wqkv4\" (UID: \"045ed647-e291-4700-83d9-8516e6788286\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wqkv4" Apr 20 14:56:22.525467 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.525339 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/022f5820-b316-4891-a62b-9cdbcd1b964e-registry-certificates\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: 
\"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.525467 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.525387 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cfb26a94-539f-4773-8a3d-e8350d9e2367-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qsjv2\" (UID: \"cfb26a94-539f-4773-8a3d-e8350d9e2367\") " pod="openshift-insights/insights-runtime-extractor-qsjv2" Apr 20 14:56:22.525467 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.525409 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/022f5820-b316-4891-a62b-9cdbcd1b964e-image-registry-private-configuration\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.525467 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.525432 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f686l\" (UniqueName: \"kubernetes.io/projected/022f5820-b316-4891-a62b-9cdbcd1b964e-kube-api-access-f686l\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.525467 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.525464 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cfb26a94-539f-4773-8a3d-e8350d9e2367-data-volume\") pod \"insights-runtime-extractor-qsjv2\" (UID: \"cfb26a94-539f-4773-8a3d-e8350d9e2367\") " pod="openshift-insights/insights-runtime-extractor-qsjv2" Apr 20 14:56:22.525669 ip-10-0-130-249 
kubenswrapper[2574]: I0420 14:56:22.525492 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cfb26a94-539f-4773-8a3d-e8350d9e2367-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qsjv2\" (UID: \"cfb26a94-539f-4773-8a3d-e8350d9e2367\") " pod="openshift-insights/insights-runtime-extractor-qsjv2" Apr 20 14:56:22.525669 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.525532 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/022f5820-b316-4891-a62b-9cdbcd1b964e-trusted-ca\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.525669 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.525557 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/022f5820-b316-4891-a62b-9cdbcd1b964e-installation-pull-secrets\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.525669 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.525579 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdsn2\" (UniqueName: \"kubernetes.io/projected/cfb26a94-539f-4773-8a3d-e8350d9e2367-kube-api-access-pdsn2\") pod \"insights-runtime-extractor-qsjv2\" (UID: \"cfb26a94-539f-4773-8a3d-e8350d9e2367\") " pod="openshift-insights/insights-runtime-extractor-qsjv2" Apr 20 14:56:22.626185 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.626154 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" 
(UniqueName: \"kubernetes.io/secret/cfb26a94-539f-4773-8a3d-e8350d9e2367-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qsjv2\" (UID: \"cfb26a94-539f-4773-8a3d-e8350d9e2367\") " pod="openshift-insights/insights-runtime-extractor-qsjv2" Apr 20 14:56:22.626185 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.626195 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/022f5820-b316-4891-a62b-9cdbcd1b964e-trusted-ca\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.626486 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.626214 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/022f5820-b316-4891-a62b-9cdbcd1b964e-installation-pull-secrets\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.626486 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.626244 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdsn2\" (UniqueName: \"kubernetes.io/projected/cfb26a94-539f-4773-8a3d-e8350d9e2367-kube-api-access-pdsn2\") pod \"insights-runtime-extractor-qsjv2\" (UID: \"cfb26a94-539f-4773-8a3d-e8350d9e2367\") " pod="openshift-insights/insights-runtime-extractor-qsjv2" Apr 20 14:56:22.626486 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.626291 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cfb26a94-539f-4773-8a3d-e8350d9e2367-crio-socket\") pod \"insights-runtime-extractor-qsjv2\" (UID: \"cfb26a94-539f-4773-8a3d-e8350d9e2367\") " pod="openshift-insights/insights-runtime-extractor-qsjv2" Apr 20 
14:56:22.626486 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.626322 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/022f5820-b316-4891-a62b-9cdbcd1b964e-bound-sa-token\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.626486 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.626397 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/022f5820-b316-4891-a62b-9cdbcd1b964e-registry-tls\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.626486 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.626455 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cfb26a94-539f-4773-8a3d-e8350d9e2367-crio-socket\") pod \"insights-runtime-extractor-qsjv2\" (UID: \"cfb26a94-539f-4773-8a3d-e8350d9e2367\") " pod="openshift-insights/insights-runtime-extractor-qsjv2" Apr 20 14:56:22.626776 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.626571 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/022f5820-b316-4891-a62b-9cdbcd1b964e-ca-trust-extracted\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.626776 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.626626 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5bmt\" (UniqueName: \"kubernetes.io/projected/045ed647-e291-4700-83d9-8516e6788286-kube-api-access-j5bmt\") pod 
\"migrator-74bb7799d9-wqkv4\" (UID: \"045ed647-e291-4700-83d9-8516e6788286\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wqkv4" Apr 20 14:56:22.626776 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.626657 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/022f5820-b316-4891-a62b-9cdbcd1b964e-registry-certificates\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.626776 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.626722 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cfb26a94-539f-4773-8a3d-e8350d9e2367-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qsjv2\" (UID: \"cfb26a94-539f-4773-8a3d-e8350d9e2367\") " pod="openshift-insights/insights-runtime-extractor-qsjv2" Apr 20 14:56:22.626987 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.626818 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/022f5820-b316-4891-a62b-9cdbcd1b964e-image-registry-private-configuration\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.626987 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.626877 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f686l\" (UniqueName: \"kubernetes.io/projected/022f5820-b316-4891-a62b-9cdbcd1b964e-kube-api-access-f686l\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.626987 ip-10-0-130-249 
kubenswrapper[2574]: I0420 14:56:22.626925 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/022f5820-b316-4891-a62b-9cdbcd1b964e-ca-trust-extracted\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.626987 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.626920 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cfb26a94-539f-4773-8a3d-e8350d9e2367-data-volume\") pod \"insights-runtime-extractor-qsjv2\" (UID: \"cfb26a94-539f-4773-8a3d-e8350d9e2367\") " pod="openshift-insights/insights-runtime-extractor-qsjv2" Apr 20 14:56:22.627298 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.627274 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/022f5820-b316-4891-a62b-9cdbcd1b964e-trusted-ca\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.627500 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.627460 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cfb26a94-539f-4773-8a3d-e8350d9e2367-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qsjv2\" (UID: \"cfb26a94-539f-4773-8a3d-e8350d9e2367\") " pod="openshift-insights/insights-runtime-extractor-qsjv2" Apr 20 14:56:22.627729 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.627709 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cfb26a94-539f-4773-8a3d-e8350d9e2367-data-volume\") pod \"insights-runtime-extractor-qsjv2\" (UID: \"cfb26a94-539f-4773-8a3d-e8350d9e2367\") " 
pod="openshift-insights/insights-runtime-extractor-qsjv2" Apr 20 14:56:22.627806 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.627761 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/022f5820-b316-4891-a62b-9cdbcd1b964e-registry-certificates\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.629020 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.628996 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cfb26a94-539f-4773-8a3d-e8350d9e2367-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qsjv2\" (UID: \"cfb26a94-539f-4773-8a3d-e8350d9e2367\") " pod="openshift-insights/insights-runtime-extractor-qsjv2" Apr 20 14:56:22.629222 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.629206 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/022f5820-b316-4891-a62b-9cdbcd1b964e-registry-tls\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.629301 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.629282 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/022f5820-b316-4891-a62b-9cdbcd1b964e-installation-pull-secrets\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.629345 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.629285 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" 
(UniqueName: \"kubernetes.io/secret/022f5820-b316-4891-a62b-9cdbcd1b964e-image-registry-private-configuration\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.635455 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.635430 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/022f5820-b316-4891-a62b-9cdbcd1b964e-bound-sa-token\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.635829 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.635704 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdsn2\" (UniqueName: \"kubernetes.io/projected/cfb26a94-539f-4773-8a3d-e8350d9e2367-kube-api-access-pdsn2\") pod \"insights-runtime-extractor-qsjv2\" (UID: \"cfb26a94-539f-4773-8a3d-e8350d9e2367\") " pod="openshift-insights/insights-runtime-extractor-qsjv2" Apr 20 14:56:22.636037 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.636018 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5bmt\" (UniqueName: \"kubernetes.io/projected/045ed647-e291-4700-83d9-8516e6788286-kube-api-access-j5bmt\") pod \"migrator-74bb7799d9-wqkv4\" (UID: \"045ed647-e291-4700-83d9-8516e6788286\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wqkv4" Apr 20 14:56:22.636302 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.636288 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f686l\" (UniqueName: \"kubernetes.io/projected/022f5820-b316-4891-a62b-9cdbcd1b964e-kube-api-access-f686l\") pod \"image-registry-79bd4b4857-ktbsk\" (UID: \"022f5820-b316-4891-a62b-9cdbcd1b964e\") " pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 
14:56:22.684780 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.684725 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pq8qx" Apr 20 14:56:22.713478 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.713447 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wqkv4" Apr 20 14:56:22.718425 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.718405 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qsjv2" Apr 20 14:56:22.745441 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.745385 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:22.858171 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:22.858141 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-wqkv4"] Apr 20 14:56:22.863637 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:56:22.863611 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod045ed647_e291_4700_83d9_8516e6788286.slice/crio-1e0ff35c78eba8ae8e8cf27612b0673f664ed0895f4b3b7ba125a0490206e596 WatchSource:0}: Error finding container 1e0ff35c78eba8ae8e8cf27612b0673f664ed0895f4b3b7ba125a0490206e596: Status 404 returned error can't find the container with id 1e0ff35c78eba8ae8e8cf27612b0673f664ed0895f4b3b7ba125a0490206e596 Apr 20 14:56:23.084491 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:23.084463 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qsjv2"] Apr 20 14:56:23.087384 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:23.087349 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-79bd4b4857-ktbsk"] Apr 20 14:56:23.087476 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:56:23.087455 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfb26a94_539f_4773_8a3d_e8350d9e2367.slice/crio-3e4288957d6a43c4777f8028877c43ef6c8d3c58d61a187e659a31b959179525 WatchSource:0}: Error finding container 3e4288957d6a43c4777f8028877c43ef6c8d3c58d61a187e659a31b959179525: Status 404 returned error can't find the container with id 3e4288957d6a43c4777f8028877c43ef6c8d3c58d61a187e659a31b959179525 Apr 20 14:56:23.091361 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:56:23.091341 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod022f5820_b316_4891_a62b_9cdbcd1b964e.slice/crio-3e5a992304068405d1687a2aa023244fcb6d1953268b27d037853a6e41dd6a72 WatchSource:0}: Error finding container 3e5a992304068405d1687a2aa023244fcb6d1953268b27d037853a6e41dd6a72: Status 404 returned error can't find the container with id 3e5a992304068405d1687a2aa023244fcb6d1953268b27d037853a6e41dd6a72 Apr 20 14:56:23.716851 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:23.716816 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qsjv2" event={"ID":"cfb26a94-539f-4773-8a3d-e8350d9e2367","Type":"ContainerStarted","Data":"afb6b2dbb62d941b2077efccf63c930455ded1672d0ae524176611fb30e48608"} Apr 20 14:56:23.717304 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:23.716860 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qsjv2" event={"ID":"cfb26a94-539f-4773-8a3d-e8350d9e2367","Type":"ContainerStarted","Data":"3e4288957d6a43c4777f8028877c43ef6c8d3c58d61a187e659a31b959179525"} Apr 20 14:56:23.717968 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:23.717938 2574 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wqkv4" event={"ID":"045ed647-e291-4700-83d9-8516e6788286","Type":"ContainerStarted","Data":"1e0ff35c78eba8ae8e8cf27612b0673f664ed0895f4b3b7ba125a0490206e596"} Apr 20 14:56:23.719360 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:23.719335 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" event={"ID":"022f5820-b316-4891-a62b-9cdbcd1b964e","Type":"ContainerStarted","Data":"7b5e6635bd9fa53ab13cea5d54c342eed38534843153a87f2ddb83c7415ea960"} Apr 20 14:56:23.719463 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:23.719389 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" event={"ID":"022f5820-b316-4891-a62b-9cdbcd1b964e","Type":"ContainerStarted","Data":"3e5a992304068405d1687a2aa023244fcb6d1953268b27d037853a6e41dd6a72"} Apr 20 14:56:23.719510 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:23.719486 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" Apr 20 14:56:23.738356 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:23.738306 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk" podStartSLOduration=1.7382889289999999 podStartE2EDuration="1.738288929s" podCreationTimestamp="2026-04-20 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:56:23.736887768 +0000 UTC m=+62.810163394" watchObservedRunningTime="2026-04-20 14:56:23.738288929 +0000 UTC m=+62.811564555" Apr 20 14:56:24.723984 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:24.723951 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qsjv2" 
event={"ID":"cfb26a94-539f-4773-8a3d-e8350d9e2367","Type":"ContainerStarted","Data":"d7c8ad79f82456faaf38fece0ddf8a3b672a5e2f21d7b5e808443b58b5c47b8d"} Apr 20 14:56:24.725495 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:24.725474 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wqkv4" event={"ID":"045ed647-e291-4700-83d9-8516e6788286","Type":"ContainerStarted","Data":"05b711085c20f2741ea603b0927c40119bd5c1f6fc7402b70fb8bccae8fa2efb"} Apr 20 14:56:24.725603 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:24.725500 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wqkv4" event={"ID":"045ed647-e291-4700-83d9-8516e6788286","Type":"ContainerStarted","Data":"41c0c83a85f0f69a1c50a4f4f0f038f6d3bbe3e553557f159dad4d7e4a45e5ef"} Apr 20 14:56:24.740454 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:24.740409 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-wqkv4" podStartSLOduration=1.239083056 podStartE2EDuration="2.740397039s" podCreationTimestamp="2026-04-20 14:56:22 +0000 UTC" firstStartedPulling="2026-04-20 14:56:22.867541838 +0000 UTC m=+61.940817457" lastFinishedPulling="2026-04-20 14:56:24.368855835 +0000 UTC m=+63.442131440" observedRunningTime="2026-04-20 14:56:24.740163834 +0000 UTC m=+63.813439458" watchObservedRunningTime="2026-04-20 14:56:24.740397039 +0000 UTC m=+63.813672665" Apr 20 14:56:26.732249 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:26.732219 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qsjv2" event={"ID":"cfb26a94-539f-4773-8a3d-e8350d9e2367","Type":"ContainerStarted","Data":"665fc71b704272986998602b52be80f8da24650c14c6b01530e908a1daa27ef3"} Apr 20 14:56:26.752560 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:26.752517 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-qsjv2" podStartSLOduration=1.621938178 podStartE2EDuration="4.75250451s" podCreationTimestamp="2026-04-20 14:56:22 +0000 UTC" firstStartedPulling="2026-04-20 14:56:23.15266174 +0000 UTC m=+62.225937349" lastFinishedPulling="2026-04-20 14:56:26.283228074 +0000 UTC m=+65.356503681" observedRunningTime="2026-04-20 14:56:26.751333688 +0000 UTC m=+65.824609313" watchObservedRunningTime="2026-04-20 14:56:26.75250451 +0000 UTC m=+65.825780135" Apr 20 14:56:27.162999 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:27.162961 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs\") pod \"network-metrics-daemon-h24vc\" (UID: \"90e2aae6-6b60-4b8e-a0ba-12474f425b1d\") " pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:56:27.165583 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:27.165566 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 14:56:27.175946 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:27.175923 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90e2aae6-6b60-4b8e-a0ba-12474f425b1d-metrics-certs\") pod \"network-metrics-daemon-h24vc\" (UID: \"90e2aae6-6b60-4b8e-a0ba-12474f425b1d\") " pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:56:27.256153 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:27.256123 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6rqtq\"" Apr 20 14:56:27.264183 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:27.264167 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h24vc" Apr 20 14:56:27.364305 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:27.364271 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cckvm\" (UniqueName: \"kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm\") pod \"network-check-target-j2mjp\" (UID: \"0ca0f6c2-6280-464c-8916-90374e2c88b8\") " pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:56:27.366704 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:27.366679 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cckvm\" (UniqueName: \"kubernetes.io/projected/0ca0f6c2-6280-464c-8916-90374e2c88b8-kube-api-access-cckvm\") pod \"network-check-target-j2mjp\" (UID: \"0ca0f6c2-6280-464c-8916-90374e2c88b8\") " pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:56:27.377015 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:27.376992 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h24vc"] Apr 20 14:56:27.380122 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:56:27.380097 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90e2aae6_6b60_4b8e_a0ba_12474f425b1d.slice/crio-3f5ad462449a70986e4b146d2b1bda8f58425a24c1b0783acf77d5d3c67614c8 WatchSource:0}: Error finding container 3f5ad462449a70986e4b146d2b1bda8f58425a24c1b0783acf77d5d3c67614c8: Status 404 returned error can't find the container with id 3f5ad462449a70986e4b146d2b1bda8f58425a24c1b0783acf77d5d3c67614c8 Apr 20 14:56:27.561324 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:27.561297 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6x78v\"" Apr 20 14:56:27.569570 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:27.569548 
2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:56:27.700992 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:27.700962 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-j2mjp"] Apr 20 14:56:27.703947 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:56:27.703920 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ca0f6c2_6280_464c_8916_90374e2c88b8.slice/crio-95cc748e30f5dfa2401a27fd3e941bce6660a73c9b2557de24bedcb285e0f6a9 WatchSource:0}: Error finding container 95cc748e30f5dfa2401a27fd3e941bce6660a73c9b2557de24bedcb285e0f6a9: Status 404 returned error can't find the container with id 95cc748e30f5dfa2401a27fd3e941bce6660a73c9b2557de24bedcb285e0f6a9 Apr 20 14:56:27.735038 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:27.735009 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-j2mjp" event={"ID":"0ca0f6c2-6280-464c-8916-90374e2c88b8","Type":"ContainerStarted","Data":"95cc748e30f5dfa2401a27fd3e941bce6660a73c9b2557de24bedcb285e0f6a9"} Apr 20 14:56:27.736257 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:27.736229 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h24vc" event={"ID":"90e2aae6-6b60-4b8e-a0ba-12474f425b1d","Type":"ContainerStarted","Data":"3f5ad462449a70986e4b146d2b1bda8f58425a24c1b0783acf77d5d3c67614c8"} Apr 20 14:56:28.740385 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:28.740282 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-j2mjp" event={"ID":"0ca0f6c2-6280-464c-8916-90374e2c88b8","Type":"ContainerStarted","Data":"a75ede76cbd4bb8dd0763f06457f9c1e5ba36acbd9697c3ad53f8727a70ee3d1"} Apr 20 14:56:28.740790 ip-10-0-130-249 
kubenswrapper[2574]: I0420 14:56:28.740363 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:56:28.741718 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:28.741698 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h24vc" event={"ID":"90e2aae6-6b60-4b8e-a0ba-12474f425b1d","Type":"ContainerStarted","Data":"e86a198db68f8734cd001d079f9bac47dfcbc31b3ddf24ead90eb90c559b1b65"} Apr 20 14:56:28.741804 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:28.741722 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h24vc" event={"ID":"90e2aae6-6b60-4b8e-a0ba-12474f425b1d","Type":"ContainerStarted","Data":"0496e0f66f2420785e0857d9a1bfcbe60d3a8243f0eb37434ebe588e2e32f924"} Apr 20 14:56:28.755475 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:28.755436 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-j2mjp" podStartSLOduration=67.755423362 podStartE2EDuration="1m7.755423362s" podCreationTimestamp="2026-04-20 14:55:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:56:28.754255944 +0000 UTC m=+67.827531570" watchObservedRunningTime="2026-04-20 14:56:28.755423362 +0000 UTC m=+67.828698986" Apr 20 14:56:28.769818 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:28.769781 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-h24vc" podStartSLOduration=66.825270941 podStartE2EDuration="1m7.769770098s" podCreationTimestamp="2026-04-20 14:55:21 +0000 UTC" firstStartedPulling="2026-04-20 14:56:27.382119485 +0000 UTC m=+66.455395088" lastFinishedPulling="2026-04-20 14:56:28.326618638 +0000 UTC m=+67.399894245" observedRunningTime="2026-04-20 
14:56:28.768775679 +0000 UTC m=+67.842051306" watchObservedRunningTime="2026-04-20 14:56:28.769770098 +0000 UTC m=+67.843045723" Apr 20 14:56:28.875750 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:28.875713 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2whnr\" (UID: \"27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr" Apr 20 14:56:28.878115 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:28.878088 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-2whnr\" (UID: \"27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr" Apr 20 14:56:28.976487 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:28.976451 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-l7992\" (UID: \"9444ff6f-3ede-40a2-a63c-97c92b90d755\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-l7992" Apr 20 14:56:28.978687 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:28.978661 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9444ff6f-3ede-40a2-a63c-97c92b90d755-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-l7992\" (UID: \"9444ff6f-3ede-40a2-a63c-97c92b90d755\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-l7992" Apr 20 14:56:29.156639 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.156613 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-nsgjj\"" Apr 20 14:56:29.164719 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.164702 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr" Apr 20 14:56:29.267073 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.267043 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-2ljbw\"" Apr 20 14:56:29.273029 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.273005 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr"] Apr 20 14:56:29.275464 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.275440 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-l7992" Apr 20 14:56:29.275776 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:56:29.275755 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27348b4c_3c6d_4f5c_aecc_ee4f7ea4eac8.slice/crio-7d841d8d0867be9d0eb26ad77cce6ce62e85f81d319b80225bc6f71c520c7008 WatchSource:0}: Error finding container 7d841d8d0867be9d0eb26ad77cce6ce62e85f81d319b80225bc6f71c520c7008: Status 404 returned error can't find the container with id 7d841d8d0867be9d0eb26ad77cce6ce62e85f81d319b80225bc6f71c520c7008 Apr 20 14:56:29.392318 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.392286 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-l7992"] Apr 20 14:56:29.395088 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:56:29.395059 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9444ff6f_3ede_40a2_a63c_97c92b90d755.slice/crio-611f75ad759bf80cb2907e709d0d2c45ec4e345d8258526b80abe314b6956483 WatchSource:0}: Error finding container 611f75ad759bf80cb2907e709d0d2c45ec4e345d8258526b80abe314b6956483: Status 404 returned error can't find the container with id 611f75ad759bf80cb2907e709d0d2c45ec4e345d8258526b80abe314b6956483 Apr 20 14:56:29.746079 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.746039 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr" event={"ID":"27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8","Type":"ContainerStarted","Data":"7d841d8d0867be9d0eb26ad77cce6ce62e85f81d319b80225bc6f71c520c7008"} Apr 20 14:56:29.747158 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.747119 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-l7992" 
event={"ID":"9444ff6f-3ede-40a2-a63c-97c92b90d755","Type":"ContainerStarted","Data":"611f75ad759bf80cb2907e709d0d2c45ec4e345d8258526b80abe314b6956483"} Apr 20 14:56:29.765688 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.765661 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-844d9bc4d-pbbvz"] Apr 20 14:56:29.768914 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.768894 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:56:29.771450 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.771431 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 14:56:29.771559 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.771542 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 14:56:29.773029 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.772696 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 14:56:29.773029 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.772787 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 14:56:29.773029 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.772814 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 14:56:29.773029 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.772880 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 14:56:29.773029 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.772816 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-qxlxn\"" 
Apr 20 14:56:29.773308 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.773096 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 14:56:29.777044 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.777022 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-844d9bc4d-pbbvz"] Apr 20 14:56:29.882242 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.882215 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1df0262d-3921-42b7-a79c-4f7340fcfe7e-oauth-serving-cert\") pod \"console-844d9bc4d-pbbvz\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:56:29.882412 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.882258 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1df0262d-3921-42b7-a79c-4f7340fcfe7e-console-serving-cert\") pod \"console-844d9bc4d-pbbvz\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:56:29.882412 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.882398 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmq6v\" (UniqueName: \"kubernetes.io/projected/1df0262d-3921-42b7-a79c-4f7340fcfe7e-kube-api-access-cmq6v\") pod \"console-844d9bc4d-pbbvz\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:56:29.882533 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.882450 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/1df0262d-3921-42b7-a79c-4f7340fcfe7e-console-oauth-config\") pod \"console-844d9bc4d-pbbvz\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:56:29.882533 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.882483 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1df0262d-3921-42b7-a79c-4f7340fcfe7e-console-config\") pod \"console-844d9bc4d-pbbvz\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:56:29.882533 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.882522 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1df0262d-3921-42b7-a79c-4f7340fcfe7e-service-ca\") pod \"console-844d9bc4d-pbbvz\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:56:29.983686 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.983648 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmq6v\" (UniqueName: \"kubernetes.io/projected/1df0262d-3921-42b7-a79c-4f7340fcfe7e-kube-api-access-cmq6v\") pod \"console-844d9bc4d-pbbvz\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:56:29.983860 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.983696 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1df0262d-3921-42b7-a79c-4f7340fcfe7e-console-oauth-config\") pod \"console-844d9bc4d-pbbvz\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:56:29.983860 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.983719 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1df0262d-3921-42b7-a79c-4f7340fcfe7e-console-config\") pod \"console-844d9bc4d-pbbvz\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:56:29.983860 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.983746 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1df0262d-3921-42b7-a79c-4f7340fcfe7e-service-ca\") pod \"console-844d9bc4d-pbbvz\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:56:29.983860 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.983800 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1df0262d-3921-42b7-a79c-4f7340fcfe7e-oauth-serving-cert\") pod \"console-844d9bc4d-pbbvz\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:56:29.983860 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.983841 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1df0262d-3921-42b7-a79c-4f7340fcfe7e-console-serving-cert\") pod \"console-844d9bc4d-pbbvz\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:56:29.984712 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.984688 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1df0262d-3921-42b7-a79c-4f7340fcfe7e-console-config\") pod \"console-844d9bc4d-pbbvz\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:56:29.984712 ip-10-0-130-249 
kubenswrapper[2574]: I0420 14:56:29.984694 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1df0262d-3921-42b7-a79c-4f7340fcfe7e-service-ca\") pod \"console-844d9bc4d-pbbvz\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:56:29.984907 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.984694 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1df0262d-3921-42b7-a79c-4f7340fcfe7e-oauth-serving-cert\") pod \"console-844d9bc4d-pbbvz\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:56:29.986484 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.986461 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1df0262d-3921-42b7-a79c-4f7340fcfe7e-console-oauth-config\") pod \"console-844d9bc4d-pbbvz\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:56:29.986610 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.986593 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1df0262d-3921-42b7-a79c-4f7340fcfe7e-console-serving-cert\") pod \"console-844d9bc4d-pbbvz\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:56:29.992580 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:29.992555 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmq6v\" (UniqueName: \"kubernetes.io/projected/1df0262d-3921-42b7-a79c-4f7340fcfe7e-kube-api-access-cmq6v\") pod \"console-844d9bc4d-pbbvz\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 
14:56:30.081651 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:30.081602 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-844d9bc4d-pbbvz"
Apr 20 14:56:30.222076 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:30.222044 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-844d9bc4d-pbbvz"]
Apr 20 14:56:30.408390 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:56:30.408294 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1df0262d_3921_42b7_a79c_4f7340fcfe7e.slice/crio-4faf431c7b6937627056add9646eec0236f10f14e664f3dd06dce913a7030501 WatchSource:0}: Error finding container 4faf431c7b6937627056add9646eec0236f10f14e664f3dd06dce913a7030501: Status 404 returned error can't find the container with id 4faf431c7b6937627056add9646eec0236f10f14e664f3dd06dce913a7030501
Apr 20 14:56:30.752096 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:30.752015 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-844d9bc4d-pbbvz" event={"ID":"1df0262d-3921-42b7-a79c-4f7340fcfe7e","Type":"ContainerStarted","Data":"4faf431c7b6937627056add9646eec0236f10f14e664f3dd06dce913a7030501"}
Apr 20 14:56:31.756980 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:31.756939 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-l7992" event={"ID":"9444ff6f-3ede-40a2-a63c-97c92b90d755","Type":"ContainerStarted","Data":"2cb6ebd2f60939828d3bb2f23261a5e6fa2d10215c32bfec709f6d4c5ea0ddbe"}
Apr 20 14:56:31.758310 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:31.758283 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr" event={"ID":"27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8","Type":"ContainerStarted","Data":"035fde574ab949d6937f4e6a60f60aa7ed3c920b4c6b37cc7272a9245fa7c5ec"}
Apr 20 14:56:31.772666 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:31.772617 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-l7992" podStartSLOduration=33.716857364 podStartE2EDuration="34.772599592s" podCreationTimestamp="2026-04-20 14:55:57 +0000 UTC" firstStartedPulling="2026-04-20 14:56:29.397129749 +0000 UTC m=+68.470405367" lastFinishedPulling="2026-04-20 14:56:30.452871992 +0000 UTC m=+69.526147595" observedRunningTime="2026-04-20 14:56:31.771099196 +0000 UTC m=+70.844374823" watchObservedRunningTime="2026-04-20 14:56:31.772599592 +0000 UTC m=+70.845875219"
Apr 20 14:56:31.790572 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:31.789853 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-2whnr" podStartSLOduration=33.110499214 podStartE2EDuration="34.789840024s" podCreationTimestamp="2026-04-20 14:55:57 +0000 UTC" firstStartedPulling="2026-04-20 14:56:29.279586018 +0000 UTC m=+68.352861620" lastFinishedPulling="2026-04-20 14:56:30.95892681 +0000 UTC m=+70.032202430" observedRunningTime="2026-04-20 14:56:31.788960832 +0000 UTC m=+70.862236457" watchObservedRunningTime="2026-04-20 14:56:31.789840024 +0000 UTC m=+70.863115652"
Apr 20 14:56:33.765005 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:33.764921 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-844d9bc4d-pbbvz" event={"ID":"1df0262d-3921-42b7-a79c-4f7340fcfe7e","Type":"ContainerStarted","Data":"9d06e4cd34ebe7986d575f79d39cc05a8403dba90f78692c179a758d56d8a502"}
Apr 20 14:56:33.781455 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:33.781411 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-844d9bc4d-pbbvz" podStartSLOduration=1.727809577 podStartE2EDuration="4.781395066s" podCreationTimestamp="2026-04-20 14:56:29 +0000 UTC" firstStartedPulling="2026-04-20 14:56:30.41079755 +0000 UTC m=+69.484073155" lastFinishedPulling="2026-04-20 14:56:33.464383037 +0000 UTC m=+72.537658644" observedRunningTime="2026-04-20 14:56:33.780953284 +0000 UTC m=+72.854228909" watchObservedRunningTime="2026-04-20 14:56:33.781395066 +0000 UTC m=+72.854670689"
Apr 20 14:56:38.552092 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.552055 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c55d4464d-gjrx7"]
Apr 20 14:56:38.554154 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.554132 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.562849 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.562828 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 20 14:56:38.563402 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.563384 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c55d4464d-gjrx7"]
Apr 20 14:56:38.646343 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.646309 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-console-config\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.646343 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.646348 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-service-ca\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.646544 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.646387 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b961b38-b91c-4c91-8747-427f78e55420-console-serving-cert\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.646544 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.646438 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8b961b38-b91c-4c91-8747-427f78e55420-console-oauth-config\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.646544 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.646454 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjkft\" (UniqueName: \"kubernetes.io/projected/8b961b38-b91c-4c91-8747-427f78e55420-kube-api-access-xjkft\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.646544 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.646473 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-trusted-ca-bundle\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.646544 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.646507 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-oauth-serving-cert\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.747636 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.747605 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8b961b38-b91c-4c91-8747-427f78e55420-console-oauth-config\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.747636 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.747645 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjkft\" (UniqueName: \"kubernetes.io/projected/8b961b38-b91c-4c91-8747-427f78e55420-kube-api-access-xjkft\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.747903 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.747674 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-trusted-ca-bundle\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.747903 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.747717 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-oauth-serving-cert\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.747903 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.747767 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-console-config\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.747903 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.747797 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-service-ca\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.747903 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.747823 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b961b38-b91c-4c91-8747-427f78e55420-console-serving-cert\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.748579 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.748553 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-service-ca\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.748579 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.748568 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-oauth-serving-cert\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.748726 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.748558 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-console-config\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.748988 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.748971 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-trusted-ca-bundle\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.750145 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.750117 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8b961b38-b91c-4c91-8747-427f78e55420-console-oauth-config\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.750352 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.750332 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b961b38-b91c-4c91-8747-427f78e55420-console-serving-cert\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.755526 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.755506 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjkft\" (UniqueName: \"kubernetes.io/projected/8b961b38-b91c-4c91-8747-427f78e55420-kube-api-access-xjkft\") pod \"console-5c55d4464d-gjrx7\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.862968 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.862931 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:38.977380 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:38.977329 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c55d4464d-gjrx7"]
Apr 20 14:56:38.980894 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:56:38.980868 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b961b38_b91c_4c91_8747_427f78e55420.slice/crio-bd037cc65a99bb8f8c9db3a980473df704cf4bcbd07a06547f99ded177362f04 WatchSource:0}: Error finding container bd037cc65a99bb8f8c9db3a980473df704cf4bcbd07a06547f99ded177362f04: Status 404 returned error can't find the container with id bd037cc65a99bb8f8c9db3a980473df704cf4bcbd07a06547f99ded177362f04
Apr 20 14:56:39.782787 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:39.782755 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c55d4464d-gjrx7" event={"ID":"8b961b38-b91c-4c91-8747-427f78e55420","Type":"ContainerStarted","Data":"8233260e2c00d7375ef03a62c092ab185aa75bf597fe6c9c666efc3fbd40d5cc"}
Apr 20 14:56:39.782787 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:39.782788 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c55d4464d-gjrx7" event={"ID":"8b961b38-b91c-4c91-8747-427f78e55420","Type":"ContainerStarted","Data":"bd037cc65a99bb8f8c9db3a980473df704cf4bcbd07a06547f99ded177362f04"}
Apr 20 14:56:39.799652 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:39.799615 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c55d4464d-gjrx7" podStartSLOduration=1.79960261 podStartE2EDuration="1.79960261s" podCreationTimestamp="2026-04-20 14:56:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:56:39.799066144 +0000 UTC m=+78.872341787" watchObservedRunningTime="2026-04-20 14:56:39.79960261 +0000 UTC m=+78.872878234"
Apr 20 14:56:40.081784 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:40.081756 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-844d9bc4d-pbbvz"
Apr 20 14:56:40.081912 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:40.081802 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-844d9bc4d-pbbvz"
Apr 20 14:56:40.086335 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:40.086311 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-844d9bc4d-pbbvz"
Apr 20 14:56:40.789211 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:40.789186 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-844d9bc4d-pbbvz"
Apr 20 14:56:44.729656 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:44.729627 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-79bd4b4857-ktbsk"
Apr 20 14:56:45.914773 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:45.914743 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-n9dx7"]
Apr 20 14:56:45.919845 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:45.919821 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:45.922523 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:45.922497 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 14:56:45.922523 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:45.922510 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 14:56:45.922672 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:45.922586 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-cx2mf\""
Apr 20 14:56:45.923655 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:45.923639 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 14:56:45.923747 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:45.923660 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 14:56:46.006641 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.006614 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4717f11-2104-4860-9b31-d3171a0eacef-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.006754 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.006649 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f4717f11-2104-4860-9b31-d3171a0eacef-node-exporter-accelerators-collector-config\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.006754 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.006675 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f4717f11-2104-4860-9b31-d3171a0eacef-node-exporter-textfile\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.006754 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.006705 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f4717f11-2104-4860-9b31-d3171a0eacef-node-exporter-wtmp\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.006886 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.006793 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f4717f11-2104-4860-9b31-d3171a0eacef-root\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.006886 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.006824 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4717f11-2104-4860-9b31-d3171a0eacef-sys\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.006886 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.006840 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f4717f11-2104-4860-9b31-d3171a0eacef-node-exporter-tls\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.007025 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.006902 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ss2d\" (UniqueName: \"kubernetes.io/projected/f4717f11-2104-4860-9b31-d3171a0eacef-kube-api-access-5ss2d\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.007025 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.006966 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4717f11-2104-4860-9b31-d3171a0eacef-metrics-client-ca\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.108054 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.108027 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4717f11-2104-4860-9b31-d3171a0eacef-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.108190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.108063 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f4717f11-2104-4860-9b31-d3171a0eacef-node-exporter-accelerators-collector-config\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.108190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.108085 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f4717f11-2104-4860-9b31-d3171a0eacef-node-exporter-textfile\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.108190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.108112 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f4717f11-2104-4860-9b31-d3171a0eacef-node-exporter-wtmp\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.108190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.108135 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f4717f11-2104-4860-9b31-d3171a0eacef-root\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.108190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.108151 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4717f11-2104-4860-9b31-d3171a0eacef-sys\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.108190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.108164 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f4717f11-2104-4860-9b31-d3171a0eacef-node-exporter-tls\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.108190 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.108185 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ss2d\" (UniqueName: \"kubernetes.io/projected/f4717f11-2104-4860-9b31-d3171a0eacef-kube-api-access-5ss2d\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.108550 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.108223 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4717f11-2104-4860-9b31-d3171a0eacef-metrics-client-ca\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.108550 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.108239 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f4717f11-2104-4860-9b31-d3171a0eacef-root\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.108550 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.108260 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f4717f11-2104-4860-9b31-d3171a0eacef-sys\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.108550 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:46.108335 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 14:56:46.108550 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:46.108432 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f4717f11-2104-4860-9b31-d3171a0eacef-node-exporter-tls podName:f4717f11-2104-4860-9b31-d3171a0eacef nodeName:}" failed. No retries permitted until 2026-04-20 14:56:46.608410409 +0000 UTC m=+85.681686023 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/f4717f11-2104-4860-9b31-d3171a0eacef-node-exporter-tls") pod "node-exporter-n9dx7" (UID: "f4717f11-2104-4860-9b31-d3171a0eacef") : secret "node-exporter-tls" not found
Apr 20 14:56:46.108550 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.108476 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f4717f11-2104-4860-9b31-d3171a0eacef-node-exporter-wtmp\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.108550 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.108519 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f4717f11-2104-4860-9b31-d3171a0eacef-node-exporter-textfile\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.108807 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.108681 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f4717f11-2104-4860-9b31-d3171a0eacef-node-exporter-accelerators-collector-config\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.108853 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.108843 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f4717f11-2104-4860-9b31-d3171a0eacef-metrics-client-ca\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.110384 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.110355 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f4717f11-2104-4860-9b31-d3171a0eacef-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.119710 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.119682 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ss2d\" (UniqueName: \"kubernetes.io/projected/f4717f11-2104-4860-9b31-d3171a0eacef-kube-api-access-5ss2d\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.612772 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.612744 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f4717f11-2104-4860-9b31-d3171a0eacef-node-exporter-tls\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.614815 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.614785 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f4717f11-2104-4860-9b31-d3171a0eacef-node-exporter-tls\") pod \"node-exporter-n9dx7\" (UID: \"f4717f11-2104-4860-9b31-d3171a0eacef\") " pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.829680 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:46.829649 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-n9dx7"
Apr 20 14:56:46.838948 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:56:46.838922 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4717f11_2104_4860_9b31_d3171a0eacef.slice/crio-823d4323ae3bdb2ddb280400637f74a9d726de400c27009d9f5e6f13b459325f WatchSource:0}: Error finding container 823d4323ae3bdb2ddb280400637f74a9d726de400c27009d9f5e6f13b459325f: Status 404 returned error can't find the container with id 823d4323ae3bdb2ddb280400637f74a9d726de400c27009d9f5e6f13b459325f
Apr 20 14:56:47.805956 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:47.805926 2574 generic.go:358] "Generic (PLEG): container finished" podID="f4717f11-2104-4860-9b31-d3171a0eacef" containerID="8582313947f447193ece51c130c5e05cc8ec81ad84e4f164b3c29a1c4af20723" exitCode=0
Apr 20 14:56:47.806304 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:47.805994 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n9dx7" event={"ID":"f4717f11-2104-4860-9b31-d3171a0eacef","Type":"ContainerDied","Data":"8582313947f447193ece51c130c5e05cc8ec81ad84e4f164b3c29a1c4af20723"}
Apr 20 14:56:47.806304 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:47.806033 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n9dx7" event={"ID":"f4717f11-2104-4860-9b31-d3171a0eacef","Type":"ContainerStarted","Data":"823d4323ae3bdb2ddb280400637f74a9d726de400c27009d9f5e6f13b459325f"}
Apr 20 14:56:48.810433 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:48.810397 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n9dx7" event={"ID":"f4717f11-2104-4860-9b31-d3171a0eacef","Type":"ContainerStarted","Data":"30086ba12a358f8a9a41820cb834d562d07a80dc7f5d1f98bf28468d7728bbcc"}
Apr 20 14:56:48.810433 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:48.810438 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-n9dx7" event={"ID":"f4717f11-2104-4860-9b31-d3171a0eacef","Type":"ContainerStarted","Data":"758518ef9185975988ab26a6f23d8491be8ecede58e80a4c57ed919c60d21c7c"}
Apr 20 14:56:48.846253 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:48.846202 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-n9dx7" podStartSLOduration=3.106788775 podStartE2EDuration="3.846185068s" podCreationTimestamp="2026-04-20 14:56:45 +0000 UTC" firstStartedPulling="2026-04-20 14:56:46.840900592 +0000 UTC m=+85.914176199" lastFinishedPulling="2026-04-20 14:56:47.580296885 +0000 UTC m=+86.653572492" observedRunningTime="2026-04-20 14:56:48.84483097 +0000 UTC m=+87.918106595" watchObservedRunningTime="2026-04-20 14:56:48.846185068 +0000 UTC m=+87.919460692"
Apr 20 14:56:48.863845 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:48.863808 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:48.863967 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:48.863872 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:48.868307 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:48.868282 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:49.816751 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:49.816725 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c55d4464d-gjrx7"
Apr 20 14:56:49.867292 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:49.867266 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-844d9bc4d-pbbvz"]
Apr 20 14:56:50.676886 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:50.676850 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-h6r27"]
Apr 20 14:56:50.679947 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:50.679931 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h6r27"
Apr 20 14:56:50.682678 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:50.682656 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 20 14:56:50.682766 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:50.682690 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-4bg88\""
Apr 20 14:56:50.687652 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:50.687631 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-h6r27"]
Apr 20 14:56:50.852182 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:50.852152 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1b00210f-5d58-409a-91ec-366aaeae3d8f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-h6r27\" (UID: \"1b00210f-5d58-409a-91ec-366aaeae3d8f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h6r27"
Apr 20 14:56:50.953158 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:50.953084 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1b00210f-5d58-409a-91ec-366aaeae3d8f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-h6r27\" (UID: \"1b00210f-5d58-409a-91ec-366aaeae3d8f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h6r27"
Apr 20 14:56:50.953435 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:50.953404 2574 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 20 14:56:50.953567 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:56:50.953507 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1b00210f-5d58-409a-91ec-366aaeae3d8f-monitoring-plugin-cert podName:1b00210f-5d58-409a-91ec-366aaeae3d8f nodeName:}" failed. No retries permitted until 2026-04-20 14:56:51.45348347 +0000 UTC m=+90.526759116 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/1b00210f-5d58-409a-91ec-366aaeae3d8f-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-h6r27" (UID: "1b00210f-5d58-409a-91ec-366aaeae3d8f") : secret "monitoring-plugin-cert" not found
Apr 20 14:56:51.457804 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:51.457775 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1b00210f-5d58-409a-91ec-366aaeae3d8f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-h6r27\" (UID: \"1b00210f-5d58-409a-91ec-366aaeae3d8f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h6r27"
Apr 20 14:56:51.460105 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:51.460076 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1b00210f-5d58-409a-91ec-366aaeae3d8f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-h6r27\" (UID: \"1b00210f-5d58-409a-91ec-366aaeae3d8f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h6r27"
Apr 20 14:56:51.588890 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:51.588863 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h6r27"
Apr 20 14:56:51.705080 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:51.705058 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-h6r27"]
Apr 20 14:56:51.707651 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:56:51.707620 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b00210f_5d58_409a_91ec_366aaeae3d8f.slice/crio-589939c7419dfd10c482dc07408cf7152ac94fc570c7c9c4e117262de5132378 WatchSource:0}: Error finding container 589939c7419dfd10c482dc07408cf7152ac94fc570c7c9c4e117262de5132378: Status 404 returned error can't find the container with id 589939c7419dfd10c482dc07408cf7152ac94fc570c7c9c4e117262de5132378
Apr 20 14:56:51.818971 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:51.818936 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h6r27" event={"ID":"1b00210f-5d58-409a-91ec-366aaeae3d8f","Type":"ContainerStarted","Data":"589939c7419dfd10c482dc07408cf7152ac94fc570c7c9c4e117262de5132378"}
Apr 20 14:56:52.210452 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.210418 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5fdb9b4f89-ckw6j"]
Apr 20 14:56:52.213625 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.213603 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.230037 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.230009 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fdb9b4f89-ckw6j"] Apr 20 14:56:52.366554 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.366516 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-console-oauth-config\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.366732 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.366593 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-console-serving-cert\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.366732 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.366663 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-console-config\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.366732 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.366701 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-trusted-ca-bundle\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 
14:56:52.366732 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.366721 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7x69\" (UniqueName: \"kubernetes.io/projected/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-kube-api-access-b7x69\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.366927 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.366742 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-oauth-serving-cert\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.366927 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.366778 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-service-ca\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.468051 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.467956 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-console-config\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.468051 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.467995 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-trusted-ca-bundle\") pod 
\"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.468051 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.468022 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7x69\" (UniqueName: \"kubernetes.io/projected/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-kube-api-access-b7x69\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.468051 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.468048 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-oauth-serving-cert\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.468402 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.468086 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-service-ca\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.469442 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.468869 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-console-oauth-config\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.469442 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.468985 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-console-serving-cert\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.469608 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.469459 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-service-ca\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.469608 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.469481 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-trusted-ca-bundle\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.474931 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.470103 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-oauth-serving-cert\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.474931 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.470427 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-console-config\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.474931 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.472802 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-console-serving-cert\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.475732 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.475553 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-console-oauth-config\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.482027 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.482005 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7x69\" (UniqueName: \"kubernetes.io/projected/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-kube-api-access-b7x69\") pod \"console-5fdb9b4f89-ckw6j\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.524906 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.524876 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:56:52.655968 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:52.655589 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fdb9b4f89-ckw6j"] Apr 20 14:56:52.906383 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:56:52.906344 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d23fa62_b327_4aaa_a8cd_4fc4484a2273.slice/crio-2b75297d034876fd3cfc5a263d31a91434b0440cba6aa94bd578214d03edc6c8 WatchSource:0}: Error finding container 2b75297d034876fd3cfc5a263d31a91434b0440cba6aa94bd578214d03edc6c8: Status 404 returned error can't find the container with id 2b75297d034876fd3cfc5a263d31a91434b0440cba6aa94bd578214d03edc6c8 Apr 20 14:56:53.826519 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:53.826481 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h6r27" event={"ID":"1b00210f-5d58-409a-91ec-366aaeae3d8f","Type":"ContainerStarted","Data":"39828dd9c6d5279978fdd846d7fbc5ed65db35a6a7d9d466ba5fd333b3613788"} Apr 20 14:56:53.826965 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:53.826646 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h6r27" Apr 20 14:56:53.827992 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:53.827965 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fdb9b4f89-ckw6j" event={"ID":"0d23fa62-b327-4aaa-a8cd-4fc4484a2273","Type":"ContainerStarted","Data":"491c86dddecc9a1dad17d4b34161aef302d3f18d76c0fd156dfc9962658a8c5b"} Apr 20 14:56:53.827992 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:53.827994 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fdb9b4f89-ckw6j" 
event={"ID":"0d23fa62-b327-4aaa-a8cd-4fc4484a2273","Type":"ContainerStarted","Data":"2b75297d034876fd3cfc5a263d31a91434b0440cba6aa94bd578214d03edc6c8"} Apr 20 14:56:53.831651 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:53.831631 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h6r27" Apr 20 14:56:53.843448 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:53.843411 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-h6r27" podStartSLOduration=2.599693798 podStartE2EDuration="3.843400871s" podCreationTimestamp="2026-04-20 14:56:50 +0000 UTC" firstStartedPulling="2026-04-20 14:56:51.709556797 +0000 UTC m=+90.782832414" lastFinishedPulling="2026-04-20 14:56:52.953263868 +0000 UTC m=+92.026539487" observedRunningTime="2026-04-20 14:56:53.841856439 +0000 UTC m=+92.915132076" watchObservedRunningTime="2026-04-20 14:56:53.843400871 +0000 UTC m=+92.916676495" Apr 20 14:56:53.859436 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:53.859393 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5fdb9b4f89-ckw6j" podStartSLOduration=1.859360694 podStartE2EDuration="1.859360694s" podCreationTimestamp="2026-04-20 14:56:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:56:53.858412003 +0000 UTC m=+92.931687628" watchObservedRunningTime="2026-04-20 14:56:53.859360694 +0000 UTC m=+92.932636318" Apr 20 14:56:59.750360 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:56:59.750328 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-j2mjp" Apr 20 14:57:02.525482 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:02.525445 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:57:02.525482 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:02.525487 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:57:02.530416 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:02.530396 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:57:02.860807 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:02.860776 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:57:02.904278 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:02.904246 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c55d4464d-gjrx7"] Apr 20 14:57:14.887780 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:14.887731 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-844d9bc4d-pbbvz" podUID="1df0262d-3921-42b7-a79c-4f7340fcfe7e" containerName="console" containerID="cri-o://9d06e4cd34ebe7986d575f79d39cc05a8403dba90f78692c179a758d56d8a502" gracePeriod=15 Apr 20 14:57:15.124995 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.124974 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-844d9bc4d-pbbvz_1df0262d-3921-42b7-a79c-4f7340fcfe7e/console/0.log" Apr 20 14:57:15.125109 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.125042 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:57:15.243534 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.243474 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmq6v\" (UniqueName: \"kubernetes.io/projected/1df0262d-3921-42b7-a79c-4f7340fcfe7e-kube-api-access-cmq6v\") pod \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " Apr 20 14:57:15.243642 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.243545 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1df0262d-3921-42b7-a79c-4f7340fcfe7e-oauth-serving-cert\") pod \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " Apr 20 14:57:15.243642 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.243569 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1df0262d-3921-42b7-a79c-4f7340fcfe7e-service-ca\") pod \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " Apr 20 14:57:15.243727 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.243639 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1df0262d-3921-42b7-a79c-4f7340fcfe7e-console-config\") pod \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " Apr 20 14:57:15.243727 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.243672 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1df0262d-3921-42b7-a79c-4f7340fcfe7e-console-oauth-config\") pod \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " Apr 20 14:57:15.243727 ip-10-0-130-249 
kubenswrapper[2574]: I0420 14:57:15.243700 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1df0262d-3921-42b7-a79c-4f7340fcfe7e-console-serving-cert\") pod \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\" (UID: \"1df0262d-3921-42b7-a79c-4f7340fcfe7e\") " Apr 20 14:57:15.244064 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.244030 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1df0262d-3921-42b7-a79c-4f7340fcfe7e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1df0262d-3921-42b7-a79c-4f7340fcfe7e" (UID: "1df0262d-3921-42b7-a79c-4f7340fcfe7e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:57:15.244178 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.244058 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1df0262d-3921-42b7-a79c-4f7340fcfe7e-service-ca" (OuterVolumeSpecName: "service-ca") pod "1df0262d-3921-42b7-a79c-4f7340fcfe7e" (UID: "1df0262d-3921-42b7-a79c-4f7340fcfe7e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:57:15.244178 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.244038 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1df0262d-3921-42b7-a79c-4f7340fcfe7e-console-config" (OuterVolumeSpecName: "console-config") pod "1df0262d-3921-42b7-a79c-4f7340fcfe7e" (UID: "1df0262d-3921-42b7-a79c-4f7340fcfe7e"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:57:15.245787 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.245755 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df0262d-3921-42b7-a79c-4f7340fcfe7e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1df0262d-3921-42b7-a79c-4f7340fcfe7e" (UID: "1df0262d-3921-42b7-a79c-4f7340fcfe7e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:15.245876 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.245811 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1df0262d-3921-42b7-a79c-4f7340fcfe7e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1df0262d-3921-42b7-a79c-4f7340fcfe7e" (UID: "1df0262d-3921-42b7-a79c-4f7340fcfe7e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:15.245876 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.245816 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1df0262d-3921-42b7-a79c-4f7340fcfe7e-kube-api-access-cmq6v" (OuterVolumeSpecName: "kube-api-access-cmq6v") pod "1df0262d-3921-42b7-a79c-4f7340fcfe7e" (UID: "1df0262d-3921-42b7-a79c-4f7340fcfe7e"). InnerVolumeSpecName "kube-api-access-cmq6v". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:57:15.344901 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.344878 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1df0262d-3921-42b7-a79c-4f7340fcfe7e-oauth-serving-cert\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:57:15.344901 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.344900 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1df0262d-3921-42b7-a79c-4f7340fcfe7e-service-ca\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:57:15.345032 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.344909 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1df0262d-3921-42b7-a79c-4f7340fcfe7e-console-config\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:57:15.345032 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.344918 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1df0262d-3921-42b7-a79c-4f7340fcfe7e-console-oauth-config\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:57:15.345032 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.344926 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1df0262d-3921-42b7-a79c-4f7340fcfe7e-console-serving-cert\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:57:15.345032 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.344935 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cmq6v\" (UniqueName: \"kubernetes.io/projected/1df0262d-3921-42b7-a79c-4f7340fcfe7e-kube-api-access-cmq6v\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:57:15.892994 ip-10-0-130-249 
kubenswrapper[2574]: I0420 14:57:15.892969 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-844d9bc4d-pbbvz_1df0262d-3921-42b7-a79c-4f7340fcfe7e/console/0.log" Apr 20 14:57:15.893456 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.893006 2574 generic.go:358] "Generic (PLEG): container finished" podID="1df0262d-3921-42b7-a79c-4f7340fcfe7e" containerID="9d06e4cd34ebe7986d575f79d39cc05a8403dba90f78692c179a758d56d8a502" exitCode=2 Apr 20 14:57:15.893456 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.893076 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-844d9bc4d-pbbvz" event={"ID":"1df0262d-3921-42b7-a79c-4f7340fcfe7e","Type":"ContainerDied","Data":"9d06e4cd34ebe7986d575f79d39cc05a8403dba90f78692c179a758d56d8a502"} Apr 20 14:57:15.893456 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.893091 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-844d9bc4d-pbbvz" Apr 20 14:57:15.893456 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.893101 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-844d9bc4d-pbbvz" event={"ID":"1df0262d-3921-42b7-a79c-4f7340fcfe7e","Type":"ContainerDied","Data":"4faf431c7b6937627056add9646eec0236f10f14e664f3dd06dce913a7030501"} Apr 20 14:57:15.893456 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.893114 2574 scope.go:117] "RemoveContainer" containerID="9d06e4cd34ebe7986d575f79d39cc05a8403dba90f78692c179a758d56d8a502" Apr 20 14:57:15.900886 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.900865 2574 scope.go:117] "RemoveContainer" containerID="9d06e4cd34ebe7986d575f79d39cc05a8403dba90f78692c179a758d56d8a502" Apr 20 14:57:15.901160 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:57:15.901130 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9d06e4cd34ebe7986d575f79d39cc05a8403dba90f78692c179a758d56d8a502\": container with ID starting with 9d06e4cd34ebe7986d575f79d39cc05a8403dba90f78692c179a758d56d8a502 not found: ID does not exist" containerID="9d06e4cd34ebe7986d575f79d39cc05a8403dba90f78692c179a758d56d8a502" Apr 20 14:57:15.901226 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.901167 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d06e4cd34ebe7986d575f79d39cc05a8403dba90f78692c179a758d56d8a502"} err="failed to get container status \"9d06e4cd34ebe7986d575f79d39cc05a8403dba90f78692c179a758d56d8a502\": rpc error: code = NotFound desc = could not find container \"9d06e4cd34ebe7986d575f79d39cc05a8403dba90f78692c179a758d56d8a502\": container with ID starting with 9d06e4cd34ebe7986d575f79d39cc05a8403dba90f78692c179a758d56d8a502 not found: ID does not exist" Apr 20 14:57:15.908451 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.908429 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-844d9bc4d-pbbvz"] Apr 20 14:57:15.914169 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:15.914152 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-844d9bc4d-pbbvz"] Apr 20 14:57:17.446708 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:17.446673 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1df0262d-3921-42b7-a79c-4f7340fcfe7e" path="/var/lib/kubelet/pods/1df0262d-3921-42b7-a79c-4f7340fcfe7e/volumes" Apr 20 14:57:18.903151 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:18.903122 2574 generic.go:358] "Generic (PLEG): container finished" podID="332fde80-3942-477a-918e-84086221c09b" containerID="ecb7d6ffa9c5a6ab9ec8fcab7950d8d28559598291a7cb9552a1908ccbfe9d98" exitCode=0 Apr 20 14:57:18.903489 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:18.903158 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mv44d" event={"ID":"332fde80-3942-477a-918e-84086221c09b","Type":"ContainerDied","Data":"ecb7d6ffa9c5a6ab9ec8fcab7950d8d28559598291a7cb9552a1908ccbfe9d98"} Apr 20 14:57:18.903489 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:18.903482 2574 scope.go:117] "RemoveContainer" containerID="ecb7d6ffa9c5a6ab9ec8fcab7950d8d28559598291a7cb9552a1908ccbfe9d98" Apr 20 14:57:19.907906 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:19.907864 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-mv44d" event={"ID":"332fde80-3942-477a-918e-84086221c09b","Type":"ContainerStarted","Data":"97da47257dca51eadee82962f1743b7e1a9cc33bc93d8e3b404bfbca92df00c1"} Apr 20 14:57:27.924719 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:27.924678 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5c55d4464d-gjrx7" podUID="8b961b38-b91c-4c91-8747-427f78e55420" containerName="console" containerID="cri-o://8233260e2c00d7375ef03a62c092ab185aa75bf597fe6c9c666efc3fbd40d5cc" gracePeriod=15 Apr 20 14:57:28.164001 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.163978 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c55d4464d-gjrx7_8b961b38-b91c-4c91-8747-427f78e55420/console/0.log" Apr 20 14:57:28.164113 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.164038 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c55d4464d-gjrx7" Apr 20 14:57:28.239632 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.239560 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b961b38-b91c-4c91-8747-427f78e55420-console-serving-cert\") pod \"8b961b38-b91c-4c91-8747-427f78e55420\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " Apr 20 14:57:28.239632 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.239590 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-oauth-serving-cert\") pod \"8b961b38-b91c-4c91-8747-427f78e55420\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " Apr 20 14:57:28.239632 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.239609 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8b961b38-b91c-4c91-8747-427f78e55420-console-oauth-config\") pod \"8b961b38-b91c-4c91-8747-427f78e55420\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " Apr 20 14:57:28.239884 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.239642 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-service-ca\") pod \"8b961b38-b91c-4c91-8747-427f78e55420\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " Apr 20 14:57:28.239884 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.239704 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-console-config\") pod \"8b961b38-b91c-4c91-8747-427f78e55420\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " Apr 20 14:57:28.239884 ip-10-0-130-249 
kubenswrapper[2574]: I0420 14:57:28.239737 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjkft\" (UniqueName: \"kubernetes.io/projected/8b961b38-b91c-4c91-8747-427f78e55420-kube-api-access-xjkft\") pod \"8b961b38-b91c-4c91-8747-427f78e55420\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " Apr 20 14:57:28.239884 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.239772 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-trusted-ca-bundle\") pod \"8b961b38-b91c-4c91-8747-427f78e55420\" (UID: \"8b961b38-b91c-4c91-8747-427f78e55420\") " Apr 20 14:57:28.240107 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.240077 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8b961b38-b91c-4c91-8747-427f78e55420" (UID: "8b961b38-b91c-4c91-8747-427f78e55420"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:57:28.240193 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.240130 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-console-config" (OuterVolumeSpecName: "console-config") pod "8b961b38-b91c-4c91-8747-427f78e55420" (UID: "8b961b38-b91c-4c91-8747-427f78e55420"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:57:28.240249 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.240136 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-service-ca" (OuterVolumeSpecName: "service-ca") pod "8b961b38-b91c-4c91-8747-427f78e55420" (UID: "8b961b38-b91c-4c91-8747-427f78e55420"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:57:28.240352 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.240335 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8b961b38-b91c-4c91-8747-427f78e55420" (UID: "8b961b38-b91c-4c91-8747-427f78e55420"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:57:28.241787 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.241762 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b961b38-b91c-4c91-8747-427f78e55420-kube-api-access-xjkft" (OuterVolumeSpecName: "kube-api-access-xjkft") pod "8b961b38-b91c-4c91-8747-427f78e55420" (UID: "8b961b38-b91c-4c91-8747-427f78e55420"). InnerVolumeSpecName "kube-api-access-xjkft". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:57:28.241894 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.241832 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b961b38-b91c-4c91-8747-427f78e55420-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8b961b38-b91c-4c91-8747-427f78e55420" (UID: "8b961b38-b91c-4c91-8747-427f78e55420"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:28.241894 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.241852 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b961b38-b91c-4c91-8747-427f78e55420-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8b961b38-b91c-4c91-8747-427f78e55420" (UID: "8b961b38-b91c-4c91-8747-427f78e55420"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:57:28.341276 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.341247 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8b961b38-b91c-4c91-8747-427f78e55420-console-serving-cert\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:57:28.341276 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.341272 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-oauth-serving-cert\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:57:28.341451 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.341285 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8b961b38-b91c-4c91-8747-427f78e55420-console-oauth-config\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:57:28.341451 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.341298 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-service-ca\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:57:28.341451 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.341309 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-console-config\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:57:28.341451 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.341323 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xjkft\" (UniqueName: \"kubernetes.io/projected/8b961b38-b91c-4c91-8747-427f78e55420-kube-api-access-xjkft\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:57:28.341451 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.341337 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b961b38-b91c-4c91-8747-427f78e55420-trusted-ca-bundle\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:57:28.937055 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.937026 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5c55d4464d-gjrx7_8b961b38-b91c-4c91-8747-427f78e55420/console/0.log" Apr 20 14:57:28.937460 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.937068 2574 generic.go:358] "Generic (PLEG): container finished" podID="8b961b38-b91c-4c91-8747-427f78e55420" containerID="8233260e2c00d7375ef03a62c092ab185aa75bf597fe6c9c666efc3fbd40d5cc" exitCode=2 Apr 20 14:57:28.937460 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.937130 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5c55d4464d-gjrx7" Apr 20 14:57:28.937460 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.937149 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c55d4464d-gjrx7" event={"ID":"8b961b38-b91c-4c91-8747-427f78e55420","Type":"ContainerDied","Data":"8233260e2c00d7375ef03a62c092ab185aa75bf597fe6c9c666efc3fbd40d5cc"} Apr 20 14:57:28.937460 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.937183 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c55d4464d-gjrx7" event={"ID":"8b961b38-b91c-4c91-8747-427f78e55420","Type":"ContainerDied","Data":"bd037cc65a99bb8f8c9db3a980473df704cf4bcbd07a06547f99ded177362f04"} Apr 20 14:57:28.937460 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.937198 2574 scope.go:117] "RemoveContainer" containerID="8233260e2c00d7375ef03a62c092ab185aa75bf597fe6c9c666efc3fbd40d5cc" Apr 20 14:57:28.945329 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.945298 2574 scope.go:117] "RemoveContainer" containerID="8233260e2c00d7375ef03a62c092ab185aa75bf597fe6c9c666efc3fbd40d5cc" Apr 20 14:57:28.945667 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:57:28.945648 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8233260e2c00d7375ef03a62c092ab185aa75bf597fe6c9c666efc3fbd40d5cc\": container with ID starting with 8233260e2c00d7375ef03a62c092ab185aa75bf597fe6c9c666efc3fbd40d5cc not found: ID does not exist" containerID="8233260e2c00d7375ef03a62c092ab185aa75bf597fe6c9c666efc3fbd40d5cc" Apr 20 14:57:28.945728 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.945678 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8233260e2c00d7375ef03a62c092ab185aa75bf597fe6c9c666efc3fbd40d5cc"} err="failed to get container status \"8233260e2c00d7375ef03a62c092ab185aa75bf597fe6c9c666efc3fbd40d5cc\": rpc error: code = 
NotFound desc = could not find container \"8233260e2c00d7375ef03a62c092ab185aa75bf597fe6c9c666efc3fbd40d5cc\": container with ID starting with 8233260e2c00d7375ef03a62c092ab185aa75bf597fe6c9c666efc3fbd40d5cc not found: ID does not exist" Apr 20 14:57:28.957544 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.957524 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5c55d4464d-gjrx7"] Apr 20 14:57:28.960996 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:28.960977 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5c55d4464d-gjrx7"] Apr 20 14:57:29.450200 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:29.450158 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b961b38-b91c-4c91-8747-427f78e55420" path="/var/lib/kubelet/pods/8b961b38-b91c-4c91-8747-427f78e55420/volumes" Apr 20 14:57:38.965039 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:38.965003 2574 generic.go:358] "Generic (PLEG): container finished" podID="1b93bf46-c126-4ef5-9add-d72c0cbb7dae" containerID="60f85cde85c2af646080cfbe88b07f8163e29a110e2f38ecdeb6d396dcc00969" exitCode=0 Apr 20 14:57:38.965457 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:38.965079 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xc2s2" event={"ID":"1b93bf46-c126-4ef5-9add-d72c0cbb7dae","Type":"ContainerDied","Data":"60f85cde85c2af646080cfbe88b07f8163e29a110e2f38ecdeb6d396dcc00969"} Apr 20 14:57:38.965457 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:38.965431 2574 scope.go:117] "RemoveContainer" containerID="60f85cde85c2af646080cfbe88b07f8163e29a110e2f38ecdeb6d396dcc00969" Apr 20 14:57:39.969829 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:57:39.969794 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-xc2s2" 
event={"ID":"1b93bf46-c126-4ef5-9add-d72c0cbb7dae","Type":"ContainerStarted","Data":"b668a6f755d9646d417741865c28487094f11dd07ce3e359cb4d2a7c68d77e74"} Apr 20 14:58:13.253315 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.253239 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-f945bf794-r7kxr"] Apr 20 14:58:13.253711 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.253519 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8b961b38-b91c-4c91-8747-427f78e55420" containerName="console" Apr 20 14:58:13.253711 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.253530 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b961b38-b91c-4c91-8747-427f78e55420" containerName="console" Apr 20 14:58:13.253711 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.253540 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1df0262d-3921-42b7-a79c-4f7340fcfe7e" containerName="console" Apr 20 14:58:13.253711 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.253545 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df0262d-3921-42b7-a79c-4f7340fcfe7e" containerName="console" Apr 20 14:58:13.253711 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.253603 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="8b961b38-b91c-4c91-8747-427f78e55420" containerName="console" Apr 20 14:58:13.253711 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.253611 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="1df0262d-3921-42b7-a79c-4f7340fcfe7e" containerName="console" Apr 20 14:58:13.256284 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.256268 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.271414 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.271390 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdd2de80-8f28-43db-b489-d8d85e551a61-console-serving-cert\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.271523 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.271431 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-service-ca\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.271523 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.271491 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-trusted-ca-bundle\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.271600 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.271520 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5bp9\" (UniqueName: \"kubernetes.io/projected/bdd2de80-8f28-43db-b489-d8d85e551a61-kube-api-access-b5bp9\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.271600 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.271557 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-oauth-serving-cert\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.271600 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.271574 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-console-config\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.271691 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.271598 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bdd2de80-8f28-43db-b489-d8d85e551a61-console-oauth-config\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.292141 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.292120 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f945bf794-r7kxr"] Apr 20 14:58:13.372598 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.372570 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-service-ca\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.372727 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.372615 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-trusted-ca-bundle\") pod \"console-f945bf794-r7kxr\" (UID: 
\"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.372727 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.372640 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b5bp9\" (UniqueName: \"kubernetes.io/projected/bdd2de80-8f28-43db-b489-d8d85e551a61-kube-api-access-b5bp9\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.372727 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.372683 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-oauth-serving-cert\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.372727 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.372715 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-console-config\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.372898 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.372763 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bdd2de80-8f28-43db-b489-d8d85e551a61-console-oauth-config\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.372898 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.372815 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bdd2de80-8f28-43db-b489-d8d85e551a61-console-serving-cert\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.373311 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.373287 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-service-ca\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.373476 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.373452 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-console-config\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.373476 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.373470 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-oauth-serving-cert\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.373612 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.373480 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-trusted-ca-bundle\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.375155 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.375134 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/bdd2de80-8f28-43db-b489-d8d85e551a61-console-serving-cert\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.375228 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.375177 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bdd2de80-8f28-43db-b489-d8d85e551a61-console-oauth-config\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.385294 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.385275 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5bp9\" (UniqueName: \"kubernetes.io/projected/bdd2de80-8f28-43db-b489-d8d85e551a61-kube-api-access-b5bp9\") pod \"console-f945bf794-r7kxr\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.564796 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.564764 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:13.679172 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:13.679132 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f945bf794-r7kxr"] Apr 20 14:58:13.681782 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:58:13.681750 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdd2de80_8f28_43db_b489_d8d85e551a61.slice/crio-cc1616a04886fff0b595b8472390255d4bed92929625f500ee243ec3d10317c3 WatchSource:0}: Error finding container cc1616a04886fff0b595b8472390255d4bed92929625f500ee243ec3d10317c3: Status 404 returned error can't find the container with id cc1616a04886fff0b595b8472390255d4bed92929625f500ee243ec3d10317c3 Apr 20 14:58:14.072383 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:14.072338 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f945bf794-r7kxr" event={"ID":"bdd2de80-8f28-43db-b489-d8d85e551a61","Type":"ContainerStarted","Data":"4bdcfadd1a631de8b82675f11b0cb6ffefda6735a44d52b78778ec601d0edeb6"} Apr 20 14:58:14.072568 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:14.072398 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f945bf794-r7kxr" event={"ID":"bdd2de80-8f28-43db-b489-d8d85e551a61","Type":"ContainerStarted","Data":"cc1616a04886fff0b595b8472390255d4bed92929625f500ee243ec3d10317c3"} Apr 20 14:58:14.110893 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:14.110848 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f945bf794-r7kxr" podStartSLOduration=1.110835734 podStartE2EDuration="1.110835734s" podCreationTimestamp="2026-04-20 14:58:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:58:14.110788782 +0000 UTC m=+173.184064408" 
watchObservedRunningTime="2026-04-20 14:58:14.110835734 +0000 UTC m=+173.184111359" Apr 20 14:58:21.217424 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.217390 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f945bf794-r7kxr"] Apr 20 14:58:21.245871 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.245843 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-844794986-75jsk"] Apr 20 14:58:21.250850 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.250458 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.262310 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.262289 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-844794986-75jsk"] Apr 20 14:58:21.318509 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.318481 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-service-ca\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.318681 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.318518 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-trusted-ca-bundle\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.318681 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.318554 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-oauth-serving-cert\") pod 
\"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.318681 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.318617 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a549cdfd-6a71-4221-aab3-acad67d5bbc0-console-oauth-config\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.318681 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.318673 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gqkg\" (UniqueName: \"kubernetes.io/projected/a549cdfd-6a71-4221-aab3-acad67d5bbc0-kube-api-access-7gqkg\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.318921 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.318723 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a549cdfd-6a71-4221-aab3-acad67d5bbc0-console-serving-cert\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.318921 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.318767 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-console-config\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.419103 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.419076 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-service-ca\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.419244 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.419108 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-trusted-ca-bundle\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.419244 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.419128 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-oauth-serving-cert\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.419244 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.419151 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a549cdfd-6a71-4221-aab3-acad67d5bbc0-console-oauth-config\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.419244 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.419185 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gqkg\" (UniqueName: \"kubernetes.io/projected/a549cdfd-6a71-4221-aab3-acad67d5bbc0-kube-api-access-7gqkg\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.419244 ip-10-0-130-249 
kubenswrapper[2574]: I0420 14:58:21.419220 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a549cdfd-6a71-4221-aab3-acad67d5bbc0-console-serving-cert\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.419549 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.419258 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-console-config\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.419881 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.419849 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-oauth-serving-cert\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.419994 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.419877 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-service-ca\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.420163 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.420141 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-trusted-ca-bundle\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 
14:58:21.420357 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.420339 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-console-config\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.422165 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.422146 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a549cdfd-6a71-4221-aab3-acad67d5bbc0-console-oauth-config\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.422260 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.422145 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a549cdfd-6a71-4221-aab3-acad67d5bbc0-console-serving-cert\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.427549 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.427528 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gqkg\" (UniqueName: \"kubernetes.io/projected/a549cdfd-6a71-4221-aab3-acad67d5bbc0-kube-api-access-7gqkg\") pod \"console-844794986-75jsk\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.560147 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.560125 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:21.678247 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:21.678213 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-844794986-75jsk"] Apr 20 14:58:21.681407 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:58:21.681361 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda549cdfd_6a71_4221_aab3_acad67d5bbc0.slice/crio-f0e6693a2dcf06370096d5bd8df7d87e62537219779814d5ebdcc3ac9dbe05ae WatchSource:0}: Error finding container f0e6693a2dcf06370096d5bd8df7d87e62537219779814d5ebdcc3ac9dbe05ae: Status 404 returned error can't find the container with id f0e6693a2dcf06370096d5bd8df7d87e62537219779814d5ebdcc3ac9dbe05ae Apr 20 14:58:22.095973 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:22.095936 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-844794986-75jsk" event={"ID":"a549cdfd-6a71-4221-aab3-acad67d5bbc0","Type":"ContainerStarted","Data":"df658dd3cbb2caf69837ed795c483f9923af8c92f3593007e7bc157e5103e507"} Apr 20 14:58:22.096168 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:22.095981 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-844794986-75jsk" event={"ID":"a549cdfd-6a71-4221-aab3-acad67d5bbc0","Type":"ContainerStarted","Data":"f0e6693a2dcf06370096d5bd8df7d87e62537219779814d5ebdcc3ac9dbe05ae"} Apr 20 14:58:22.115230 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:22.115186 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-844794986-75jsk" podStartSLOduration=1.115172774 podStartE2EDuration="1.115172774s" podCreationTimestamp="2026-04-20 14:58:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:58:22.113777481 +0000 UTC m=+181.187053143" 
watchObservedRunningTime="2026-04-20 14:58:22.115172774 +0000 UTC m=+181.188448398" Apr 20 14:58:23.565902 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:23.565851 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:31.560777 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:31.560741 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:31.561237 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:31.560789 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:31.565445 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:31.565425 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:32.127977 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:32.127952 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-844794986-75jsk" Apr 20 14:58:32.176970 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:32.176940 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fdb9b4f89-ckw6j"] Apr 20 14:58:46.236363 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.236305 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-f945bf794-r7kxr" podUID="bdd2de80-8f28-43db-b489-d8d85e551a61" containerName="console" containerID="cri-o://4bdcfadd1a631de8b82675f11b0cb6ffefda6735a44d52b78778ec601d0edeb6" gracePeriod=15 Apr 20 14:58:46.464007 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.463986 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f945bf794-r7kxr_bdd2de80-8f28-43db-b489-d8d85e551a61/console/0.log" Apr 20 14:58:46.464113 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.464042 
2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:46.497324 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.497274 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-service-ca\") pod \"bdd2de80-8f28-43db-b489-d8d85e551a61\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " Apr 20 14:58:46.497324 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.497317 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5bp9\" (UniqueName: \"kubernetes.io/projected/bdd2de80-8f28-43db-b489-d8d85e551a61-kube-api-access-b5bp9\") pod \"bdd2de80-8f28-43db-b489-d8d85e551a61\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " Apr 20 14:58:46.497482 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.497335 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bdd2de80-8f28-43db-b489-d8d85e551a61-console-oauth-config\") pod \"bdd2de80-8f28-43db-b489-d8d85e551a61\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " Apr 20 14:58:46.497482 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.497353 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdd2de80-8f28-43db-b489-d8d85e551a61-console-serving-cert\") pod \"bdd2de80-8f28-43db-b489-d8d85e551a61\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " Apr 20 14:58:46.497482 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.497409 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-trusted-ca-bundle\") pod \"bdd2de80-8f28-43db-b489-d8d85e551a61\" (UID: 
\"bdd2de80-8f28-43db-b489-d8d85e551a61\") " Apr 20 14:58:46.497482 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.497445 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-oauth-serving-cert\") pod \"bdd2de80-8f28-43db-b489-d8d85e551a61\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " Apr 20 14:58:46.497482 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.497469 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-console-config\") pod \"bdd2de80-8f28-43db-b489-d8d85e551a61\" (UID: \"bdd2de80-8f28-43db-b489-d8d85e551a61\") " Apr 20 14:58:46.497911 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.497723 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-service-ca" (OuterVolumeSpecName: "service-ca") pod "bdd2de80-8f28-43db-b489-d8d85e551a61" (UID: "bdd2de80-8f28-43db-b489-d8d85e551a61"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:58:46.498021 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.497913 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bdd2de80-8f28-43db-b489-d8d85e551a61" (UID: "bdd2de80-8f28-43db-b489-d8d85e551a61"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:58:46.498021 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.497967 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-console-config" (OuterVolumeSpecName: "console-config") pod "bdd2de80-8f28-43db-b489-d8d85e551a61" (UID: "bdd2de80-8f28-43db-b489-d8d85e551a61"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:58:46.498411 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.498387 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bdd2de80-8f28-43db-b489-d8d85e551a61" (UID: "bdd2de80-8f28-43db-b489-d8d85e551a61"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:58:46.499550 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.499525 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdd2de80-8f28-43db-b489-d8d85e551a61-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bdd2de80-8f28-43db-b489-d8d85e551a61" (UID: "bdd2de80-8f28-43db-b489-d8d85e551a61"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:58:46.499698 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.499678 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdd2de80-8f28-43db-b489-d8d85e551a61-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bdd2de80-8f28-43db-b489-d8d85e551a61" (UID: "bdd2de80-8f28-43db-b489-d8d85e551a61"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:58:46.499854 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.499830 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdd2de80-8f28-43db-b489-d8d85e551a61-kube-api-access-b5bp9" (OuterVolumeSpecName: "kube-api-access-b5bp9") pod "bdd2de80-8f28-43db-b489-d8d85e551a61" (UID: "bdd2de80-8f28-43db-b489-d8d85e551a61"). InnerVolumeSpecName "kube-api-access-b5bp9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:58:46.598081 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.598057 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-oauth-serving-cert\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:58:46.598081 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.598080 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-console-config\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:58:46.598212 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.598090 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-service-ca\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:58:46.598212 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.598101 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b5bp9\" (UniqueName: \"kubernetes.io/projected/bdd2de80-8f28-43db-b489-d8d85e551a61-kube-api-access-b5bp9\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:58:46.598212 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.598110 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/bdd2de80-8f28-43db-b489-d8d85e551a61-console-oauth-config\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:58:46.598212 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.598119 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bdd2de80-8f28-43db-b489-d8d85e551a61-console-serving-cert\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:58:46.598212 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:46.598129 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bdd2de80-8f28-43db-b489-d8d85e551a61-trusted-ca-bundle\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:58:47.165650 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:47.165621 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f945bf794-r7kxr_bdd2de80-8f28-43db-b489-d8d85e551a61/console/0.log" Apr 20 14:58:47.165816 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:47.165664 2574 generic.go:358] "Generic (PLEG): container finished" podID="bdd2de80-8f28-43db-b489-d8d85e551a61" containerID="4bdcfadd1a631de8b82675f11b0cb6ffefda6735a44d52b78778ec601d0edeb6" exitCode=2 Apr 20 14:58:47.165816 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:47.165695 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f945bf794-r7kxr" event={"ID":"bdd2de80-8f28-43db-b489-d8d85e551a61","Type":"ContainerDied","Data":"4bdcfadd1a631de8b82675f11b0cb6ffefda6735a44d52b78778ec601d0edeb6"} Apr 20 14:58:47.165816 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:47.165733 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f945bf794-r7kxr" event={"ID":"bdd2de80-8f28-43db-b489-d8d85e551a61","Type":"ContainerDied","Data":"cc1616a04886fff0b595b8472390255d4bed92929625f500ee243ec3d10317c3"} Apr 20 14:58:47.165816 
ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:47.165733 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f945bf794-r7kxr" Apr 20 14:58:47.165816 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:47.165750 2574 scope.go:117] "RemoveContainer" containerID="4bdcfadd1a631de8b82675f11b0cb6ffefda6735a44d52b78778ec601d0edeb6" Apr 20 14:58:47.174147 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:47.174130 2574 scope.go:117] "RemoveContainer" containerID="4bdcfadd1a631de8b82675f11b0cb6ffefda6735a44d52b78778ec601d0edeb6" Apr 20 14:58:47.174413 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:58:47.174391 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bdcfadd1a631de8b82675f11b0cb6ffefda6735a44d52b78778ec601d0edeb6\": container with ID starting with 4bdcfadd1a631de8b82675f11b0cb6ffefda6735a44d52b78778ec601d0edeb6 not found: ID does not exist" containerID="4bdcfadd1a631de8b82675f11b0cb6ffefda6735a44d52b78778ec601d0edeb6" Apr 20 14:58:47.174505 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:47.174419 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bdcfadd1a631de8b82675f11b0cb6ffefda6735a44d52b78778ec601d0edeb6"} err="failed to get container status \"4bdcfadd1a631de8b82675f11b0cb6ffefda6735a44d52b78778ec601d0edeb6\": rpc error: code = NotFound desc = could not find container \"4bdcfadd1a631de8b82675f11b0cb6ffefda6735a44d52b78778ec601d0edeb6\": container with ID starting with 4bdcfadd1a631de8b82675f11b0cb6ffefda6735a44d52b78778ec601d0edeb6 not found: ID does not exist" Apr 20 14:58:47.187502 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:47.187464 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f945bf794-r7kxr"] Apr 20 14:58:47.190847 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:47.190829 2574 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-f945bf794-r7kxr"] Apr 20 14:58:47.446611 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:47.446539 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdd2de80-8f28-43db-b489-d8d85e551a61" path="/var/lib/kubelet/pods/bdd2de80-8f28-43db-b489-d8d85e551a61/volumes" Apr 20 14:58:57.199351 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.199298 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5fdb9b4f89-ckw6j" podUID="0d23fa62-b327-4aaa-a8cd-4fc4484a2273" containerName="console" containerID="cri-o://491c86dddecc9a1dad17d4b34161aef302d3f18d76c0fd156dfc9962658a8c5b" gracePeriod=15 Apr 20 14:58:57.441091 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.441072 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fdb9b4f89-ckw6j_0d23fa62-b327-4aaa-a8cd-4fc4484a2273/console/0.log" Apr 20 14:58:57.441188 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.441130 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:58:57.585114 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.585082 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7x69\" (UniqueName: \"kubernetes.io/projected/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-kube-api-access-b7x69\") pod \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " Apr 20 14:58:57.585297 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.585126 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-service-ca\") pod \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " Apr 20 14:58:57.585297 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.585159 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-console-config\") pod \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " Apr 20 14:58:57.585297 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.585243 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-oauth-serving-cert\") pod \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " Apr 20 14:58:57.585491 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.585312 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-trusted-ca-bundle\") pod \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " Apr 20 14:58:57.585491 ip-10-0-130-249 
kubenswrapper[2574]: I0420 14:58:57.585342 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-console-serving-cert\") pod \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " Apr 20 14:58:57.585491 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.585399 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-console-oauth-config\") pod \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\" (UID: \"0d23fa62-b327-4aaa-a8cd-4fc4484a2273\") " Apr 20 14:58:57.585637 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.585616 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-console-config" (OuterVolumeSpecName: "console-config") pod "0d23fa62-b327-4aaa-a8cd-4fc4484a2273" (UID: "0d23fa62-b327-4aaa-a8cd-4fc4484a2273"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:58:57.585695 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.585629 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-service-ca" (OuterVolumeSpecName: "service-ca") pod "0d23fa62-b327-4aaa-a8cd-4fc4484a2273" (UID: "0d23fa62-b327-4aaa-a8cd-4fc4484a2273"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:58:57.585695 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.585625 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0d23fa62-b327-4aaa-a8cd-4fc4484a2273" (UID: "0d23fa62-b327-4aaa-a8cd-4fc4484a2273"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:58:57.585695 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.585675 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0d23fa62-b327-4aaa-a8cd-4fc4484a2273" (UID: "0d23fa62-b327-4aaa-a8cd-4fc4484a2273"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:58:57.587323 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.587299 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-kube-api-access-b7x69" (OuterVolumeSpecName: "kube-api-access-b7x69") pod "0d23fa62-b327-4aaa-a8cd-4fc4484a2273" (UID: "0d23fa62-b327-4aaa-a8cd-4fc4484a2273"). InnerVolumeSpecName "kube-api-access-b7x69". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:58:57.587449 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.587299 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0d23fa62-b327-4aaa-a8cd-4fc4484a2273" (UID: "0d23fa62-b327-4aaa-a8cd-4fc4484a2273"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:58:57.587508 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.587488 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0d23fa62-b327-4aaa-a8cd-4fc4484a2273" (UID: "0d23fa62-b327-4aaa-a8cd-4fc4484a2273"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:58:57.686941 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.686908 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b7x69\" (UniqueName: \"kubernetes.io/projected/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-kube-api-access-b7x69\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:58:57.686941 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.686938 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-service-ca\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:58:57.686941 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.686948 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-console-config\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:58:57.687147 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.686958 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-oauth-serving-cert\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:58:57.687147 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.686966 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-trusted-ca-bundle\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:58:57.687147 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.686977 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-console-serving-cert\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:58:57.687147 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:57.686985 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d23fa62-b327-4aaa-a8cd-4fc4484a2273-console-oauth-config\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 14:58:58.203378 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:58.203341 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fdb9b4f89-ckw6j_0d23fa62-b327-4aaa-a8cd-4fc4484a2273/console/0.log" Apr 20 14:58:58.203807 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:58.203395 2574 generic.go:358] "Generic (PLEG): container finished" podID="0d23fa62-b327-4aaa-a8cd-4fc4484a2273" containerID="491c86dddecc9a1dad17d4b34161aef302d3f18d76c0fd156dfc9962658a8c5b" exitCode=2 Apr 20 14:58:58.203807 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:58.203426 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fdb9b4f89-ckw6j" event={"ID":"0d23fa62-b327-4aaa-a8cd-4fc4484a2273","Type":"ContainerDied","Data":"491c86dddecc9a1dad17d4b34161aef302d3f18d76c0fd156dfc9962658a8c5b"} Apr 20 14:58:58.203807 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:58.203466 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fdb9b4f89-ckw6j" event={"ID":"0d23fa62-b327-4aaa-a8cd-4fc4484a2273","Type":"ContainerDied","Data":"2b75297d034876fd3cfc5a263d31a91434b0440cba6aa94bd578214d03edc6c8"} Apr 20 14:58:58.203807 
ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:58.203474 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5fdb9b4f89-ckw6j" Apr 20 14:58:58.203807 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:58.203483 2574 scope.go:117] "RemoveContainer" containerID="491c86dddecc9a1dad17d4b34161aef302d3f18d76c0fd156dfc9962658a8c5b" Apr 20 14:58:58.211516 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:58.211497 2574 scope.go:117] "RemoveContainer" containerID="491c86dddecc9a1dad17d4b34161aef302d3f18d76c0fd156dfc9962658a8c5b" Apr 20 14:58:58.211770 ip-10-0-130-249 kubenswrapper[2574]: E0420 14:58:58.211750 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"491c86dddecc9a1dad17d4b34161aef302d3f18d76c0fd156dfc9962658a8c5b\": container with ID starting with 491c86dddecc9a1dad17d4b34161aef302d3f18d76c0fd156dfc9962658a8c5b not found: ID does not exist" containerID="491c86dddecc9a1dad17d4b34161aef302d3f18d76c0fd156dfc9962658a8c5b" Apr 20 14:58:58.211824 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:58.211779 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"491c86dddecc9a1dad17d4b34161aef302d3f18d76c0fd156dfc9962658a8c5b"} err="failed to get container status \"491c86dddecc9a1dad17d4b34161aef302d3f18d76c0fd156dfc9962658a8c5b\": rpc error: code = NotFound desc = could not find container \"491c86dddecc9a1dad17d4b34161aef302d3f18d76c0fd156dfc9962658a8c5b\": container with ID starting with 491c86dddecc9a1dad17d4b34161aef302d3f18d76c0fd156dfc9962658a8c5b not found: ID does not exist" Apr 20 14:58:58.232337 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:58.232312 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fdb9b4f89-ckw6j"] Apr 20 14:58:58.233572 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:58.233550 2574 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-5fdb9b4f89-ckw6j"] Apr 20 14:58:59.446378 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:58:59.446334 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d23fa62-b327-4aaa-a8cd-4fc4484a2273" path="/var/lib/kubelet/pods/0d23fa62-b327-4aaa-a8cd-4fc4484a2273/volumes" Apr 20 14:59:13.361148 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.361118 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-7pz5g"] Apr 20 14:59:13.361576 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.361448 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bdd2de80-8f28-43db-b489-d8d85e551a61" containerName="console" Apr 20 14:59:13.361576 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.361466 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdd2de80-8f28-43db-b489-d8d85e551a61" containerName="console" Apr 20 14:59:13.361576 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.361488 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d23fa62-b327-4aaa-a8cd-4fc4484a2273" containerName="console" Apr 20 14:59:13.361576 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.361497 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d23fa62-b327-4aaa-a8cd-4fc4484a2273" containerName="console" Apr 20 14:59:13.361576 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.361573 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="bdd2de80-8f28-43db-b489-d8d85e551a61" containerName="console" Apr 20 14:59:13.361742 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.361583 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d23fa62-b327-4aaa-a8cd-4fc4484a2273" containerName="console" Apr 20 14:59:13.364561 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.364543 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7pz5g"
Apr 20 14:59:13.366958 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.366935 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 14:59:13.371704 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.371420 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7pz5g"]
Apr 20 14:59:13.491772 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.491742 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/16fab89b-6034-44ce-9e43-24eea5f7402c-original-pull-secret\") pod \"global-pull-secret-syncer-7pz5g\" (UID: \"16fab89b-6034-44ce-9e43-24eea5f7402c\") " pod="kube-system/global-pull-secret-syncer-7pz5g"
Apr 20 14:59:13.491912 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.491787 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/16fab89b-6034-44ce-9e43-24eea5f7402c-dbus\") pod \"global-pull-secret-syncer-7pz5g\" (UID: \"16fab89b-6034-44ce-9e43-24eea5f7402c\") " pod="kube-system/global-pull-secret-syncer-7pz5g"
Apr 20 14:59:13.491912 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.491818 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/16fab89b-6034-44ce-9e43-24eea5f7402c-kubelet-config\") pod \"global-pull-secret-syncer-7pz5g\" (UID: \"16fab89b-6034-44ce-9e43-24eea5f7402c\") " pod="kube-system/global-pull-secret-syncer-7pz5g"
Apr 20 14:59:13.592738 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.592709 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/16fab89b-6034-44ce-9e43-24eea5f7402c-kubelet-config\") pod \"global-pull-secret-syncer-7pz5g\" (UID: \"16fab89b-6034-44ce-9e43-24eea5f7402c\") " pod="kube-system/global-pull-secret-syncer-7pz5g"
Apr 20 14:59:13.592867 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.592759 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/16fab89b-6034-44ce-9e43-24eea5f7402c-original-pull-secret\") pod \"global-pull-secret-syncer-7pz5g\" (UID: \"16fab89b-6034-44ce-9e43-24eea5f7402c\") " pod="kube-system/global-pull-secret-syncer-7pz5g"
Apr 20 14:59:13.592867 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.592789 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/16fab89b-6034-44ce-9e43-24eea5f7402c-dbus\") pod \"global-pull-secret-syncer-7pz5g\" (UID: \"16fab89b-6034-44ce-9e43-24eea5f7402c\") " pod="kube-system/global-pull-secret-syncer-7pz5g"
Apr 20 14:59:13.592867 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.592828 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/16fab89b-6034-44ce-9e43-24eea5f7402c-kubelet-config\") pod \"global-pull-secret-syncer-7pz5g\" (UID: \"16fab89b-6034-44ce-9e43-24eea5f7402c\") " pod="kube-system/global-pull-secret-syncer-7pz5g"
Apr 20 14:59:13.592967 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.592907 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/16fab89b-6034-44ce-9e43-24eea5f7402c-dbus\") pod \"global-pull-secret-syncer-7pz5g\" (UID: \"16fab89b-6034-44ce-9e43-24eea5f7402c\") " pod="kube-system/global-pull-secret-syncer-7pz5g"
Apr 20 14:59:13.595050 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.595024 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/16fab89b-6034-44ce-9e43-24eea5f7402c-original-pull-secret\") pod \"global-pull-secret-syncer-7pz5g\" (UID: \"16fab89b-6034-44ce-9e43-24eea5f7402c\") " pod="kube-system/global-pull-secret-syncer-7pz5g"
Apr 20 14:59:13.674241 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.674155 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7pz5g"
Apr 20 14:59:13.789182 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:13.789144 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7pz5g"]
Apr 20 14:59:13.791954 ip-10-0-130-249 kubenswrapper[2574]: W0420 14:59:13.791928 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16fab89b_6034_44ce_9e43_24eea5f7402c.slice/crio-f00a3976900d53ba53a991018f5a001351c0c05bb427b8f9382fbbe2c2efe90b WatchSource:0}: Error finding container f00a3976900d53ba53a991018f5a001351c0c05bb427b8f9382fbbe2c2efe90b: Status 404 returned error can't find the container with id f00a3976900d53ba53a991018f5a001351c0c05bb427b8f9382fbbe2c2efe90b
Apr 20 14:59:14.251235 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:14.251194 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7pz5g" event={"ID":"16fab89b-6034-44ce-9e43-24eea5f7402c","Type":"ContainerStarted","Data":"f00a3976900d53ba53a991018f5a001351c0c05bb427b8f9382fbbe2c2efe90b"}
Apr 20 14:59:18.265129 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:18.265091 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7pz5g" event={"ID":"16fab89b-6034-44ce-9e43-24eea5f7402c","Type":"ContainerStarted","Data":"5a3ba76379b40fae7b9d9230029cefba89b8242c54a3bc7df132828abf8e6a51"}
Apr 20 14:59:18.280863 ip-10-0-130-249 kubenswrapper[2574]: I0420 14:59:18.280816 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7pz5g" podStartSLOduration=1.295633279 podStartE2EDuration="5.280802294s" podCreationTimestamp="2026-04-20 14:59:13 +0000 UTC" firstStartedPulling="2026-04-20 14:59:13.793713853 +0000 UTC m=+232.866989455" lastFinishedPulling="2026-04-20 14:59:17.778882867 +0000 UTC m=+236.852158470" observedRunningTime="2026-04-20 14:59:18.279684281 +0000 UTC m=+237.352959908" watchObservedRunningTime="2026-04-20 14:59:18.280802294 +0000 UTC m=+237.354077918"
Apr 20 15:00:21.337882 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:21.337848 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4z848_d54d05c4-b074-4189-b1dd-7ff476b824ec/console-operator/1.log"
Apr 20 15:00:21.339273 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:21.339251 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4z848_d54d05c4-b074-4189-b1dd-7ff476b824ec/console-operator/1.log"
Apr 20 15:00:21.346473 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:21.346447 2574 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 15:00:22.428360 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:22.428324 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr"]
Apr 20 15:00:22.431637 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:22.431620 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr"
Apr 20 15:00:22.434309 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:22.434287 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 15:00:22.434453 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:22.434290 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ngpdb\""
Apr 20 15:00:22.436060 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:22.436043 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 15:00:22.439269 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:22.439249 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr"]
Apr 20 15:00:22.486436 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:22.486412 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6948b4e3-8090-4fff-aa60-344656406b39-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr\" (UID: \"6948b4e3-8090-4fff-aa60-344656406b39\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr"
Apr 20 15:00:22.486528 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:22.486460 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhq4w\" (UniqueName: \"kubernetes.io/projected/6948b4e3-8090-4fff-aa60-344656406b39-kube-api-access-qhq4w\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr\" (UID: \"6948b4e3-8090-4fff-aa60-344656406b39\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr"
Apr 20 15:00:22.486528 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:22.486519 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6948b4e3-8090-4fff-aa60-344656406b39-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr\" (UID: \"6948b4e3-8090-4fff-aa60-344656406b39\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr"
Apr 20 15:00:22.587776 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:22.587749 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhq4w\" (UniqueName: \"kubernetes.io/projected/6948b4e3-8090-4fff-aa60-344656406b39-kube-api-access-qhq4w\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr\" (UID: \"6948b4e3-8090-4fff-aa60-344656406b39\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr"
Apr 20 15:00:22.587882 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:22.587791 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6948b4e3-8090-4fff-aa60-344656406b39-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr\" (UID: \"6948b4e3-8090-4fff-aa60-344656406b39\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr"
Apr 20 15:00:22.588068 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:22.588048 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6948b4e3-8090-4fff-aa60-344656406b39-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr\" (UID: \"6948b4e3-8090-4fff-aa60-344656406b39\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr"
Apr 20 15:00:22.588203 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:22.588180 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6948b4e3-8090-4fff-aa60-344656406b39-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr\" (UID: \"6948b4e3-8090-4fff-aa60-344656406b39\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr"
Apr 20 15:00:22.588310 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:22.588297 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6948b4e3-8090-4fff-aa60-344656406b39-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr\" (UID: \"6948b4e3-8090-4fff-aa60-344656406b39\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr"
Apr 20 15:00:22.598331 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:22.598308 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhq4w\" (UniqueName: \"kubernetes.io/projected/6948b4e3-8090-4fff-aa60-344656406b39-kube-api-access-qhq4w\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr\" (UID: \"6948b4e3-8090-4fff-aa60-344656406b39\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr"
Apr 20 15:00:22.741749 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:22.741669 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr"
Apr 20 15:00:22.857846 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:22.857812 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr"]
Apr 20 15:00:22.860727 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:00:22.860695 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6948b4e3_8090_4fff_aa60_344656406b39.slice/crio-2c0a17f71ab06be9244a960ab1168ad78ff4c58c889a67718887391b467214c2 WatchSource:0}: Error finding container 2c0a17f71ab06be9244a960ab1168ad78ff4c58c889a67718887391b467214c2: Status 404 returned error can't find the container with id 2c0a17f71ab06be9244a960ab1168ad78ff4c58c889a67718887391b467214c2
Apr 20 15:00:22.862445 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:22.862428 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 15:00:23.438291 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:23.438253 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr" event={"ID":"6948b4e3-8090-4fff-aa60-344656406b39","Type":"ContainerStarted","Data":"2c0a17f71ab06be9244a960ab1168ad78ff4c58c889a67718887391b467214c2"}
Apr 20 15:00:28.453887 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:28.453808 2574 generic.go:358] "Generic (PLEG): container finished" podID="6948b4e3-8090-4fff-aa60-344656406b39" containerID="f660a12cfe12581e72d4bc5248ff40ec73cbd145409be0a78cc2ee22c62f879e" exitCode=0
Apr 20 15:00:28.454275 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:28.453902 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr" event={"ID":"6948b4e3-8090-4fff-aa60-344656406b39","Type":"ContainerDied","Data":"f660a12cfe12581e72d4bc5248ff40ec73cbd145409be0a78cc2ee22c62f879e"}
Apr 20 15:00:31.462568 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:31.462531 2574 generic.go:358] "Generic (PLEG): container finished" podID="6948b4e3-8090-4fff-aa60-344656406b39" containerID="d956858d66a2f92bc686b0d9cccb97b2a213a42a0eb1fac55310e6c549d0ec31" exitCode=0
Apr 20 15:00:31.462924 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:31.462613 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr" event={"ID":"6948b4e3-8090-4fff-aa60-344656406b39","Type":"ContainerDied","Data":"d956858d66a2f92bc686b0d9cccb97b2a213a42a0eb1fac55310e6c549d0ec31"}
Apr 20 15:00:40.488438 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:40.488343 2574 generic.go:358] "Generic (PLEG): container finished" podID="6948b4e3-8090-4fff-aa60-344656406b39" containerID="ac744c7a5b1bc487a046b0dd30e283e7bb895defd6a32a9864b6debfc7f0a9cd" exitCode=0
Apr 20 15:00:40.488829 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:40.488430 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr" event={"ID":"6948b4e3-8090-4fff-aa60-344656406b39","Type":"ContainerDied","Data":"ac744c7a5b1bc487a046b0dd30e283e7bb895defd6a32a9864b6debfc7f0a9cd"}
Apr 20 15:00:41.609545 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:41.609523 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr"
Apr 20 15:00:41.747113 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:41.747035 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhq4w\" (UniqueName: \"kubernetes.io/projected/6948b4e3-8090-4fff-aa60-344656406b39-kube-api-access-qhq4w\") pod \"6948b4e3-8090-4fff-aa60-344656406b39\" (UID: \"6948b4e3-8090-4fff-aa60-344656406b39\") "
Apr 20 15:00:41.747113 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:41.747077 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6948b4e3-8090-4fff-aa60-344656406b39-bundle\") pod \"6948b4e3-8090-4fff-aa60-344656406b39\" (UID: \"6948b4e3-8090-4fff-aa60-344656406b39\") "
Apr 20 15:00:41.747344 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:41.747134 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6948b4e3-8090-4fff-aa60-344656406b39-util\") pod \"6948b4e3-8090-4fff-aa60-344656406b39\" (UID: \"6948b4e3-8090-4fff-aa60-344656406b39\") "
Apr 20 15:00:41.747756 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:41.747729 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6948b4e3-8090-4fff-aa60-344656406b39-bundle" (OuterVolumeSpecName: "bundle") pod "6948b4e3-8090-4fff-aa60-344656406b39" (UID: "6948b4e3-8090-4fff-aa60-344656406b39"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 15:00:41.749123 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:41.749101 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6948b4e3-8090-4fff-aa60-344656406b39-kube-api-access-qhq4w" (OuterVolumeSpecName: "kube-api-access-qhq4w") pod "6948b4e3-8090-4fff-aa60-344656406b39" (UID: "6948b4e3-8090-4fff-aa60-344656406b39"). InnerVolumeSpecName "kube-api-access-qhq4w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 15:00:41.751155 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:41.751132 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6948b4e3-8090-4fff-aa60-344656406b39-util" (OuterVolumeSpecName: "util") pod "6948b4e3-8090-4fff-aa60-344656406b39" (UID: "6948b4e3-8090-4fff-aa60-344656406b39"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 15:00:41.848135 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:41.848112 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qhq4w\" (UniqueName: \"kubernetes.io/projected/6948b4e3-8090-4fff-aa60-344656406b39-kube-api-access-qhq4w\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\""
Apr 20 15:00:41.848244 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:41.848156 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6948b4e3-8090-4fff-aa60-344656406b39-bundle\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\""
Apr 20 15:00:41.848244 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:41.848166 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6948b4e3-8090-4fff-aa60-344656406b39-util\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\""
Apr 20 15:00:42.496263 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:42.496227 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr" event={"ID":"6948b4e3-8090-4fff-aa60-344656406b39","Type":"ContainerDied","Data":"2c0a17f71ab06be9244a960ab1168ad78ff4c58c889a67718887391b467214c2"}
Apr 20 15:00:42.496263 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:42.496261 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c0a17f71ab06be9244a960ab1168ad78ff4c58c889a67718887391b467214c2"
Apr 20 15:00:42.496263 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:42.496263 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e574sxr"
Apr 20 15:00:49.941902 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:49.941870 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5gxz9"]
Apr 20 15:00:49.942478 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:49.942149 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6948b4e3-8090-4fff-aa60-344656406b39" containerName="extract"
Apr 20 15:00:49.942478 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:49.942161 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6948b4e3-8090-4fff-aa60-344656406b39" containerName="extract"
Apr 20 15:00:49.942478 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:49.942169 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6948b4e3-8090-4fff-aa60-344656406b39" containerName="util"
Apr 20 15:00:49.942478 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:49.942174 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6948b4e3-8090-4fff-aa60-344656406b39" containerName="util"
Apr 20 15:00:49.942478 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:49.942180 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6948b4e3-8090-4fff-aa60-344656406b39" containerName="pull"
Apr 20 15:00:49.942478 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:49.942186 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6948b4e3-8090-4fff-aa60-344656406b39" containerName="pull"
Apr 20 15:00:49.942478 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:49.942233 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="6948b4e3-8090-4fff-aa60-344656406b39" containerName="extract"
Apr 20 15:00:49.947260 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:49.947243 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5gxz9"
Apr 20 15:00:49.950334 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:49.950316 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 20 15:00:49.950437 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:49.950329 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-nz8hl\""
Apr 20 15:00:49.950437 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:49.950317 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 20 15:00:49.957985 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:49.957964 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5gxz9"]
Apr 20 15:00:50.114102 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:50.114062 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc8cf31b-e58c-4583-94a5-30c145e4bab2-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-5gxz9\" (UID: \"fc8cf31b-e58c-4583-94a5-30c145e4bab2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5gxz9"
Apr 20 15:00:50.114266 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:50.114129 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xggb\" (UniqueName: \"kubernetes.io/projected/fc8cf31b-e58c-4583-94a5-30c145e4bab2-kube-api-access-8xggb\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-5gxz9\" (UID: \"fc8cf31b-e58c-4583-94a5-30c145e4bab2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5gxz9"
Apr 20 15:00:50.215505 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:50.215430 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc8cf31b-e58c-4583-94a5-30c145e4bab2-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-5gxz9\" (UID: \"fc8cf31b-e58c-4583-94a5-30c145e4bab2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5gxz9"
Apr 20 15:00:50.215505 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:50.215493 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xggb\" (UniqueName: \"kubernetes.io/projected/fc8cf31b-e58c-4583-94a5-30c145e4bab2-kube-api-access-8xggb\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-5gxz9\" (UID: \"fc8cf31b-e58c-4583-94a5-30c145e4bab2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5gxz9"
Apr 20 15:00:50.215862 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:50.215841 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc8cf31b-e58c-4583-94a5-30c145e4bab2-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-5gxz9\" (UID: \"fc8cf31b-e58c-4583-94a5-30c145e4bab2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5gxz9"
Apr 20 15:00:50.224689 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:50.224658 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xggb\" (UniqueName: \"kubernetes.io/projected/fc8cf31b-e58c-4583-94a5-30c145e4bab2-kube-api-access-8xggb\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-5gxz9\" (UID: \"fc8cf31b-e58c-4583-94a5-30c145e4bab2\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5gxz9"
Apr 20 15:00:50.256138 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:50.256114 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5gxz9"
Apr 20 15:00:50.379822 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:50.379792 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5gxz9"]
Apr 20 15:00:50.382596 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:00:50.382565 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc8cf31b_e58c_4583_94a5_30c145e4bab2.slice/crio-aaeac7fafabd836e87d5f3c4dc0316b8c151f5b7530af98b29584005564c5841 WatchSource:0}: Error finding container aaeac7fafabd836e87d5f3c4dc0316b8c151f5b7530af98b29584005564c5841: Status 404 returned error can't find the container with id aaeac7fafabd836e87d5f3c4dc0316b8c151f5b7530af98b29584005564c5841
Apr 20 15:00:50.518427 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:50.518336 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5gxz9" event={"ID":"fc8cf31b-e58c-4583-94a5-30c145e4bab2","Type":"ContainerStarted","Data":"aaeac7fafabd836e87d5f3c4dc0316b8c151f5b7530af98b29584005564c5841"}
Apr 20 15:00:53.529126 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:53.529091 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5gxz9" event={"ID":"fc8cf31b-e58c-4583-94a5-30c145e4bab2","Type":"ContainerStarted","Data":"8d2e2b8b96056ce0328d607fa08831459e8b5a4c79c8056104413f51d09296a0"}
Apr 20 15:00:53.551319 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:53.551274 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-5gxz9" podStartSLOduration=2.03359084 podStartE2EDuration="4.551260406s" podCreationTimestamp="2026-04-20 15:00:49 +0000 UTC" firstStartedPulling="2026-04-20 15:00:50.384836312 +0000 UTC m=+329.458111915" lastFinishedPulling="2026-04-20 15:00:52.902505873 +0000 UTC m=+331.975781481" observedRunningTime="2026-04-20 15:00:53.5488326 +0000 UTC m=+332.622108225" watchObservedRunningTime="2026-04-20 15:00:53.551260406 +0000 UTC m=+332.624536034"
Apr 20 15:00:54.618452 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:54.618419 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q"]
Apr 20 15:00:54.622068 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:54.622043 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q"
Apr 20 15:00:54.625296 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:54.625273 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 15:00:54.625481 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:54.625454 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ngpdb\""
Apr 20 15:00:54.626553 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:54.626529 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 15:00:54.628598 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:54.628576 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q"]
Apr 20 15:00:54.753301 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:54.753267 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8447df8c-564a-4bf7-8169-1e905e55763e-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q\" (UID: \"8447df8c-564a-4bf7-8169-1e905e55763e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q"
Apr 20 15:00:54.753490 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:54.753362 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5fqn\" (UniqueName: \"kubernetes.io/projected/8447df8c-564a-4bf7-8169-1e905e55763e-kube-api-access-g5fqn\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q\" (UID: \"8447df8c-564a-4bf7-8169-1e905e55763e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q"
Apr 20 15:00:54.753490 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:54.753442 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8447df8c-564a-4bf7-8169-1e905e55763e-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q\" (UID: \"8447df8c-564a-4bf7-8169-1e905e55763e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q"
Apr 20 15:00:54.854564 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:54.854527 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8447df8c-564a-4bf7-8169-1e905e55763e-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q\" (UID: \"8447df8c-564a-4bf7-8169-1e905e55763e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q"
Apr 20 15:00:54.854704 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:54.854571 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8447df8c-564a-4bf7-8169-1e905e55763e-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q\" (UID: \"8447df8c-564a-4bf7-8169-1e905e55763e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q"
Apr 20 15:00:54.854704 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:54.854626 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5fqn\" (UniqueName: \"kubernetes.io/projected/8447df8c-564a-4bf7-8169-1e905e55763e-kube-api-access-g5fqn\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q\" (UID: \"8447df8c-564a-4bf7-8169-1e905e55763e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q"
Apr 20 15:00:54.854954 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:54.854935 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8447df8c-564a-4bf7-8169-1e905e55763e-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q\" (UID: \"8447df8c-564a-4bf7-8169-1e905e55763e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q"
Apr 20 15:00:54.854999 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:54.854983 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8447df8c-564a-4bf7-8169-1e905e55763e-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q\" (UID: \"8447df8c-564a-4bf7-8169-1e905e55763e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q"
Apr 20 15:00:54.869722 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:54.869661 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5fqn\" (UniqueName: \"kubernetes.io/projected/8447df8c-564a-4bf7-8169-1e905e55763e-kube-api-access-g5fqn\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q\" (UID: \"8447df8c-564a-4bf7-8169-1e905e55763e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q"
Apr 20 15:00:54.933436 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:54.933410 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q"
Apr 20 15:00:55.051995 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:55.051859 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q"]
Apr 20 15:00:55.054591 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:00:55.054568 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8447df8c_564a_4bf7_8169_1e905e55763e.slice/crio-fba19e19a0697deaaa63493cd2f2da16c1f5361d8e530b2ee54a5a697f42282d WatchSource:0}: Error finding container fba19e19a0697deaaa63493cd2f2da16c1f5361d8e530b2ee54a5a697f42282d: Status 404 returned error can't find the container with id fba19e19a0697deaaa63493cd2f2da16c1f5361d8e530b2ee54a5a697f42282d
Apr 20 15:00:55.537022 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:55.536925 2574 generic.go:358] "Generic (PLEG): container finished" podID="8447df8c-564a-4bf7-8169-1e905e55763e" containerID="d3cd3802d04a0693e812bfbdfa691e1299dfd20eb02630b9ee0f18c4da59cf46" exitCode=0
Apr 20 15:00:55.537022 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:55.536975 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q" event={"ID":"8447df8c-564a-4bf7-8169-1e905e55763e","Type":"ContainerDied","Data":"d3cd3802d04a0693e812bfbdfa691e1299dfd20eb02630b9ee0f18c4da59cf46"}
Apr 20 15:00:55.537022 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:55.537008 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q" event={"ID":"8447df8c-564a-4bf7-8169-1e905e55763e","Type":"ContainerStarted","Data":"fba19e19a0697deaaa63493cd2f2da16c1f5361d8e530b2ee54a5a697f42282d"}
Apr 20 15:00:57.074623 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:57.074577 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-8gtnv"]
Apr 20 15:00:57.081328 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:57.081303 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-8gtnv"
Apr 20 15:00:57.084035 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:57.083972 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-ksrpl\""
Apr 20 15:00:57.084035 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:57.084006 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 20 15:00:57.084035 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:57.084013 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 20 15:00:57.087552 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:57.087530 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-8gtnv"]
Apr 20 15:00:57.174227 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:57.174186 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc51693e-fc8b-4293-b55b-5402f8bd1a0a-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-8gtnv\" (UID: \"bc51693e-fc8b-4293-b55b-5402f8bd1a0a\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8gtnv"
Apr 20 15:00:57.174428 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:57.174289 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9s5v\" (UniqueName: \"kubernetes.io/projected/bc51693e-fc8b-4293-b55b-5402f8bd1a0a-kube-api-access-r9s5v\") pod \"cert-manager-webhook-597b96b99b-8gtnv\" (UID: \"bc51693e-fc8b-4293-b55b-5402f8bd1a0a\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8gtnv"
Apr 20 15:00:57.275119 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:57.275081 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9s5v\" (UniqueName: \"kubernetes.io/projected/bc51693e-fc8b-4293-b55b-5402f8bd1a0a-kube-api-access-r9s5v\") pod \"cert-manager-webhook-597b96b99b-8gtnv\" (UID: \"bc51693e-fc8b-4293-b55b-5402f8bd1a0a\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8gtnv"
Apr 20 15:00:57.275286 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:57.275183 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc51693e-fc8b-4293-b55b-5402f8bd1a0a-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-8gtnv\" (UID: \"bc51693e-fc8b-4293-b55b-5402f8bd1a0a\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8gtnv"
Apr 20 15:00:57.284621 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:57.284596 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bc51693e-fc8b-4293-b55b-5402f8bd1a0a-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-8gtnv\" (UID: \"bc51693e-fc8b-4293-b55b-5402f8bd1a0a\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8gtnv"
Apr 20 15:00:57.284763 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:57.284746 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9s5v\" (UniqueName: \"kubernetes.io/projected/bc51693e-fc8b-4293-b55b-5402f8bd1a0a-kube-api-access-r9s5v\") pod \"cert-manager-webhook-597b96b99b-8gtnv\" (UID: \"bc51693e-fc8b-4293-b55b-5402f8bd1a0a\") " pod="cert-manager/cert-manager-webhook-597b96b99b-8gtnv"
Apr 20 15:00:57.401422 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:57.401318 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-8gtnv" Apr 20 15:00:57.770071 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:57.770049 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-8gtnv"] Apr 20 15:00:57.774435 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:00:57.774411 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc51693e_fc8b_4293_b55b_5402f8bd1a0a.slice/crio-20931f569ddc0136a05488b9322063f0eaaa1929c476b4bd2485cd4601a30b83 WatchSource:0}: Error finding container 20931f569ddc0136a05488b9322063f0eaaa1929c476b4bd2485cd4601a30b83: Status 404 returned error can't find the container with id 20931f569ddc0136a05488b9322063f0eaaa1929c476b4bd2485cd4601a30b83 Apr 20 15:00:58.549243 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:58.549166 2574 generic.go:358] "Generic (PLEG): container finished" podID="8447df8c-564a-4bf7-8169-1e905e55763e" containerID="52740cf958362f58f281603066fde0d5e99daf55a3632294dc6d9ec76d27d421" exitCode=0 Apr 20 15:00:58.549692 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:58.549265 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q" event={"ID":"8447df8c-564a-4bf7-8169-1e905e55763e","Type":"ContainerDied","Data":"52740cf958362f58f281603066fde0d5e99daf55a3632294dc6d9ec76d27d421"} Apr 20 15:00:58.550567 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:58.550538 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-8gtnv" event={"ID":"bc51693e-fc8b-4293-b55b-5402f8bd1a0a","Type":"ContainerStarted","Data":"20931f569ddc0136a05488b9322063f0eaaa1929c476b4bd2485cd4601a30b83"} Apr 20 15:00:59.556498 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:59.556465 2574 generic.go:358] "Generic (PLEG): container finished" 
podID="8447df8c-564a-4bf7-8169-1e905e55763e" containerID="500fe8e2ebd1c3c60df73b177e8dadb4a210a9bbcb12803762b1e426483a8940" exitCode=0 Apr 20 15:00:59.556914 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:00:59.556553 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q" event={"ID":"8447df8c-564a-4bf7-8169-1e905e55763e","Type":"ContainerDied","Data":"500fe8e2ebd1c3c60df73b177e8dadb4a210a9bbcb12803762b1e426483a8940"} Apr 20 15:01:00.693855 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:00.693820 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q" Apr 20 15:01:00.702822 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:00.702799 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8447df8c-564a-4bf7-8169-1e905e55763e-util\") pod \"8447df8c-564a-4bf7-8169-1e905e55763e\" (UID: \"8447df8c-564a-4bf7-8169-1e905e55763e\") " Apr 20 15:01:00.702908 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:00.702838 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8447df8c-564a-4bf7-8169-1e905e55763e-bundle\") pod \"8447df8c-564a-4bf7-8169-1e905e55763e\" (UID: \"8447df8c-564a-4bf7-8169-1e905e55763e\") " Apr 20 15:01:00.702908 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:00.702873 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5fqn\" (UniqueName: \"kubernetes.io/projected/8447df8c-564a-4bf7-8169-1e905e55763e-kube-api-access-g5fqn\") pod \"8447df8c-564a-4bf7-8169-1e905e55763e\" (UID: \"8447df8c-564a-4bf7-8169-1e905e55763e\") " Apr 20 15:01:00.703274 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:00.703247 2574 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8447df8c-564a-4bf7-8169-1e905e55763e-bundle" (OuterVolumeSpecName: "bundle") pod "8447df8c-564a-4bf7-8169-1e905e55763e" (UID: "8447df8c-564a-4bf7-8169-1e905e55763e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:01:00.704948 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:00.704924 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8447df8c-564a-4bf7-8169-1e905e55763e-kube-api-access-g5fqn" (OuterVolumeSpecName: "kube-api-access-g5fqn") pod "8447df8c-564a-4bf7-8169-1e905e55763e" (UID: "8447df8c-564a-4bf7-8169-1e905e55763e"). InnerVolumeSpecName "kube-api-access-g5fqn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:01:00.706793 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:00.706756 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8447df8c-564a-4bf7-8169-1e905e55763e-util" (OuterVolumeSpecName: "util") pod "8447df8c-564a-4bf7-8169-1e905e55763e" (UID: "8447df8c-564a-4bf7-8169-1e905e55763e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:01:00.804156 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:00.804127 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8447df8c-564a-4bf7-8169-1e905e55763e-util\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:01:00.804156 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:00.804151 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8447df8c-564a-4bf7-8169-1e905e55763e-bundle\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:01:00.804307 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:00.804164 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g5fqn\" (UniqueName: \"kubernetes.io/projected/8447df8c-564a-4bf7-8169-1e905e55763e-kube-api-access-g5fqn\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:01:01.564447 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:01.564404 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-8gtnv" event={"ID":"bc51693e-fc8b-4293-b55b-5402f8bd1a0a","Type":"ContainerStarted","Data":"06e199ad9648d61b125c1a447ef26e0ca32b3a937f112c3287d09b60426d96fa"} Apr 20 15:01:01.564634 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:01.564477 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-8gtnv" Apr 20 15:01:01.566003 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:01.565979 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q" event={"ID":"8447df8c-564a-4bf7-8169-1e905e55763e","Type":"ContainerDied","Data":"fba19e19a0697deaaa63493cd2f2da16c1f5361d8e530b2ee54a5a697f42282d"} Apr 20 15:01:01.566003 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:01.566003 2574 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fnvh9q" Apr 20 15:01:01.566140 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:01.566009 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fba19e19a0697deaaa63493cd2f2da16c1f5361d8e530b2ee54a5a697f42282d" Apr 20 15:01:01.581276 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:01.581233 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-8gtnv" podStartSLOduration=1.63749746 podStartE2EDuration="4.581221748s" podCreationTimestamp="2026-04-20 15:00:57 +0000 UTC" firstStartedPulling="2026-04-20 15:00:57.776230208 +0000 UTC m=+336.849505810" lastFinishedPulling="2026-04-20 15:01:00.719954495 +0000 UTC m=+339.793230098" observedRunningTime="2026-04-20 15:01:01.579797495 +0000 UTC m=+340.653073121" watchObservedRunningTime="2026-04-20 15:01:01.581221748 +0000 UTC m=+340.654497373" Apr 20 15:01:04.914862 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:04.914827 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-57nfs"] Apr 20 15:01:04.915253 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:04.915131 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8447df8c-564a-4bf7-8169-1e905e55763e" containerName="pull" Apr 20 15:01:04.915253 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:04.915142 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8447df8c-564a-4bf7-8169-1e905e55763e" containerName="pull" Apr 20 15:01:04.915253 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:04.915152 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8447df8c-564a-4bf7-8169-1e905e55763e" containerName="util" Apr 20 15:01:04.915253 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:04.915157 
2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8447df8c-564a-4bf7-8169-1e905e55763e" containerName="util" Apr 20 15:01:04.915253 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:04.915166 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8447df8c-564a-4bf7-8169-1e905e55763e" containerName="extract" Apr 20 15:01:04.915253 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:04.915171 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8447df8c-564a-4bf7-8169-1e905e55763e" containerName="extract" Apr 20 15:01:04.915253 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:04.915223 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="8447df8c-564a-4bf7-8169-1e905e55763e" containerName="extract" Apr 20 15:01:04.917129 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:04.917113 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-57nfs" Apr 20 15:01:04.919755 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:04.919732 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 15:01:04.919885 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:04.919830 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-mxtlk\"" Apr 20 15:01:04.921123 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:04.921108 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 15:01:04.924957 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:04.924937 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-57nfs"] Apr 20 15:01:04.933203 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:04.933178 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/70145afa-8dad-481b-be4d-ced870b4f8c8-tmp\") pod \"openshift-lws-operator-bfc7f696d-57nfs\" (UID: \"70145afa-8dad-481b-be4d-ced870b4f8c8\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-57nfs" Apr 20 15:01:04.933319 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:04.933216 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gngrj\" (UniqueName: \"kubernetes.io/projected/70145afa-8dad-481b-be4d-ced870b4f8c8-kube-api-access-gngrj\") pod \"openshift-lws-operator-bfc7f696d-57nfs\" (UID: \"70145afa-8dad-481b-be4d-ced870b4f8c8\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-57nfs" Apr 20 15:01:05.034563 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:05.034532 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/70145afa-8dad-481b-be4d-ced870b4f8c8-tmp\") pod \"openshift-lws-operator-bfc7f696d-57nfs\" (UID: \"70145afa-8dad-481b-be4d-ced870b4f8c8\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-57nfs" Apr 20 15:01:05.034696 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:05.034575 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gngrj\" (UniqueName: \"kubernetes.io/projected/70145afa-8dad-481b-be4d-ced870b4f8c8-kube-api-access-gngrj\") pod \"openshift-lws-operator-bfc7f696d-57nfs\" (UID: \"70145afa-8dad-481b-be4d-ced870b4f8c8\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-57nfs" Apr 20 15:01:05.034915 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:05.034894 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/70145afa-8dad-481b-be4d-ced870b4f8c8-tmp\") pod \"openshift-lws-operator-bfc7f696d-57nfs\" (UID: 
\"70145afa-8dad-481b-be4d-ced870b4f8c8\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-57nfs" Apr 20 15:01:05.042763 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:05.042740 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gngrj\" (UniqueName: \"kubernetes.io/projected/70145afa-8dad-481b-be4d-ced870b4f8c8-kube-api-access-gngrj\") pod \"openshift-lws-operator-bfc7f696d-57nfs\" (UID: \"70145afa-8dad-481b-be4d-ced870b4f8c8\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-57nfs" Apr 20 15:01:05.226606 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:05.226531 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-57nfs" Apr 20 15:01:05.341968 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:05.341942 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-57nfs"] Apr 20 15:01:05.344695 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:01:05.344667 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70145afa_8dad_481b_be4d_ced870b4f8c8.slice/crio-32e6b3d86af9c9abbbbe9cbe5a9c8297af8aa46e9c01805655dbd7d2cdc63ce5 WatchSource:0}: Error finding container 32e6b3d86af9c9abbbbe9cbe5a9c8297af8aa46e9c01805655dbd7d2cdc63ce5: Status 404 returned error can't find the container with id 32e6b3d86af9c9abbbbe9cbe5a9c8297af8aa46e9c01805655dbd7d2cdc63ce5 Apr 20 15:01:05.579554 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:05.579522 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-57nfs" event={"ID":"70145afa-8dad-481b-be4d-ced870b4f8c8","Type":"ContainerStarted","Data":"32e6b3d86af9c9abbbbe9cbe5a9c8297af8aa46e9c01805655dbd7d2cdc63ce5"} Apr 20 15:01:07.571012 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:07.570980 2574 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-8gtnv" Apr 20 15:01:07.587294 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:07.587219 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-57nfs" event={"ID":"70145afa-8dad-481b-be4d-ced870b4f8c8","Type":"ContainerStarted","Data":"db6965012aa833753ca4e87f0be6ff894ec4c678bd9b7f020e511aa1d4041651"} Apr 20 15:01:07.607578 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:07.607530 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-57nfs" podStartSLOduration=1.587445556 podStartE2EDuration="3.607517283s" podCreationTimestamp="2026-04-20 15:01:04 +0000 UTC" firstStartedPulling="2026-04-20 15:01:05.346083244 +0000 UTC m=+344.419358847" lastFinishedPulling="2026-04-20 15:01:07.366154968 +0000 UTC m=+346.439430574" observedRunningTime="2026-04-20 15:01:07.606691066 +0000 UTC m=+346.679966712" watchObservedRunningTime="2026-04-20 15:01:07.607517283 +0000 UTC m=+346.680792908" Apr 20 15:01:09.808198 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:09.808167 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f"] Apr 20 15:01:09.810378 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:09.810350 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f" Apr 20 15:01:09.812972 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:09.812950 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 15:01:09.812972 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:09.812965 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 15:01:09.813146 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:09.813074 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ngpdb\"" Apr 20 15:01:09.818286 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:09.818264 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f"] Apr 20 15:01:09.873547 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:09.873518 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n6lv\" (UniqueName: \"kubernetes.io/projected/52077f48-6086-468b-b91d-86d1c3a149bf-kube-api-access-9n6lv\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f\" (UID: \"52077f48-6086-468b-b91d-86d1c3a149bf\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f" Apr 20 15:01:09.873686 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:09.873560 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52077f48-6086-468b-b91d-86d1c3a149bf-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f\" (UID: \"52077f48-6086-468b-b91d-86d1c3a149bf\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f" Apr 20 15:01:09.873686 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:09.873592 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52077f48-6086-468b-b91d-86d1c3a149bf-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f\" (UID: \"52077f48-6086-468b-b91d-86d1c3a149bf\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f" Apr 20 15:01:09.974817 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:09.974785 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52077f48-6086-468b-b91d-86d1c3a149bf-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f\" (UID: \"52077f48-6086-468b-b91d-86d1c3a149bf\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f" Apr 20 15:01:09.974974 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:09.974857 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9n6lv\" (UniqueName: \"kubernetes.io/projected/52077f48-6086-468b-b91d-86d1c3a149bf-kube-api-access-9n6lv\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f\" (UID: \"52077f48-6086-468b-b91d-86d1c3a149bf\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f" Apr 20 15:01:09.974974 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:09.974900 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52077f48-6086-468b-b91d-86d1c3a149bf-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f\" (UID: \"52077f48-6086-468b-b91d-86d1c3a149bf\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f" Apr 20 15:01:09.975175 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:09.975153 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52077f48-6086-468b-b91d-86d1c3a149bf-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f\" (UID: \"52077f48-6086-468b-b91d-86d1c3a149bf\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f" Apr 20 15:01:09.975251 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:09.975208 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52077f48-6086-468b-b91d-86d1c3a149bf-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f\" (UID: \"52077f48-6086-468b-b91d-86d1c3a149bf\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f" Apr 20 15:01:09.984208 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:09.984186 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n6lv\" (UniqueName: \"kubernetes.io/projected/52077f48-6086-468b-b91d-86d1c3a149bf-kube-api-access-9n6lv\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f\" (UID: \"52077f48-6086-468b-b91d-86d1c3a149bf\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f" Apr 20 15:01:10.119929 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:10.119839 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f" Apr 20 15:01:10.238235 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:10.238211 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f"] Apr 20 15:01:10.240580 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:01:10.240554 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52077f48_6086_468b_b91d_86d1c3a149bf.slice/crio-e6c1f9d6add3ed9541c8107b303f5d2f2caa9d4d68c057d53e56a8155b379924 WatchSource:0}: Error finding container e6c1f9d6add3ed9541c8107b303f5d2f2caa9d4d68c057d53e56a8155b379924: Status 404 returned error can't find the container with id e6c1f9d6add3ed9541c8107b303f5d2f2caa9d4d68c057d53e56a8155b379924 Apr 20 15:01:10.597283 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:10.597248 2574 generic.go:358] "Generic (PLEG): container finished" podID="52077f48-6086-468b-b91d-86d1c3a149bf" containerID="310fbb1d3418a9a730eb292ee4678ed5b58cb615163b7f05529c1c2f2dc64b7e" exitCode=0 Apr 20 15:01:10.597508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:10.597355 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f" event={"ID":"52077f48-6086-468b-b91d-86d1c3a149bf","Type":"ContainerDied","Data":"310fbb1d3418a9a730eb292ee4678ed5b58cb615163b7f05529c1c2f2dc64b7e"} Apr 20 15:01:10.597508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:10.597416 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f" event={"ID":"52077f48-6086-468b-b91d-86d1c3a149bf","Type":"ContainerStarted","Data":"e6c1f9d6add3ed9541c8107b303f5d2f2caa9d4d68c057d53e56a8155b379924"} Apr 20 15:01:11.602918 ip-10-0-130-249 kubenswrapper[2574]: 
I0420 15:01:11.602828 2574 generic.go:358] "Generic (PLEG): container finished" podID="52077f48-6086-468b-b91d-86d1c3a149bf" containerID="a1db7dfd7d2bc90dc1c47b418dc742444beaa2c173e851ad6e6f0af044ca4ef8" exitCode=0 Apr 20 15:01:11.603266 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:11.602913 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f" event={"ID":"52077f48-6086-468b-b91d-86d1c3a149bf","Type":"ContainerDied","Data":"a1db7dfd7d2bc90dc1c47b418dc742444beaa2c173e851ad6e6f0af044ca4ef8"} Apr 20 15:01:12.608081 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:12.608047 2574 generic.go:358] "Generic (PLEG): container finished" podID="52077f48-6086-468b-b91d-86d1c3a149bf" containerID="97652efc0a44dfeeace6c623f210106d43d457bafa432ae26bcf08655604c874" exitCode=0 Apr 20 15:01:12.608472 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:12.608131 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f" event={"ID":"52077f48-6086-468b-b91d-86d1c3a149bf","Type":"ContainerDied","Data":"97652efc0a44dfeeace6c623f210106d43d457bafa432ae26bcf08655604c874"} Apr 20 15:01:13.730816 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:13.730795 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f"
Apr 20 15:01:13.804873 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:13.804845 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52077f48-6086-468b-b91d-86d1c3a149bf-bundle\") pod \"52077f48-6086-468b-b91d-86d1c3a149bf\" (UID: \"52077f48-6086-468b-b91d-86d1c3a149bf\") "
Apr 20 15:01:13.805014 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:13.804918 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52077f48-6086-468b-b91d-86d1c3a149bf-util\") pod \"52077f48-6086-468b-b91d-86d1c3a149bf\" (UID: \"52077f48-6086-468b-b91d-86d1c3a149bf\") "
Apr 20 15:01:13.805014 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:13.804985 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n6lv\" (UniqueName: \"kubernetes.io/projected/52077f48-6086-468b-b91d-86d1c3a149bf-kube-api-access-9n6lv\") pod \"52077f48-6086-468b-b91d-86d1c3a149bf\" (UID: \"52077f48-6086-468b-b91d-86d1c3a149bf\") "
Apr 20 15:01:13.805605 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:13.805573 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52077f48-6086-468b-b91d-86d1c3a149bf-bundle" (OuterVolumeSpecName: "bundle") pod "52077f48-6086-468b-b91d-86d1c3a149bf" (UID: "52077f48-6086-468b-b91d-86d1c3a149bf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 15:01:13.807078 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:13.807049 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52077f48-6086-468b-b91d-86d1c3a149bf-kube-api-access-9n6lv" (OuterVolumeSpecName: "kube-api-access-9n6lv") pod "52077f48-6086-468b-b91d-86d1c3a149bf" (UID: "52077f48-6086-468b-b91d-86d1c3a149bf"). InnerVolumeSpecName "kube-api-access-9n6lv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 15:01:13.810587 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:13.810562 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52077f48-6086-468b-b91d-86d1c3a149bf-util" (OuterVolumeSpecName: "util") pod "52077f48-6086-468b-b91d-86d1c3a149bf" (UID: "52077f48-6086-468b-b91d-86d1c3a149bf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 15:01:13.906550 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:13.906500 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9n6lv\" (UniqueName: \"kubernetes.io/projected/52077f48-6086-468b-b91d-86d1c3a149bf-kube-api-access-9n6lv\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\""
Apr 20 15:01:13.906550 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:13.906525 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52077f48-6086-468b-b91d-86d1c3a149bf-bundle\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\""
Apr 20 15:01:13.906550 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:13.906536 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52077f48-6086-468b-b91d-86d1c3a149bf-util\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\""
Apr 20 15:01:14.616036 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:14.615996 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f" event={"ID":"52077f48-6086-468b-b91d-86d1c3a149bf","Type":"ContainerDied","Data":"e6c1f9d6add3ed9541c8107b303f5d2f2caa9d4d68c057d53e56a8155b379924"}
Apr 20 15:01:14.616036 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:14.616037 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6c1f9d6add3ed9541c8107b303f5d2f2caa9d4d68c057d53e56a8155b379924"
Apr 20 15:01:14.616220 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:14.616071 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c596q7f"
Apr 20 15:01:23.860092 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:23.860063 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv"]
Apr 20 15:01:23.860560 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:23.860330 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52077f48-6086-468b-b91d-86d1c3a149bf" containerName="util"
Apr 20 15:01:23.860560 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:23.860340 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="52077f48-6086-468b-b91d-86d1c3a149bf" containerName="util"
Apr 20 15:01:23.860560 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:23.860358 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52077f48-6086-468b-b91d-86d1c3a149bf" containerName="extract"
Apr 20 15:01:23.860560 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:23.860364 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="52077f48-6086-468b-b91d-86d1c3a149bf" containerName="extract"
Apr 20 15:01:23.860560 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:23.860397 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52077f48-6086-468b-b91d-86d1c3a149bf" containerName="pull"
Apr 20 15:01:23.860560 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:23.860405 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="52077f48-6086-468b-b91d-86d1c3a149bf" containerName="pull"
Apr 20 15:01:23.860560 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:23.860456 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="52077f48-6086-468b-b91d-86d1c3a149bf" containerName="extract"
Apr 20 15:01:23.864714 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:23.864698 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv"
Apr 20 15:01:23.867179 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:23.867154 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ngpdb\""
Apr 20 15:01:23.867300 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:23.867222 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 15:01:23.868324 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:23.868306 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 15:01:23.871844 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:23.871490 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv"]
Apr 20 15:01:23.980519 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:23.980477 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42jdd\" (UniqueName: \"kubernetes.io/projected/a5911674-8d6f-4509-8507-3a8b80d66349-kube-api-access-42jdd\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv\" (UID: \"a5911674-8d6f-4509-8507-3a8b80d66349\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv"
Apr 20 15:01:23.980660 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:23.980583 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5911674-8d6f-4509-8507-3a8b80d66349-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv\" (UID: \"a5911674-8d6f-4509-8507-3a8b80d66349\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv"
Apr 20 15:01:23.980660 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:23.980642 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5911674-8d6f-4509-8507-3a8b80d66349-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv\" (UID: \"a5911674-8d6f-4509-8507-3a8b80d66349\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv"
Apr 20 15:01:24.081181 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.081153 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5911674-8d6f-4509-8507-3a8b80d66349-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv\" (UID: \"a5911674-8d6f-4509-8507-3a8b80d66349\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv"
Apr 20 15:01:24.081302 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.081189 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5911674-8d6f-4509-8507-3a8b80d66349-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv\" (UID: \"a5911674-8d6f-4509-8507-3a8b80d66349\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv"
Apr 20 15:01:24.081302 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.081208 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42jdd\" (UniqueName: \"kubernetes.io/projected/a5911674-8d6f-4509-8507-3a8b80d66349-kube-api-access-42jdd\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv\" (UID: \"a5911674-8d6f-4509-8507-3a8b80d66349\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv"
Apr 20 15:01:24.081559 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.081536 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5911674-8d6f-4509-8507-3a8b80d66349-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv\" (UID: \"a5911674-8d6f-4509-8507-3a8b80d66349\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv"
Apr 20 15:01:24.081643 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.081611 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5911674-8d6f-4509-8507-3a8b80d66349-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv\" (UID: \"a5911674-8d6f-4509-8507-3a8b80d66349\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv"
Apr 20 15:01:24.092467 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.092439 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42jdd\" (UniqueName: \"kubernetes.io/projected/a5911674-8d6f-4509-8507-3a8b80d66349-kube-api-access-42jdd\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv\" (UID: \"a5911674-8d6f-4509-8507-3a8b80d66349\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv"
Apr 20 15:01:24.175265 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.175197 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv"
Apr 20 15:01:24.292857 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.292832 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv"]
Apr 20 15:01:24.295141 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:01:24.295111 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5911674_8d6f_4509_8507_3a8b80d66349.slice/crio-67d48c39ecfbfc298b6d658b35d51b23a0580655f35301b6a233f5d7c3f1b792 WatchSource:0}: Error finding container 67d48c39ecfbfc298b6d658b35d51b23a0580655f35301b6a233f5d7c3f1b792: Status 404 returned error can't find the container with id 67d48c39ecfbfc298b6d658b35d51b23a0580655f35301b6a233f5d7c3f1b792
Apr 20 15:01:24.613938 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.613906 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-99ff97f7d-zvp7j"]
Apr 20 15:01:24.617101 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.617083 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-zvp7j"
Apr 20 15:01:24.619861 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.619834 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 20 15:01:24.620020 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.620002 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-vsz6j\""
Apr 20 15:01:24.620080 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.620047 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 20 15:01:24.620080 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.620060 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 20 15:01:24.620176 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.620080 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 20 15:01:24.626042 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.626016 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-99ff97f7d-zvp7j"]
Apr 20 15:01:24.648217 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.648187 2574 generic.go:358] "Generic (PLEG): container finished" podID="a5911674-8d6f-4509-8507-3a8b80d66349" containerID="3b5168c50ae30875309cae7cc88293ad36d15dac3de48bdb6408f71852fd9c15" exitCode=0
Apr 20 15:01:24.648351 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.648248 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv" event={"ID":"a5911674-8d6f-4509-8507-3a8b80d66349","Type":"ContainerDied","Data":"3b5168c50ae30875309cae7cc88293ad36d15dac3de48bdb6408f71852fd9c15"}
Apr 20 15:01:24.648351 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.648277 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv" event={"ID":"a5911674-8d6f-4509-8507-3a8b80d66349","Type":"ContainerStarted","Data":"67d48c39ecfbfc298b6d658b35d51b23a0580655f35301b6a233f5d7c3f1b792"}
Apr 20 15:01:24.685054 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.685020 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a64a35a4-d683-4459-a09a-c300ce5b4faf-webhook-cert\") pod \"opendatahub-operator-controller-manager-99ff97f7d-zvp7j\" (UID: \"a64a35a4-d683-4459-a09a-c300ce5b4faf\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-zvp7j"
Apr 20 15:01:24.685054 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.685055 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a64a35a4-d683-4459-a09a-c300ce5b4faf-apiservice-cert\") pod \"opendatahub-operator-controller-manager-99ff97f7d-zvp7j\" (UID: \"a64a35a4-d683-4459-a09a-c300ce5b4faf\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-zvp7j"
Apr 20 15:01:24.685220 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.685129 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkn98\" (UniqueName: \"kubernetes.io/projected/a64a35a4-d683-4459-a09a-c300ce5b4faf-kube-api-access-jkn98\") pod \"opendatahub-operator-controller-manager-99ff97f7d-zvp7j\" (UID: \"a64a35a4-d683-4459-a09a-c300ce5b4faf\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-zvp7j"
Apr 20 15:01:24.785927 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.785897 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkn98\" (UniqueName: \"kubernetes.io/projected/a64a35a4-d683-4459-a09a-c300ce5b4faf-kube-api-access-jkn98\") pod \"opendatahub-operator-controller-manager-99ff97f7d-zvp7j\" (UID: \"a64a35a4-d683-4459-a09a-c300ce5b4faf\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-zvp7j"
Apr 20 15:01:24.786076 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.785950 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a64a35a4-d683-4459-a09a-c300ce5b4faf-webhook-cert\") pod \"opendatahub-operator-controller-manager-99ff97f7d-zvp7j\" (UID: \"a64a35a4-d683-4459-a09a-c300ce5b4faf\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-zvp7j"
Apr 20 15:01:24.786076 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.785970 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a64a35a4-d683-4459-a09a-c300ce5b4faf-apiservice-cert\") pod \"opendatahub-operator-controller-manager-99ff97f7d-zvp7j\" (UID: \"a64a35a4-d683-4459-a09a-c300ce5b4faf\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-zvp7j"
Apr 20 15:01:24.788418 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.788387 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a64a35a4-d683-4459-a09a-c300ce5b4faf-apiservice-cert\") pod \"opendatahub-operator-controller-manager-99ff97f7d-zvp7j\" (UID: \"a64a35a4-d683-4459-a09a-c300ce5b4faf\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-zvp7j"
Apr 20 15:01:24.788418 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.788404 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a64a35a4-d683-4459-a09a-c300ce5b4faf-webhook-cert\") pod \"opendatahub-operator-controller-manager-99ff97f7d-zvp7j\" (UID: \"a64a35a4-d683-4459-a09a-c300ce5b4faf\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-zvp7j"
Apr 20 15:01:24.794833 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.794810 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkn98\" (UniqueName: \"kubernetes.io/projected/a64a35a4-d683-4459-a09a-c300ce5b4faf-kube-api-access-jkn98\") pod \"opendatahub-operator-controller-manager-99ff97f7d-zvp7j\" (UID: \"a64a35a4-d683-4459-a09a-c300ce5b4faf\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-zvp7j"
Apr 20 15:01:24.927317 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:24.927252 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-zvp7j"
Apr 20 15:01:25.049125 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:25.049096 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-99ff97f7d-zvp7j"]
Apr 20 15:01:25.052074 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:01:25.052045 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda64a35a4_d683_4459_a09a_c300ce5b4faf.slice/crio-6d624248262b2916b8a0798f0e94fd1cc5c398372294d7c0c322f76fdc32c8f0 WatchSource:0}: Error finding container 6d624248262b2916b8a0798f0e94fd1cc5c398372294d7c0c322f76fdc32c8f0: Status 404 returned error can't find the container with id 6d624248262b2916b8a0798f0e94fd1cc5c398372294d7c0c322f76fdc32c8f0
Apr 20 15:01:25.654445 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:25.654410 2574 generic.go:358] "Generic (PLEG): container finished" podID="a5911674-8d6f-4509-8507-3a8b80d66349" containerID="c961b3b0d4dc9d7de0c3a1ba15bb6973e39c2223d4017d27011c156573c36bfd" exitCode=0
Apr 20 15:01:25.654648 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:25.654524 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv" event={"ID":"a5911674-8d6f-4509-8507-3a8b80d66349","Type":"ContainerDied","Data":"c961b3b0d4dc9d7de0c3a1ba15bb6973e39c2223d4017d27011c156573c36bfd"}
Apr 20 15:01:25.656273 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:25.656053 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-zvp7j" event={"ID":"a64a35a4-d683-4459-a09a-c300ce5b4faf","Type":"ContainerStarted","Data":"6d624248262b2916b8a0798f0e94fd1cc5c398372294d7c0c322f76fdc32c8f0"}
Apr 20 15:01:26.662611 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:26.662576 2574 generic.go:358] "Generic (PLEG): container finished" podID="a5911674-8d6f-4509-8507-3a8b80d66349" containerID="e8757bf101bf671736289507d71c4987abafbeed8c90d88c671371e8dcb2042d" exitCode=0
Apr 20 15:01:26.663013 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:26.662653 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv" event={"ID":"a5911674-8d6f-4509-8507-3a8b80d66349","Type":"ContainerDied","Data":"e8757bf101bf671736289507d71c4987abafbeed8c90d88c671371e8dcb2042d"}
Apr 20 15:01:27.667108 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:27.667074 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-zvp7j" event={"ID":"a64a35a4-d683-4459-a09a-c300ce5b4faf","Type":"ContainerStarted","Data":"d08de0c99407c51beafd326099c7b07ead8da15f6ab085633160ac1c6de0c15b"}
Apr 20 15:01:27.667532 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:27.667344 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-zvp7j"
Apr 20 15:01:27.688527 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:27.688467 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-zvp7j" podStartSLOduration=1.238064064 podStartE2EDuration="3.688446371s" podCreationTimestamp="2026-04-20 15:01:24 +0000 UTC" firstStartedPulling="2026-04-20 15:01:25.053790462 +0000 UTC m=+364.127066070" lastFinishedPulling="2026-04-20 15:01:27.504172755 +0000 UTC m=+366.577448377" observedRunningTime="2026-04-20 15:01:27.685747643 +0000 UTC m=+366.759023268" watchObservedRunningTime="2026-04-20 15:01:27.688446371 +0000 UTC m=+366.761721998"
Apr 20 15:01:27.793568 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:27.793543 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv"
Apr 20 15:01:27.912488 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:27.912455 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5911674-8d6f-4509-8507-3a8b80d66349-bundle\") pod \"a5911674-8d6f-4509-8507-3a8b80d66349\" (UID: \"a5911674-8d6f-4509-8507-3a8b80d66349\") "
Apr 20 15:01:27.912653 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:27.912508 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42jdd\" (UniqueName: \"kubernetes.io/projected/a5911674-8d6f-4509-8507-3a8b80d66349-kube-api-access-42jdd\") pod \"a5911674-8d6f-4509-8507-3a8b80d66349\" (UID: \"a5911674-8d6f-4509-8507-3a8b80d66349\") "
Apr 20 15:01:27.912653 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:27.912562 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5911674-8d6f-4509-8507-3a8b80d66349-util\") pod \"a5911674-8d6f-4509-8507-3a8b80d66349\" (UID: \"a5911674-8d6f-4509-8507-3a8b80d66349\") "
Apr 20 15:01:27.913192 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:27.913153 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5911674-8d6f-4509-8507-3a8b80d66349-bundle" (OuterVolumeSpecName: "bundle") pod "a5911674-8d6f-4509-8507-3a8b80d66349" (UID: "a5911674-8d6f-4509-8507-3a8b80d66349"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 15:01:27.914689 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:27.914660 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5911674-8d6f-4509-8507-3a8b80d66349-kube-api-access-42jdd" (OuterVolumeSpecName: "kube-api-access-42jdd") pod "a5911674-8d6f-4509-8507-3a8b80d66349" (UID: "a5911674-8d6f-4509-8507-3a8b80d66349"). InnerVolumeSpecName "kube-api-access-42jdd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 15:01:27.918267 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:27.918228 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5911674-8d6f-4509-8507-3a8b80d66349-util" (OuterVolumeSpecName: "util") pod "a5911674-8d6f-4509-8507-3a8b80d66349" (UID: "a5911674-8d6f-4509-8507-3a8b80d66349"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 15:01:28.013578 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:28.013548 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5911674-8d6f-4509-8507-3a8b80d66349-util\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\""
Apr 20 15:01:28.013578 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:28.013575 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5911674-8d6f-4509-8507-3a8b80d66349-bundle\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\""
Apr 20 15:01:28.013745 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:28.013585 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-42jdd\" (UniqueName: \"kubernetes.io/projected/a5911674-8d6f-4509-8507-3a8b80d66349-kube-api-access-42jdd\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\""
Apr 20 15:01:28.672514 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:28.672481 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv" event={"ID":"a5911674-8d6f-4509-8507-3a8b80d66349","Type":"ContainerDied","Data":"67d48c39ecfbfc298b6d658b35d51b23a0580655f35301b6a233f5d7c3f1b792"}
Apr 20 15:01:28.672514 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:28.672515 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67d48c39ecfbfc298b6d658b35d51b23a0580655f35301b6a233f5d7c3f1b792"
Apr 20 15:01:28.672514 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:28.672516 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c965hdv"
Apr 20 15:01:38.675495 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:38.675464 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-zvp7j"
Apr 20 15:01:42.895180 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:42.895146 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-85d448cc4f-qxj6q"]
Apr 20 15:01:42.895583 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:42.895467 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5911674-8d6f-4509-8507-3a8b80d66349" containerName="pull"
Apr 20 15:01:42.895583 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:42.895479 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5911674-8d6f-4509-8507-3a8b80d66349" containerName="pull"
Apr 20 15:01:42.895583 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:42.895486 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5911674-8d6f-4509-8507-3a8b80d66349" containerName="extract"
Apr 20 15:01:42.895583 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:42.895500 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5911674-8d6f-4509-8507-3a8b80d66349" containerName="extract"
Apr 20 15:01:42.895583 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:42.895512 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5911674-8d6f-4509-8507-3a8b80d66349" containerName="util"
Apr 20 15:01:42.895583 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:42.895517 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5911674-8d6f-4509-8507-3a8b80d66349" containerName="util"
Apr 20 15:01:42.895583 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:42.895574 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="a5911674-8d6f-4509-8507-3a8b80d66349" containerName="extract"
Apr 20 15:01:42.897860 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:42.897837 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-85d448cc4f-qxj6q"
Apr 20 15:01:42.900812 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:42.900793 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 20 15:01:42.900926 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:42.900815 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 20 15:01:42.901988 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:42.901970 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 20 15:01:42.902085 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:42.902062 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 20 15:01:42.902136 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:42.902123 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-fsm29\""
Apr 20 15:01:42.910589 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:42.910560 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-85d448cc4f-qxj6q"]
Apr 20 15:01:43.041825 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:43.041795 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42d57d48-3d62-4ec2-a13f-104344cd9cd1-tls-certs\") pod \"kube-auth-proxy-85d448cc4f-qxj6q\" (UID: \"42d57d48-3d62-4ec2-a13f-104344cd9cd1\") " pod="openshift-ingress/kube-auth-proxy-85d448cc4f-qxj6q"
Apr 20 15:01:43.041996 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:43.041847 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6rt9\" (UniqueName: \"kubernetes.io/projected/42d57d48-3d62-4ec2-a13f-104344cd9cd1-kube-api-access-c6rt9\") pod \"kube-auth-proxy-85d448cc4f-qxj6q\" (UID: \"42d57d48-3d62-4ec2-a13f-104344cd9cd1\") " pod="openshift-ingress/kube-auth-proxy-85d448cc4f-qxj6q"
Apr 20 15:01:43.041996 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:43.041938 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/42d57d48-3d62-4ec2-a13f-104344cd9cd1-tmp\") pod \"kube-auth-proxy-85d448cc4f-qxj6q\" (UID: \"42d57d48-3d62-4ec2-a13f-104344cd9cd1\") " pod="openshift-ingress/kube-auth-proxy-85d448cc4f-qxj6q"
Apr 20 15:01:43.143086 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:43.143058 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6rt9\" (UniqueName: \"kubernetes.io/projected/42d57d48-3d62-4ec2-a13f-104344cd9cd1-kube-api-access-c6rt9\") pod \"kube-auth-proxy-85d448cc4f-qxj6q\" (UID: \"42d57d48-3d62-4ec2-a13f-104344cd9cd1\") " pod="openshift-ingress/kube-auth-proxy-85d448cc4f-qxj6q"
Apr 20 15:01:43.143212 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:43.143093 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/42d57d48-3d62-4ec2-a13f-104344cd9cd1-tmp\") pod \"kube-auth-proxy-85d448cc4f-qxj6q\" (UID: \"42d57d48-3d62-4ec2-a13f-104344cd9cd1\") " pod="openshift-ingress/kube-auth-proxy-85d448cc4f-qxj6q"
Apr 20 15:01:43.143212 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:43.143135 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42d57d48-3d62-4ec2-a13f-104344cd9cd1-tls-certs\") pod \"kube-auth-proxy-85d448cc4f-qxj6q\" (UID: \"42d57d48-3d62-4ec2-a13f-104344cd9cd1\") " pod="openshift-ingress/kube-auth-proxy-85d448cc4f-qxj6q"
Apr 20 15:01:43.145513 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:43.145457 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/42d57d48-3d62-4ec2-a13f-104344cd9cd1-tmp\") pod \"kube-auth-proxy-85d448cc4f-qxj6q\" (UID: \"42d57d48-3d62-4ec2-a13f-104344cd9cd1\") " pod="openshift-ingress/kube-auth-proxy-85d448cc4f-qxj6q"
Apr 20 15:01:43.145609 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:43.145591 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/42d57d48-3d62-4ec2-a13f-104344cd9cd1-tls-certs\") pod \"kube-auth-proxy-85d448cc4f-qxj6q\" (UID: \"42d57d48-3d62-4ec2-a13f-104344cd9cd1\") " pod="openshift-ingress/kube-auth-proxy-85d448cc4f-qxj6q"
Apr 20 15:01:43.154118 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:43.154097 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6rt9\" (UniqueName: \"kubernetes.io/projected/42d57d48-3d62-4ec2-a13f-104344cd9cd1-kube-api-access-c6rt9\") pod \"kube-auth-proxy-85d448cc4f-qxj6q\" (UID: \"42d57d48-3d62-4ec2-a13f-104344cd9cd1\") " pod="openshift-ingress/kube-auth-proxy-85d448cc4f-qxj6q"
Apr 20 15:01:43.209016 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:43.208992 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-85d448cc4f-qxj6q"
Apr 20 15:01:43.326699 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:43.326667 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-85d448cc4f-qxj6q"]
Apr 20 15:01:43.329862 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:01:43.329833 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42d57d48_3d62_4ec2_a13f_104344cd9cd1.slice/crio-6d7866a3397755d59ae26806b73f1ade509a2d50c28e00e4355444a18216699b WatchSource:0}: Error finding container 6d7866a3397755d59ae26806b73f1ade509a2d50c28e00e4355444a18216699b: Status 404 returned error can't find the container with id 6d7866a3397755d59ae26806b73f1ade509a2d50c28e00e4355444a18216699b
Apr 20 15:01:43.720337 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:43.720296 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-85d448cc4f-qxj6q" event={"ID":"42d57d48-3d62-4ec2-a13f-104344cd9cd1","Type":"ContainerStarted","Data":"6d7866a3397755d59ae26806b73f1ade509a2d50c28e00e4355444a18216699b"}
Apr 20 15:01:44.612737 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:44.612701 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm"]
Apr 20 15:01:44.615479 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:44.615447 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm"
Apr 20 15:01:44.618059 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:44.618028 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 15:01:44.618202 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:44.618031 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ngpdb\""
Apr 20 15:01:44.619045 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:44.619025 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 15:01:44.624821 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:44.624800 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm"]
Apr 20 15:01:44.754579 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:44.754542 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bqsb\" (UniqueName: \"kubernetes.io/projected/c2755497-beb2-409a-92e3-e7cac833f48d-kube-api-access-6bqsb\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm\" (UID: \"c2755497-beb2-409a-92e3-e7cac833f48d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm"
Apr 20 15:01:44.754755 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:44.754613 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2755497-beb2-409a-92e3-e7cac833f48d-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm\" (UID: \"c2755497-beb2-409a-92e3-e7cac833f48d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm"
Apr 20 15:01:44.754755 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:44.754648 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2755497-beb2-409a-92e3-e7cac833f48d-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm\" (UID: \"c2755497-beb2-409a-92e3-e7cac833f48d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm"
Apr 20 15:01:44.855949 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:44.855866 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6bqsb\" (UniqueName: \"kubernetes.io/projected/c2755497-beb2-409a-92e3-e7cac833f48d-kube-api-access-6bqsb\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm\" (UID: \"c2755497-beb2-409a-92e3-e7cac833f48d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm"
Apr 20 15:01:44.855949 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:44.855945 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2755497-beb2-409a-92e3-e7cac833f48d-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm\" (UID: \"c2755497-beb2-409a-92e3-e7cac833f48d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm"
Apr 20 15:01:44.856191 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:44.855982 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2755497-beb2-409a-92e3-e7cac833f48d-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm\" (UID: \"c2755497-beb2-409a-92e3-e7cac833f48d\") "
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm" Apr 20 15:01:44.856447 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:44.856394 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2755497-beb2-409a-92e3-e7cac833f48d-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm\" (UID: \"c2755497-beb2-409a-92e3-e7cac833f48d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm" Apr 20 15:01:44.856586 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:44.856496 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2755497-beb2-409a-92e3-e7cac833f48d-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm\" (UID: \"c2755497-beb2-409a-92e3-e7cac833f48d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm" Apr 20 15:01:44.874690 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:44.874606 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bqsb\" (UniqueName: \"kubernetes.io/projected/c2755497-beb2-409a-92e3-e7cac833f48d-kube-api-access-6bqsb\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm\" (UID: \"c2755497-beb2-409a-92e3-e7cac833f48d\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm" Apr 20 15:01:44.928122 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:44.928083 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm" Apr 20 15:01:45.045673 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.045640 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff"] Apr 20 15:01:45.052990 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.052964 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff" Apr 20 15:01:45.056424 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.056176 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 15:01:45.056424 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.056180 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-nplmq\"" Apr 20 15:01:45.056931 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.056747 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 15:01:45.057222 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.057189 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 15:01:45.057556 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.057524 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff"] Apr 20 15:01:45.158426 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.158340 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0135a21a-fcc4-4ab0-9372-650bfce6790c-manager-config\") pod \"lws-controller-manager-6687ffb5c6-njnff\" (UID: 
\"0135a21a-fcc4-4ab0-9372-650bfce6790c\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff" Apr 20 15:01:45.158580 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.158487 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0135a21a-fcc4-4ab0-9372-650bfce6790c-cert\") pod \"lws-controller-manager-6687ffb5c6-njnff\" (UID: \"0135a21a-fcc4-4ab0-9372-650bfce6790c\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff" Apr 20 15:01:45.158580 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.158523 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0135a21a-fcc4-4ab0-9372-650bfce6790c-metrics-cert\") pod \"lws-controller-manager-6687ffb5c6-njnff\" (UID: \"0135a21a-fcc4-4ab0-9372-650bfce6790c\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff" Apr 20 15:01:45.158580 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.158556 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5xkc\" (UniqueName: \"kubernetes.io/projected/0135a21a-fcc4-4ab0-9372-650bfce6790c-kube-api-access-v5xkc\") pod \"lws-controller-manager-6687ffb5c6-njnff\" (UID: \"0135a21a-fcc4-4ab0-9372-650bfce6790c\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff" Apr 20 15:01:45.259933 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.259848 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0135a21a-fcc4-4ab0-9372-650bfce6790c-cert\") pod \"lws-controller-manager-6687ffb5c6-njnff\" (UID: \"0135a21a-fcc4-4ab0-9372-650bfce6790c\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff" Apr 20 15:01:45.259933 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.259911 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0135a21a-fcc4-4ab0-9372-650bfce6790c-metrics-cert\") pod \"lws-controller-manager-6687ffb5c6-njnff\" (UID: \"0135a21a-fcc4-4ab0-9372-650bfce6790c\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff" Apr 20 15:01:45.260230 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.259963 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5xkc\" (UniqueName: \"kubernetes.io/projected/0135a21a-fcc4-4ab0-9372-650bfce6790c-kube-api-access-v5xkc\") pod \"lws-controller-manager-6687ffb5c6-njnff\" (UID: \"0135a21a-fcc4-4ab0-9372-650bfce6790c\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff" Apr 20 15:01:45.260230 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.260035 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0135a21a-fcc4-4ab0-9372-650bfce6790c-manager-config\") pod \"lws-controller-manager-6687ffb5c6-njnff\" (UID: \"0135a21a-fcc4-4ab0-9372-650bfce6790c\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff" Apr 20 15:01:45.260877 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.260850 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/0135a21a-fcc4-4ab0-9372-650bfce6790c-manager-config\") pod \"lws-controller-manager-6687ffb5c6-njnff\" (UID: \"0135a21a-fcc4-4ab0-9372-650bfce6790c\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff" Apr 20 15:01:45.263674 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.263652 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/0135a21a-fcc4-4ab0-9372-650bfce6790c-metrics-cert\") pod \"lws-controller-manager-6687ffb5c6-njnff\" (UID: 
\"0135a21a-fcc4-4ab0-9372-650bfce6790c\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff" Apr 20 15:01:45.264077 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.264045 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0135a21a-fcc4-4ab0-9372-650bfce6790c-cert\") pod \"lws-controller-manager-6687ffb5c6-njnff\" (UID: \"0135a21a-fcc4-4ab0-9372-650bfce6790c\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff" Apr 20 15:01:45.270621 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.270597 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5xkc\" (UniqueName: \"kubernetes.io/projected/0135a21a-fcc4-4ab0-9372-650bfce6790c-kube-api-access-v5xkc\") pod \"lws-controller-manager-6687ffb5c6-njnff\" (UID: \"0135a21a-fcc4-4ab0-9372-650bfce6790c\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff" Apr 20 15:01:45.366995 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.366959 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff" Apr 20 15:01:45.487748 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.487699 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm"] Apr 20 15:01:45.511745 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:45.511719 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff"] Apr 20 15:01:46.082106 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:01:46.081910 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2755497_beb2_409a_92e3_e7cac833f48d.slice/crio-cce8c19044e495707648d517b9a354caae59f14415f7f00b3e0bc3d4437809e9 WatchSource:0}: Error finding container cce8c19044e495707648d517b9a354caae59f14415f7f00b3e0bc3d4437809e9: Status 404 returned error can't find the container with id cce8c19044e495707648d517b9a354caae59f14415f7f00b3e0bc3d4437809e9 Apr 20 15:01:46.734073 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:46.734033 2574 generic.go:358] "Generic (PLEG): container finished" podID="c2755497-beb2-409a-92e3-e7cac833f48d" containerID="35e2d1a0462e55ac8f4541ea41a34ad2e48082c80032c4e81d0da5385fb9fd6d" exitCode=0 Apr 20 15:01:46.734263 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:46.734125 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm" event={"ID":"c2755497-beb2-409a-92e3-e7cac833f48d","Type":"ContainerDied","Data":"35e2d1a0462e55ac8f4541ea41a34ad2e48082c80032c4e81d0da5385fb9fd6d"} Apr 20 15:01:46.734263 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:46.734166 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm" 
event={"ID":"c2755497-beb2-409a-92e3-e7cac833f48d","Type":"ContainerStarted","Data":"cce8c19044e495707648d517b9a354caae59f14415f7f00b3e0bc3d4437809e9"} Apr 20 15:01:46.736144 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:46.736117 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-85d448cc4f-qxj6q" event={"ID":"42d57d48-3d62-4ec2-a13f-104344cd9cd1","Type":"ContainerStarted","Data":"51538046ed283a29d42770db0a4b1662364299cc330c561b3b50f176efe8b7ea"} Apr 20 15:01:46.737346 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:46.737290 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff" event={"ID":"0135a21a-fcc4-4ab0-9372-650bfce6790c","Type":"ContainerStarted","Data":"d2269412ca40cf76be64d112346b76828b2636fbbaab438ceeec6a56a49fd052"} Apr 20 15:01:46.774641 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:46.774587 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-85d448cc4f-qxj6q" podStartSLOduration=1.987371494 podStartE2EDuration="4.774570149s" podCreationTimestamp="2026-04-20 15:01:42 +0000 UTC" firstStartedPulling="2026-04-20 15:01:43.331434446 +0000 UTC m=+382.404710052" lastFinishedPulling="2026-04-20 15:01:46.118633104 +0000 UTC m=+385.191908707" observedRunningTime="2026-04-20 15:01:46.772849408 +0000 UTC m=+385.846125034" watchObservedRunningTime="2026-04-20 15:01:46.774570149 +0000 UTC m=+385.847845777" Apr 20 15:01:47.747409 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:47.742727 2574 generic.go:358] "Generic (PLEG): container finished" podID="c2755497-beb2-409a-92e3-e7cac833f48d" containerID="df9c621b783b59deae940b4c984011f79725a57e2a7b1b67582281d944410d09" exitCode=0 Apr 20 15:01:47.747409 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:47.742837 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm" event={"ID":"c2755497-beb2-409a-92e3-e7cac833f48d","Type":"ContainerDied","Data":"df9c621b783b59deae940b4c984011f79725a57e2a7b1b67582281d944410d09"} Apr 20 15:01:47.749507 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:47.749474 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff" event={"ID":"0135a21a-fcc4-4ab0-9372-650bfce6790c","Type":"ContainerStarted","Data":"11f0a85fbf52c2e7f2155c52c37f53baa936eebcf31a74116b96755921d5235e"} Apr 20 15:01:47.777264 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:47.777221 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff" podStartSLOduration=1.44939181 podStartE2EDuration="2.777207195s" podCreationTimestamp="2026-04-20 15:01:45 +0000 UTC" firstStartedPulling="2026-04-20 15:01:46.085939161 +0000 UTC m=+385.159214765" lastFinishedPulling="2026-04-20 15:01:47.413754529 +0000 UTC m=+386.487030150" observedRunningTime="2026-04-20 15:01:47.775671476 +0000 UTC m=+386.848947112" watchObservedRunningTime="2026-04-20 15:01:47.777207195 +0000 UTC m=+386.850482821" Apr 20 15:01:48.754820 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:48.754782 2574 generic.go:358] "Generic (PLEG): container finished" podID="c2755497-beb2-409a-92e3-e7cac833f48d" containerID="9e0393af184046a51965da09f9f851b9c8793017383aafd77e38faac2283505f" exitCode=0 Apr 20 15:01:48.755246 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:48.754851 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm" event={"ID":"c2755497-beb2-409a-92e3-e7cac833f48d","Type":"ContainerDied","Data":"9e0393af184046a51965da09f9f851b9c8793017383aafd77e38faac2283505f"} Apr 20 15:01:48.755246 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:48.754936 
2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff" Apr 20 15:01:49.876647 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:49.876628 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm" Apr 20 15:01:49.998999 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:49.998967 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bqsb\" (UniqueName: \"kubernetes.io/projected/c2755497-beb2-409a-92e3-e7cac833f48d-kube-api-access-6bqsb\") pod \"c2755497-beb2-409a-92e3-e7cac833f48d\" (UID: \"c2755497-beb2-409a-92e3-e7cac833f48d\") " Apr 20 15:01:49.998999 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:49.999005 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2755497-beb2-409a-92e3-e7cac833f48d-util\") pod \"c2755497-beb2-409a-92e3-e7cac833f48d\" (UID: \"c2755497-beb2-409a-92e3-e7cac833f48d\") " Apr 20 15:01:49.999207 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:49.999051 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2755497-beb2-409a-92e3-e7cac833f48d-bundle\") pod \"c2755497-beb2-409a-92e3-e7cac833f48d\" (UID: \"c2755497-beb2-409a-92e3-e7cac833f48d\") " Apr 20 15:01:50.000187 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:50.000146 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2755497-beb2-409a-92e3-e7cac833f48d-bundle" (OuterVolumeSpecName: "bundle") pod "c2755497-beb2-409a-92e3-e7cac833f48d" (UID: "c2755497-beb2-409a-92e3-e7cac833f48d"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:01:50.001137 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:50.001108 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2755497-beb2-409a-92e3-e7cac833f48d-kube-api-access-6bqsb" (OuterVolumeSpecName: "kube-api-access-6bqsb") pod "c2755497-beb2-409a-92e3-e7cac833f48d" (UID: "c2755497-beb2-409a-92e3-e7cac833f48d"). InnerVolumeSpecName "kube-api-access-6bqsb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:01:50.004566 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:50.004543 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2755497-beb2-409a-92e3-e7cac833f48d-util" (OuterVolumeSpecName: "util") pod "c2755497-beb2-409a-92e3-e7cac833f48d" (UID: "c2755497-beb2-409a-92e3-e7cac833f48d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:01:50.099778 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:50.099751 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6bqsb\" (UniqueName: \"kubernetes.io/projected/c2755497-beb2-409a-92e3-e7cac833f48d-kube-api-access-6bqsb\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:01:50.099778 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:50.099775 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2755497-beb2-409a-92e3-e7cac833f48d-util\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:01:50.099964 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:50.099785 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2755497-beb2-409a-92e3-e7cac833f48d-bundle\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:01:50.762439 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:50.762410 2574 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm" Apr 20 15:01:50.762631 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:50.762409 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835qt8qm" event={"ID":"c2755497-beb2-409a-92e3-e7cac833f48d","Type":"ContainerDied","Data":"cce8c19044e495707648d517b9a354caae59f14415f7f00b3e0bc3d4437809e9"} Apr 20 15:01:50.762631 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:50.762515 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cce8c19044e495707648d517b9a354caae59f14415f7f00b3e0bc3d4437809e9" Apr 20 15:01:59.522420 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.522392 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv"] Apr 20 15:01:59.522793 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.522706 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2755497-beb2-409a-92e3-e7cac833f48d" containerName="util" Apr 20 15:01:59.522793 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.522718 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2755497-beb2-409a-92e3-e7cac833f48d" containerName="util" Apr 20 15:01:59.522793 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.522728 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2755497-beb2-409a-92e3-e7cac833f48d" containerName="pull" Apr 20 15:01:59.522793 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.522733 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2755497-beb2-409a-92e3-e7cac833f48d" containerName="pull" Apr 20 15:01:59.522793 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.522741 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="c2755497-beb2-409a-92e3-e7cac833f48d" containerName="extract" Apr 20 15:01:59.522793 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.522746 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2755497-beb2-409a-92e3-e7cac833f48d" containerName="extract" Apr 20 15:01:59.522988 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.522807 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2755497-beb2-409a-92e3-e7cac833f48d" containerName="extract" Apr 20 15:01:59.541279 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.541256 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv" Apr 20 15:01:59.546067 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.546038 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 15:01:59.546578 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.546561 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 15:01:59.547866 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.547845 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ngpdb\"" Apr 20 15:01:59.550414 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.550393 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv"] Apr 20 15:01:59.572258 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.572234 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv\" (UID: 
\"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv" Apr 20 15:01:59.572354 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.572290 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cgjk\" (UniqueName: \"kubernetes.io/projected/7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9-kube-api-access-4cgjk\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv\" (UID: \"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv" Apr 20 15:01:59.572354 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.572313 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv\" (UID: \"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv" Apr 20 15:01:59.673418 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.673353 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4cgjk\" (UniqueName: \"kubernetes.io/projected/7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9-kube-api-access-4cgjk\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv\" (UID: \"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv" Apr 20 15:01:59.673418 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.673416 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv\" (UID: 
\"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv" Apr 20 15:01:59.673661 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.673459 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv\" (UID: \"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv" Apr 20 15:01:59.673798 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.673778 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv\" (UID: \"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv" Apr 20 15:01:59.673866 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.673832 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv\" (UID: \"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv" Apr 20 15:01:59.681919 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.681893 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cgjk\" (UniqueName: \"kubernetes.io/projected/7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9-kube-api-access-4cgjk\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv\" (UID: \"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv" Apr 20 15:01:59.760003 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.759975 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-njnff" Apr 20 15:01:59.851232 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.851201 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv" Apr 20 15:01:59.979251 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:01:59.979220 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv"] Apr 20 15:01:59.982246 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:01:59.982211 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7abce5fb_32ca_4d0a_9eb9_7e97ba995dd9.slice/crio-93db902b5cc82d22bbc67c9ec62c42e6c1696bae332a3d98aff6cccf48e21ecb WatchSource:0}: Error finding container 93db902b5cc82d22bbc67c9ec62c42e6c1696bae332a3d98aff6cccf48e21ecb: Status 404 returned error can't find the container with id 93db902b5cc82d22bbc67c9ec62c42e6c1696bae332a3d98aff6cccf48e21ecb Apr 20 15:02:00.797438 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:00.797393 2574 generic.go:358] "Generic (PLEG): container finished" podID="7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9" containerID="952959a4f7038ca2c4a5b86c484e2b6502a31869532df31bc006e20d69b7c60b" exitCode=0 Apr 20 15:02:00.797804 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:00.797462 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv" 
event={"ID":"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9","Type":"ContainerDied","Data":"952959a4f7038ca2c4a5b86c484e2b6502a31869532df31bc006e20d69b7c60b"} Apr 20 15:02:00.797804 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:00.797497 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv" event={"ID":"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9","Type":"ContainerStarted","Data":"93db902b5cc82d22bbc67c9ec62c42e6c1696bae332a3d98aff6cccf48e21ecb"} Apr 20 15:02:01.802445 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:01.802349 2574 generic.go:358] "Generic (PLEG): container finished" podID="7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9" containerID="64b7ac0a75b3f3b58b3c09da0df8aeb1d6837f0f0a19effe0fbfa6946bf2c5ed" exitCode=0 Apr 20 15:02:01.802445 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:01.802433 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv" event={"ID":"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9","Type":"ContainerDied","Data":"64b7ac0a75b3f3b58b3c09da0df8aeb1d6837f0f0a19effe0fbfa6946bf2c5ed"} Apr 20 15:02:02.807582 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:02.807552 2574 generic.go:358] "Generic (PLEG): container finished" podID="7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9" containerID="1b3ecf258af597edbca3dfd4f17cade0bdff4792efa918fa319abaa3a48904f2" exitCode=0 Apr 20 15:02:02.807955 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:02.807642 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv" event={"ID":"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9","Type":"ContainerDied","Data":"1b3ecf258af597edbca3dfd4f17cade0bdff4792efa918fa319abaa3a48904f2"} Apr 20 15:02:03.935093 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:03.935073 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv" Apr 20 15:02:04.006595 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:04.006572 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9-bundle\") pod \"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9\" (UID: \"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9\") " Apr 20 15:02:04.006777 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:04.006618 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cgjk\" (UniqueName: \"kubernetes.io/projected/7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9-kube-api-access-4cgjk\") pod \"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9\" (UID: \"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9\") " Apr 20 15:02:04.006777 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:04.006663 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9-util\") pod \"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9\" (UID: \"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9\") " Apr 20 15:02:04.007839 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:04.007810 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9-bundle" (OuterVolumeSpecName: "bundle") pod "7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9" (UID: "7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:02:04.008787 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:04.008765 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9-kube-api-access-4cgjk" (OuterVolumeSpecName: "kube-api-access-4cgjk") pod "7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9" (UID: "7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9"). InnerVolumeSpecName "kube-api-access-4cgjk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:02:04.012133 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:04.012112 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9-util" (OuterVolumeSpecName: "util") pod "7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9" (UID: "7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:02:04.107781 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:04.107719 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9-util\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:02:04.107781 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:04.107742 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9-bundle\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:02:04.107781 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:04.107753 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4cgjk\" (UniqueName: \"kubernetes.io/projected/7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9-kube-api-access-4cgjk\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:02:04.815608 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:04.815573 2574 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv" event={"ID":"7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9","Type":"ContainerDied","Data":"93db902b5cc82d22bbc67c9ec62c42e6c1696bae332a3d98aff6cccf48e21ecb"} Apr 20 15:02:04.815608 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:04.815608 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93db902b5cc82d22bbc67c9ec62c42e6c1696bae332a3d98aff6cccf48e21ecb" Apr 20 15:02:04.815608 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:04.815591 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w4wvv" Apr 20 15:02:58.286243 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.286208 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt"] Apr 20 15:02:58.286712 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.286543 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9" containerName="pull" Apr 20 15:02:58.286712 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.286555 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9" containerName="pull" Apr 20 15:02:58.286712 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.286568 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9" containerName="extract" Apr 20 15:02:58.286712 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.286573 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9" containerName="extract" Apr 20 15:02:58.286712 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.286584 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9" containerName="util" Apr 20 15:02:58.286712 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.286589 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9" containerName="util" Apr 20 15:02:58.286712 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.286643 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="7abce5fb-32ca-4d0a-9eb9-7e97ba995dd9" containerName="extract" Apr 20 15:02:58.288798 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.288783 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt" Apr 20 15:02:58.291434 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.291413 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 15:02:58.291535 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.291414 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 15:02:58.292505 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.292489 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-jwvf8\"" Apr 20 15:02:58.298014 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.297989 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt"] Apr 20 15:02:58.337793 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.337765 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9be176a2-ed3f-43c0-9883-1dac2ee5912a-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt\" (UID: \"9be176a2-ed3f-43c0-9883-1dac2ee5912a\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt" Apr 20 15:02:58.337935 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.337802 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vkxw\" (UniqueName: \"kubernetes.io/projected/9be176a2-ed3f-43c0-9883-1dac2ee5912a-kube-api-access-9vkxw\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt\" (UID: \"9be176a2-ed3f-43c0-9883-1dac2ee5912a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt" Apr 20 15:02:58.337935 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.337901 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9be176a2-ed3f-43c0-9883-1dac2ee5912a-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt\" (UID: \"9be176a2-ed3f-43c0-9883-1dac2ee5912a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt" Apr 20 15:02:58.438769 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.438735 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9be176a2-ed3f-43c0-9883-1dac2ee5912a-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt\" (UID: \"9be176a2-ed3f-43c0-9883-1dac2ee5912a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt" Apr 20 15:02:58.438905 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.438791 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9be176a2-ed3f-43c0-9883-1dac2ee5912a-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt\" (UID: \"9be176a2-ed3f-43c0-9883-1dac2ee5912a\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt" Apr 20 15:02:58.438905 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.438815 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vkxw\" (UniqueName: \"kubernetes.io/projected/9be176a2-ed3f-43c0-9883-1dac2ee5912a-kube-api-access-9vkxw\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt\" (UID: \"9be176a2-ed3f-43c0-9883-1dac2ee5912a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt" Apr 20 15:02:58.439160 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.439136 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9be176a2-ed3f-43c0-9883-1dac2ee5912a-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt\" (UID: \"9be176a2-ed3f-43c0-9883-1dac2ee5912a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt" Apr 20 15:02:58.439160 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.439149 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9be176a2-ed3f-43c0-9883-1dac2ee5912a-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt\" (UID: \"9be176a2-ed3f-43c0-9883-1dac2ee5912a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt" Apr 20 15:02:58.446279 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.446259 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vkxw\" (UniqueName: \"kubernetes.io/projected/9be176a2-ed3f-43c0-9883-1dac2ee5912a-kube-api-access-9vkxw\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt\" (UID: \"9be176a2-ed3f-43c0-9883-1dac2ee5912a\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt" Apr 20 
15:02:58.598376 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.598337 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt" Apr 20 15:02:58.721589 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:58.721565 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt"] Apr 20 15:02:58.724065 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:02:58.724035 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9be176a2_ed3f_43c0_9883_1dac2ee5912a.slice/crio-922cfab1651aaad6413f37a0b488cc08e3ec443c54b83e4497b89d88cc57fd8b WatchSource:0}: Error finding container 922cfab1651aaad6413f37a0b488cc08e3ec443c54b83e4497b89d88cc57fd8b: Status 404 returned error can't find the container with id 922cfab1651aaad6413f37a0b488cc08e3ec443c54b83e4497b89d88cc57fd8b Apr 20 15:02:59.015676 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.015592 2574 generic.go:358] "Generic (PLEG): container finished" podID="9be176a2-ed3f-43c0-9883-1dac2ee5912a" containerID="5057f523b7c3fc333ae3069579be2819cd47693443957ae408e3491280e475cd" exitCode=0 Apr 20 15:02:59.015676 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.015647 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt" event={"ID":"9be176a2-ed3f-43c0-9883-1dac2ee5912a","Type":"ContainerDied","Data":"5057f523b7c3fc333ae3069579be2819cd47693443957ae408e3491280e475cd"} Apr 20 15:02:59.015854 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.015680 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt" 
event={"ID":"9be176a2-ed3f-43c0-9883-1dac2ee5912a","Type":"ContainerStarted","Data":"922cfab1651aaad6413f37a0b488cc08e3ec443c54b83e4497b89d88cc57fd8b"} Apr 20 15:02:59.091442 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.091410 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9"] Apr 20 15:02:59.093827 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.093811 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9" Apr 20 15:02:59.102284 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.102259 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9"] Apr 20 15:02:59.144490 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.144466 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90d7cc5c-aebb-43f8-880a-22f7a4c687dd-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9\" (UID: \"90d7cc5c-aebb-43f8-880a-22f7a4c687dd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9" Apr 20 15:02:59.144579 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.144523 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90d7cc5c-aebb-43f8-880a-22f7a4c687dd-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9\" (UID: \"90d7cc5c-aebb-43f8-880a-22f7a4c687dd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9" Apr 20 15:02:59.144630 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.144602 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-h28kj\" (UniqueName: \"kubernetes.io/projected/90d7cc5c-aebb-43f8-880a-22f7a4c687dd-kube-api-access-h28kj\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9\" (UID: \"90d7cc5c-aebb-43f8-880a-22f7a4c687dd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9" Apr 20 15:02:59.245137 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.245109 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90d7cc5c-aebb-43f8-880a-22f7a4c687dd-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9\" (UID: \"90d7cc5c-aebb-43f8-880a-22f7a4c687dd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9" Apr 20 15:02:59.245255 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.245165 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h28kj\" (UniqueName: \"kubernetes.io/projected/90d7cc5c-aebb-43f8-880a-22f7a4c687dd-kube-api-access-h28kj\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9\" (UID: \"90d7cc5c-aebb-43f8-880a-22f7a4c687dd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9" Apr 20 15:02:59.245255 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.245214 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90d7cc5c-aebb-43f8-880a-22f7a4c687dd-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9\" (UID: \"90d7cc5c-aebb-43f8-880a-22f7a4c687dd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9" Apr 20 15:02:59.245506 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.245487 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/90d7cc5c-aebb-43f8-880a-22f7a4c687dd-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9\" (UID: \"90d7cc5c-aebb-43f8-880a-22f7a4c687dd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9" Apr 20 15:02:59.245595 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.245567 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90d7cc5c-aebb-43f8-880a-22f7a4c687dd-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9\" (UID: \"90d7cc5c-aebb-43f8-880a-22f7a4c687dd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9" Apr 20 15:02:59.253599 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.253582 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h28kj\" (UniqueName: \"kubernetes.io/projected/90d7cc5c-aebb-43f8-880a-22f7a4c687dd-kube-api-access-h28kj\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9\" (UID: \"90d7cc5c-aebb-43f8-880a-22f7a4c687dd\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9" Apr 20 15:02:59.403508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.403476 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9" Apr 20 15:02:59.522825 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.522800 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9"] Apr 20 15:02:59.552750 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:02:59.552714 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90d7cc5c_aebb_43f8_880a_22f7a4c687dd.slice/crio-ef48586c7f2a83f267964ca01f39956f6503b70364e8beac1b729c702993bbf6 WatchSource:0}: Error finding container ef48586c7f2a83f267964ca01f39956f6503b70364e8beac1b729c702993bbf6: Status 404 returned error can't find the container with id ef48586c7f2a83f267964ca01f39956f6503b70364e8beac1b729c702993bbf6 Apr 20 15:02:59.690442 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.690420 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn"] Apr 20 15:02:59.699259 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.699236 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn" Apr 20 15:02:59.701329 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.701303 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn"] Apr 20 15:02:59.749850 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.749827 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5996cc60-3e9b-4037-bd19-59b04fe500a8-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn\" (UID: \"5996cc60-3e9b-4037-bd19-59b04fe500a8\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn" Apr 20 15:02:59.749953 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.749870 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5996cc60-3e9b-4037-bd19-59b04fe500a8-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn\" (UID: \"5996cc60-3e9b-4037-bd19-59b04fe500a8\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn" Apr 20 15:02:59.749953 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.749917 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7vr6\" (UniqueName: \"kubernetes.io/projected/5996cc60-3e9b-4037-bd19-59b04fe500a8-kube-api-access-c7vr6\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn\" (UID: \"5996cc60-3e9b-4037-bd19-59b04fe500a8\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn" Apr 20 15:02:59.850435 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.850400 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/5996cc60-3e9b-4037-bd19-59b04fe500a8-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn\" (UID: \"5996cc60-3e9b-4037-bd19-59b04fe500a8\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn" Apr 20 15:02:59.850578 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.850445 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7vr6\" (UniqueName: \"kubernetes.io/projected/5996cc60-3e9b-4037-bd19-59b04fe500a8-kube-api-access-c7vr6\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn\" (UID: \"5996cc60-3e9b-4037-bd19-59b04fe500a8\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn" Apr 20 15:02:59.850578 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.850515 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5996cc60-3e9b-4037-bd19-59b04fe500a8-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn\" (UID: \"5996cc60-3e9b-4037-bd19-59b04fe500a8\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn" Apr 20 15:02:59.850719 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.850702 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5996cc60-3e9b-4037-bd19-59b04fe500a8-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn\" (UID: \"5996cc60-3e9b-4037-bd19-59b04fe500a8\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn" Apr 20 15:02:59.850758 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.850745 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5996cc60-3e9b-4037-bd19-59b04fe500a8-util\") pod 
\"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn\" (UID: \"5996cc60-3e9b-4037-bd19-59b04fe500a8\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn" Apr 20 15:02:59.858238 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:02:59.858220 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7vr6\" (UniqueName: \"kubernetes.io/projected/5996cc60-3e9b-4037-bd19-59b04fe500a8-kube-api-access-c7vr6\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn\" (UID: \"5996cc60-3e9b-4037-bd19-59b04fe500a8\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn" Apr 20 15:03:00.021038 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.020958 2574 generic.go:358] "Generic (PLEG): container finished" podID="9be176a2-ed3f-43c0-9883-1dac2ee5912a" containerID="0ce21adf8a99e77936c99af36744e9885e995dca146ad0c0778560bd75f2e7a1" exitCode=0 Apr 20 15:03:00.021186 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.021055 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt" event={"ID":"9be176a2-ed3f-43c0-9883-1dac2ee5912a","Type":"ContainerDied","Data":"0ce21adf8a99e77936c99af36744e9885e995dca146ad0c0778560bd75f2e7a1"} Apr 20 15:03:00.022396 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.022360 2574 generic.go:358] "Generic (PLEG): container finished" podID="90d7cc5c-aebb-43f8-880a-22f7a4c687dd" containerID="b12fbcf2ba5eb63f5e1e3914123f4552a5db4defa25ee8aa9105c5fe999798e4" exitCode=0 Apr 20 15:03:00.022471 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.022400 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9" event={"ID":"90d7cc5c-aebb-43f8-880a-22f7a4c687dd","Type":"ContainerDied","Data":"b12fbcf2ba5eb63f5e1e3914123f4552a5db4defa25ee8aa9105c5fe999798e4"} Apr 20 
15:03:00.022471 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.022432 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9" event={"ID":"90d7cc5c-aebb-43f8-880a-22f7a4c687dd","Type":"ContainerStarted","Data":"ef48586c7f2a83f267964ca01f39956f6503b70364e8beac1b729c702993bbf6"} Apr 20 15:03:00.076661 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.076640 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn" Apr 20 15:03:00.091265 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.091240 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj"] Apr 20 15:03:00.095316 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.095298 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj" Apr 20 15:03:00.106284 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.106239 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj"] Apr 20 15:03:00.153228 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.153196 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6hvg\" (UniqueName: \"kubernetes.io/projected/c5c2466d-2df4-4d90-9496-8107320cda01-kube-api-access-q6hvg\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj\" (UID: \"c5c2466d-2df4-4d90-9496-8107320cda01\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj" Apr 20 15:03:00.153343 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.153311 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5c2466d-2df4-4d90-9496-8107320cda01-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj\" (UID: \"c5c2466d-2df4-4d90-9496-8107320cda01\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj" Apr 20 15:03:00.153396 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.153343 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5c2466d-2df4-4d90-9496-8107320cda01-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj\" (UID: \"c5c2466d-2df4-4d90-9496-8107320cda01\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj" Apr 20 15:03:00.204190 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.204167 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn"] Apr 20 15:03:00.205928 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:03:00.205904 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5996cc60_3e9b_4037_bd19_59b04fe500a8.slice/crio-66d42d2b0822f998edba2b2dba195958ecb7dfccbac5199358458df035471315 WatchSource:0}: Error finding container 66d42d2b0822f998edba2b2dba195958ecb7dfccbac5199358458df035471315: Status 404 returned error can't find the container with id 66d42d2b0822f998edba2b2dba195958ecb7dfccbac5199358458df035471315 Apr 20 15:03:00.253988 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.253965 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5c2466d-2df4-4d90-9496-8107320cda01-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj\" (UID: \"c5c2466d-2df4-4d90-9496-8107320cda01\") " 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj" Apr 20 15:03:00.254068 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.253994 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5c2466d-2df4-4d90-9496-8107320cda01-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj\" (UID: \"c5c2466d-2df4-4d90-9496-8107320cda01\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj" Apr 20 15:03:00.254068 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.254039 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6hvg\" (UniqueName: \"kubernetes.io/projected/c5c2466d-2df4-4d90-9496-8107320cda01-kube-api-access-q6hvg\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj\" (UID: \"c5c2466d-2df4-4d90-9496-8107320cda01\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj" Apr 20 15:03:00.254293 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.254276 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5c2466d-2df4-4d90-9496-8107320cda01-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj\" (UID: \"c5c2466d-2df4-4d90-9496-8107320cda01\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj" Apr 20 15:03:00.254327 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.254306 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5c2466d-2df4-4d90-9496-8107320cda01-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj\" (UID: \"c5c2466d-2df4-4d90-9496-8107320cda01\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj" Apr 20 15:03:00.261648 
ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.261629 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6hvg\" (UniqueName: \"kubernetes.io/projected/c5c2466d-2df4-4d90-9496-8107320cda01-kube-api-access-q6hvg\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj\" (UID: \"c5c2466d-2df4-4d90-9496-8107320cda01\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj" Apr 20 15:03:00.405970 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.405937 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj" Apr 20 15:03:00.525825 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:00.525801 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj"] Apr 20 15:03:00.527592 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:03:00.527563 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5c2466d_2df4_4d90_9496_8107320cda01.slice/crio-8bf50ea2388d9250a5e2a9a4bdd137ca9eb05d118e357bf6fe84e7438eaa8e46 WatchSource:0}: Error finding container 8bf50ea2388d9250a5e2a9a4bdd137ca9eb05d118e357bf6fe84e7438eaa8e46: Status 404 returned error can't find the container with id 8bf50ea2388d9250a5e2a9a4bdd137ca9eb05d118e357bf6fe84e7438eaa8e46 Apr 20 15:03:01.027969 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:01.027930 2574 generic.go:358] "Generic (PLEG): container finished" podID="90d7cc5c-aebb-43f8-880a-22f7a4c687dd" containerID="f655dd5a3bafde950e024fc638d2068262c65a1acf81d083ab7f0d3613e4be7a" exitCode=0 Apr 20 15:03:01.028148 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:01.028019 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9" 
event={"ID":"90d7cc5c-aebb-43f8-880a-22f7a4c687dd","Type":"ContainerDied","Data":"f655dd5a3bafde950e024fc638d2068262c65a1acf81d083ab7f0d3613e4be7a"} Apr 20 15:03:01.029453 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:01.029425 2574 generic.go:358] "Generic (PLEG): container finished" podID="5996cc60-3e9b-4037-bd19-59b04fe500a8" containerID="6bb0328db7302a9fb9652d249c34fe074ccdaca27cf965b140169616b6afb720" exitCode=0 Apr 20 15:03:01.029593 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:01.029506 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn" event={"ID":"5996cc60-3e9b-4037-bd19-59b04fe500a8","Type":"ContainerDied","Data":"6bb0328db7302a9fb9652d249c34fe074ccdaca27cf965b140169616b6afb720"} Apr 20 15:03:01.029593 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:01.029536 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn" event={"ID":"5996cc60-3e9b-4037-bd19-59b04fe500a8","Type":"ContainerStarted","Data":"66d42d2b0822f998edba2b2dba195958ecb7dfccbac5199358458df035471315"} Apr 20 15:03:01.030970 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:01.030946 2574 generic.go:358] "Generic (PLEG): container finished" podID="c5c2466d-2df4-4d90-9496-8107320cda01" containerID="4319464d66c584f98f23815f3552cecb88bb869e81d4d4789c4d32bee265821c" exitCode=0 Apr 20 15:03:01.031079 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:01.031032 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj" event={"ID":"c5c2466d-2df4-4d90-9496-8107320cda01","Type":"ContainerDied","Data":"4319464d66c584f98f23815f3552cecb88bb869e81d4d4789c4d32bee265821c"} Apr 20 15:03:01.031079 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:01.031058 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj" event={"ID":"c5c2466d-2df4-4d90-9496-8107320cda01","Type":"ContainerStarted","Data":"8bf50ea2388d9250a5e2a9a4bdd137ca9eb05d118e357bf6fe84e7438eaa8e46"} Apr 20 15:03:01.040361 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:01.040333 2574 generic.go:358] "Generic (PLEG): container finished" podID="9be176a2-ed3f-43c0-9883-1dac2ee5912a" containerID="d7fe42fb0379900be3e4bc91a8a5d7c5617e0295f4dafeec2188486180d4ce8a" exitCode=0 Apr 20 15:03:01.040496 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:01.040418 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt" event={"ID":"9be176a2-ed3f-43c0-9883-1dac2ee5912a","Type":"ContainerDied","Data":"d7fe42fb0379900be3e4bc91a8a5d7c5617e0295f4dafeec2188486180d4ce8a"} Apr 20 15:03:02.047188 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:02.047159 2574 generic.go:358] "Generic (PLEG): container finished" podID="90d7cc5c-aebb-43f8-880a-22f7a4c687dd" containerID="18ecac4eb3a7461a6827d17812bba06c9ffd44dcffad4e0b7fac72b8f9494a4f" exitCode=0 Apr 20 15:03:02.047583 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:02.047249 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9" event={"ID":"90d7cc5c-aebb-43f8-880a-22f7a4c687dd","Type":"ContainerDied","Data":"18ecac4eb3a7461a6827d17812bba06c9ffd44dcffad4e0b7fac72b8f9494a4f"} Apr 20 15:03:02.048989 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:02.048864 2574 generic.go:358] "Generic (PLEG): container finished" podID="c5c2466d-2df4-4d90-9496-8107320cda01" containerID="ab105e00a10c973b0bc1d171b538c79ca2bd769b86868eb45350a2115a287630" exitCode=0 Apr 20 15:03:02.048989 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:02.048941 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj" event={"ID":"c5c2466d-2df4-4d90-9496-8107320cda01","Type":"ContainerDied","Data":"ab105e00a10c973b0bc1d171b538c79ca2bd769b86868eb45350a2115a287630"} Apr 20 15:03:02.170942 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:02.170908 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt" Apr 20 15:03:02.269978 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:02.269952 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vkxw\" (UniqueName: \"kubernetes.io/projected/9be176a2-ed3f-43c0-9883-1dac2ee5912a-kube-api-access-9vkxw\") pod \"9be176a2-ed3f-43c0-9883-1dac2ee5912a\" (UID: \"9be176a2-ed3f-43c0-9883-1dac2ee5912a\") " Apr 20 15:03:02.270102 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:02.269980 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9be176a2-ed3f-43c0-9883-1dac2ee5912a-util\") pod \"9be176a2-ed3f-43c0-9883-1dac2ee5912a\" (UID: \"9be176a2-ed3f-43c0-9883-1dac2ee5912a\") " Apr 20 15:03:02.270102 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:02.270030 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9be176a2-ed3f-43c0-9883-1dac2ee5912a-bundle\") pod \"9be176a2-ed3f-43c0-9883-1dac2ee5912a\" (UID: \"9be176a2-ed3f-43c0-9883-1dac2ee5912a\") " Apr 20 15:03:02.270628 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:02.270600 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9be176a2-ed3f-43c0-9883-1dac2ee5912a-bundle" (OuterVolumeSpecName: "bundle") pod "9be176a2-ed3f-43c0-9883-1dac2ee5912a" (UID: "9be176a2-ed3f-43c0-9883-1dac2ee5912a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:03:02.271956 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:02.271934 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9be176a2-ed3f-43c0-9883-1dac2ee5912a-kube-api-access-9vkxw" (OuterVolumeSpecName: "kube-api-access-9vkxw") pod "9be176a2-ed3f-43c0-9883-1dac2ee5912a" (UID: "9be176a2-ed3f-43c0-9883-1dac2ee5912a"). InnerVolumeSpecName "kube-api-access-9vkxw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:03:02.275608 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:02.275588 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9be176a2-ed3f-43c0-9883-1dac2ee5912a-util" (OuterVolumeSpecName: "util") pod "9be176a2-ed3f-43c0-9883-1dac2ee5912a" (UID: "9be176a2-ed3f-43c0-9883-1dac2ee5912a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:03:02.371573 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:02.371513 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9be176a2-ed3f-43c0-9883-1dac2ee5912a-bundle\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:03:02.371573 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:02.371536 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9vkxw\" (UniqueName: \"kubernetes.io/projected/9be176a2-ed3f-43c0-9883-1dac2ee5912a-kube-api-access-9vkxw\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:03:02.371573 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:02.371545 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9be176a2-ed3f-43c0-9883-1dac2ee5912a-util\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:03:03.053996 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:03.053964 2574 generic.go:358] "Generic 
(PLEG): container finished" podID="5996cc60-3e9b-4037-bd19-59b04fe500a8" containerID="5e8662be9b6f0beddd17f0803baaf5f24d997bc31b1518a1f5104e9a666f5859" exitCode=0 Apr 20 15:03:03.054627 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:03.054030 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn" event={"ID":"5996cc60-3e9b-4037-bd19-59b04fe500a8","Type":"ContainerDied","Data":"5e8662be9b6f0beddd17f0803baaf5f24d997bc31b1518a1f5104e9a666f5859"} Apr 20 15:03:03.055915 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:03.055880 2574 generic.go:358] "Generic (PLEG): container finished" podID="c5c2466d-2df4-4d90-9496-8107320cda01" containerID="53071257118dcb015c60ae1ade2084ced36b7fe297d62c2bc7ff20636ca6c61e" exitCode=0 Apr 20 15:03:03.056002 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:03.055983 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj" event={"ID":"c5c2466d-2df4-4d90-9496-8107320cda01","Type":"ContainerDied","Data":"53071257118dcb015c60ae1ade2084ced36b7fe297d62c2bc7ff20636ca6c61e"} Apr 20 15:03:03.057529 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:03.057506 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt" Apr 20 15:03:03.057628 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:03.057530 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt" event={"ID":"9be176a2-ed3f-43c0-9883-1dac2ee5912a","Type":"ContainerDied","Data":"922cfab1651aaad6413f37a0b488cc08e3ec443c54b83e4497b89d88cc57fd8b"} Apr 20 15:03:03.057628 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:03.057553 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="922cfab1651aaad6413f37a0b488cc08e3ec443c54b83e4497b89d88cc57fd8b" Apr 20 15:03:03.184052 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:03.184031 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9" Apr 20 15:03:03.278991 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:03.278952 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90d7cc5c-aebb-43f8-880a-22f7a4c687dd-util\") pod \"90d7cc5c-aebb-43f8-880a-22f7a4c687dd\" (UID: \"90d7cc5c-aebb-43f8-880a-22f7a4c687dd\") " Apr 20 15:03:03.279154 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:03.279021 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h28kj\" (UniqueName: \"kubernetes.io/projected/90d7cc5c-aebb-43f8-880a-22f7a4c687dd-kube-api-access-h28kj\") pod \"90d7cc5c-aebb-43f8-880a-22f7a4c687dd\" (UID: \"90d7cc5c-aebb-43f8-880a-22f7a4c687dd\") " Apr 20 15:03:03.279154 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:03.279080 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90d7cc5c-aebb-43f8-880a-22f7a4c687dd-bundle\") pod 
\"90d7cc5c-aebb-43f8-880a-22f7a4c687dd\" (UID: \"90d7cc5c-aebb-43f8-880a-22f7a4c687dd\") " Apr 20 15:03:03.279713 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:03.279684 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90d7cc5c-aebb-43f8-880a-22f7a4c687dd-bundle" (OuterVolumeSpecName: "bundle") pod "90d7cc5c-aebb-43f8-880a-22f7a4c687dd" (UID: "90d7cc5c-aebb-43f8-880a-22f7a4c687dd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:03:03.281050 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:03.281023 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90d7cc5c-aebb-43f8-880a-22f7a4c687dd-kube-api-access-h28kj" (OuterVolumeSpecName: "kube-api-access-h28kj") pod "90d7cc5c-aebb-43f8-880a-22f7a4c687dd" (UID: "90d7cc5c-aebb-43f8-880a-22f7a4c687dd"). InnerVolumeSpecName "kube-api-access-h28kj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:03:03.283542 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:03.283505 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90d7cc5c-aebb-43f8-880a-22f7a4c687dd-util" (OuterVolumeSpecName: "util") pod "90d7cc5c-aebb-43f8-880a-22f7a4c687dd" (UID: "90d7cc5c-aebb-43f8-880a-22f7a4c687dd"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:03:03.379524 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:03.379500 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/90d7cc5c-aebb-43f8-880a-22f7a4c687dd-bundle\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:03:03.379524 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:03.379522 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/90d7cc5c-aebb-43f8-880a-22f7a4c687dd-util\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:03:03.379648 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:03.379532 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h28kj\" (UniqueName: \"kubernetes.io/projected/90d7cc5c-aebb-43f8-880a-22f7a4c687dd-kube-api-access-h28kj\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:03:04.064717 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:04.064680 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9" event={"ID":"90d7cc5c-aebb-43f8-880a-22f7a4c687dd","Type":"ContainerDied","Data":"ef48586c7f2a83f267964ca01f39956f6503b70364e8beac1b729c702993bbf6"} Apr 20 15:03:04.064717 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:04.064711 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9" Apr 20 15:03:04.065229 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:04.064717 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef48586c7f2a83f267964ca01f39956f6503b70364e8beac1b729c702993bbf6" Apr 20 15:03:04.066794 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:04.066767 2574 generic.go:358] "Generic (PLEG): container finished" podID="5996cc60-3e9b-4037-bd19-59b04fe500a8" containerID="4fecdcc2c3e6cc5a7b3aac95086bf334fe1c8d03e3fbcfac81256e02e263d342" exitCode=0 Apr 20 15:03:04.066951 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:04.066868 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn" event={"ID":"5996cc60-3e9b-4037-bd19-59b04fe500a8","Type":"ContainerDied","Data":"4fecdcc2c3e6cc5a7b3aac95086bf334fe1c8d03e3fbcfac81256e02e263d342"} Apr 20 15:03:04.187248 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:04.187230 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj" Apr 20 15:03:04.286187 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:04.286158 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6hvg\" (UniqueName: \"kubernetes.io/projected/c5c2466d-2df4-4d90-9496-8107320cda01-kube-api-access-q6hvg\") pod \"c5c2466d-2df4-4d90-9496-8107320cda01\" (UID: \"c5c2466d-2df4-4d90-9496-8107320cda01\") " Apr 20 15:03:04.286346 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:04.286200 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5c2466d-2df4-4d90-9496-8107320cda01-bundle\") pod \"c5c2466d-2df4-4d90-9496-8107320cda01\" (UID: \"c5c2466d-2df4-4d90-9496-8107320cda01\") " Apr 20 15:03:04.286346 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:04.286227 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5c2466d-2df4-4d90-9496-8107320cda01-util\") pod \"c5c2466d-2df4-4d90-9496-8107320cda01\" (UID: \"c5c2466d-2df4-4d90-9496-8107320cda01\") " Apr 20 15:03:04.286675 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:04.286640 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5c2466d-2df4-4d90-9496-8107320cda01-bundle" (OuterVolumeSpecName: "bundle") pod "c5c2466d-2df4-4d90-9496-8107320cda01" (UID: "c5c2466d-2df4-4d90-9496-8107320cda01"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:03:04.288305 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:04.288278 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5c2466d-2df4-4d90-9496-8107320cda01-kube-api-access-q6hvg" (OuterVolumeSpecName: "kube-api-access-q6hvg") pod "c5c2466d-2df4-4d90-9496-8107320cda01" (UID: "c5c2466d-2df4-4d90-9496-8107320cda01"). InnerVolumeSpecName "kube-api-access-q6hvg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:03:04.291247 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:04.291219 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5c2466d-2df4-4d90-9496-8107320cda01-util" (OuterVolumeSpecName: "util") pod "c5c2466d-2df4-4d90-9496-8107320cda01" (UID: "c5c2466d-2df4-4d90-9496-8107320cda01"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:03:04.386986 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:04.386934 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q6hvg\" (UniqueName: \"kubernetes.io/projected/c5c2466d-2df4-4d90-9496-8107320cda01-kube-api-access-q6hvg\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:03:04.386986 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:04.386955 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c5c2466d-2df4-4d90-9496-8107320cda01-bundle\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:03:04.386986 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:04.386966 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c5c2466d-2df4-4d90-9496-8107320cda01-util\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:03:05.071411 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:05.071383 2574 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj" Apr 20 15:03:05.071411 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:05.071396 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj" event={"ID":"c5c2466d-2df4-4d90-9496-8107320cda01","Type":"ContainerDied","Data":"8bf50ea2388d9250a5e2a9a4bdd137ca9eb05d118e357bf6fe84e7438eaa8e46"} Apr 20 15:03:05.071801 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:05.071425 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bf50ea2388d9250a5e2a9a4bdd137ca9eb05d118e357bf6fe84e7438eaa8e46" Apr 20 15:03:05.194940 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:05.194914 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn" Apr 20 15:03:05.293948 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:05.293923 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5996cc60-3e9b-4037-bd19-59b04fe500a8-util\") pod \"5996cc60-3e9b-4037-bd19-59b04fe500a8\" (UID: \"5996cc60-3e9b-4037-bd19-59b04fe500a8\") " Apr 20 15:03:05.294125 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:05.293977 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5996cc60-3e9b-4037-bd19-59b04fe500a8-bundle\") pod \"5996cc60-3e9b-4037-bd19-59b04fe500a8\" (UID: \"5996cc60-3e9b-4037-bd19-59b04fe500a8\") " Apr 20 15:03:05.294125 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:05.294003 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7vr6\" (UniqueName: \"kubernetes.io/projected/5996cc60-3e9b-4037-bd19-59b04fe500a8-kube-api-access-c7vr6\") pod 
\"5996cc60-3e9b-4037-bd19-59b04fe500a8\" (UID: \"5996cc60-3e9b-4037-bd19-59b04fe500a8\") " Apr 20 15:03:05.294535 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:05.294511 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5996cc60-3e9b-4037-bd19-59b04fe500a8-bundle" (OuterVolumeSpecName: "bundle") pod "5996cc60-3e9b-4037-bd19-59b04fe500a8" (UID: "5996cc60-3e9b-4037-bd19-59b04fe500a8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:03:05.296091 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:05.296069 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5996cc60-3e9b-4037-bd19-59b04fe500a8-kube-api-access-c7vr6" (OuterVolumeSpecName: "kube-api-access-c7vr6") pod "5996cc60-3e9b-4037-bd19-59b04fe500a8" (UID: "5996cc60-3e9b-4037-bd19-59b04fe500a8"). InnerVolumeSpecName "kube-api-access-c7vr6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:03:05.299257 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:05.299233 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5996cc60-3e9b-4037-bd19-59b04fe500a8-util" (OuterVolumeSpecName: "util") pod "5996cc60-3e9b-4037-bd19-59b04fe500a8" (UID: "5996cc60-3e9b-4037-bd19-59b04fe500a8"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:03:05.395358 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:05.395305 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5996cc60-3e9b-4037-bd19-59b04fe500a8-util\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:03:05.395358 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:05.395325 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5996cc60-3e9b-4037-bd19-59b04fe500a8-bundle\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:03:05.395358 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:05.395336 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c7vr6\" (UniqueName: \"kubernetes.io/projected/5996cc60-3e9b-4037-bd19-59b04fe500a8-kube-api-access-c7vr6\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:03:06.077358 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:06.077329 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn" Apr 20 15:03:06.077358 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:06.077337 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn" event={"ID":"5996cc60-3e9b-4037-bd19-59b04fe500a8","Type":"ContainerDied","Data":"66d42d2b0822f998edba2b2dba195958ecb7dfccbac5199358458df035471315"} Apr 20 15:03:06.077358 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:06.077364 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66d42d2b0822f998edba2b2dba195958ecb7dfccbac5199358458df035471315" Apr 20 15:03:16.844469 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.844433 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f48876877-7zw74"] Apr 20 15:03:16.844979 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.844866 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5996cc60-3e9b-4037-bd19-59b04fe500a8" containerName="util" Apr 20 15:03:16.844979 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.844885 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5996cc60-3e9b-4037-bd19-59b04fe500a8" containerName="util" Apr 20 15:03:16.844979 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.844897 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90d7cc5c-aebb-43f8-880a-22f7a4c687dd" containerName="extract" Apr 20 15:03:16.844979 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.844906 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d7cc5c-aebb-43f8-880a-22f7a4c687dd" containerName="extract" Apr 20 15:03:16.844979 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.844914 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5996cc60-3e9b-4037-bd19-59b04fe500a8" containerName="extract" Apr 
20 15:03:16.844979 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.844922 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5996cc60-3e9b-4037-bd19-59b04fe500a8" containerName="extract" Apr 20 15:03:16.844979 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.844937 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9be176a2-ed3f-43c0-9883-1dac2ee5912a" containerName="pull" Apr 20 15:03:16.844979 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.844944 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9be176a2-ed3f-43c0-9883-1dac2ee5912a" containerName="pull" Apr 20 15:03:16.844979 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.844958 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9be176a2-ed3f-43c0-9883-1dac2ee5912a" containerName="extract" Apr 20 15:03:16.844979 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.844965 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9be176a2-ed3f-43c0-9883-1dac2ee5912a" containerName="extract" Apr 20 15:03:16.844979 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.844984 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5c2466d-2df4-4d90-9496-8107320cda01" containerName="pull" Apr 20 15:03:16.845508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.844992 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c2466d-2df4-4d90-9496-8107320cda01" containerName="pull" Apr 20 15:03:16.845508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.845009 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5c2466d-2df4-4d90-9496-8107320cda01" containerName="util" Apr 20 15:03:16.845508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.845018 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c2466d-2df4-4d90-9496-8107320cda01" containerName="util" Apr 20 15:03:16.845508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.845027 2574 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5996cc60-3e9b-4037-bd19-59b04fe500a8" containerName="pull" Apr 20 15:03:16.845508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.845034 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5996cc60-3e9b-4037-bd19-59b04fe500a8" containerName="pull" Apr 20 15:03:16.845508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.845044 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5c2466d-2df4-4d90-9496-8107320cda01" containerName="extract" Apr 20 15:03:16.845508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.845051 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5c2466d-2df4-4d90-9496-8107320cda01" containerName="extract" Apr 20 15:03:16.845508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.845059 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90d7cc5c-aebb-43f8-880a-22f7a4c687dd" containerName="util" Apr 20 15:03:16.845508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.845067 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d7cc5c-aebb-43f8-880a-22f7a4c687dd" containerName="util" Apr 20 15:03:16.845508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.845080 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90d7cc5c-aebb-43f8-880a-22f7a4c687dd" containerName="pull" Apr 20 15:03:16.845508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.845089 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d7cc5c-aebb-43f8-880a-22f7a4c687dd" containerName="pull" Apr 20 15:03:16.845508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.845099 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9be176a2-ed3f-43c0-9883-1dac2ee5912a" containerName="util" Apr 20 15:03:16.845508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.845107 2574 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9be176a2-ed3f-43c0-9883-1dac2ee5912a" containerName="util" Apr 20 15:03:16.845508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.845186 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="90d7cc5c-aebb-43f8-880a-22f7a4c687dd" containerName="extract" Apr 20 15:03:16.845508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.845199 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9be176a2-ed3f-43c0-9883-1dac2ee5912a" containerName="extract" Apr 20 15:03:16.845508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.845212 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5c2466d-2df4-4d90-9496-8107320cda01" containerName="extract" Apr 20 15:03:16.845508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.845223 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="5996cc60-3e9b-4037-bd19-59b04fe500a8" containerName="extract" Apr 20 15:03:16.848276 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.848255 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.860966 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.860944 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f48876877-7zw74"] Apr 20 15:03:16.884899 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.884867 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-oauth-serving-cert\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.885026 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.884911 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-trusted-ca-bundle\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.885026 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.884938 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-console-oauth-config\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.885026 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.884978 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-service-ca\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 
15:03:16.885128 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.885056 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-console-serving-cert\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.885128 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.885085 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-console-config\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.885128 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.885107 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpv4j\" (UniqueName: \"kubernetes.io/projected/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-kube-api-access-jpv4j\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.985822 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.985788 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-console-serving-cert\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.985976 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.985830 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-console-config\") pod 
\"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.985976 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.985851 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jpv4j\" (UniqueName: \"kubernetes.io/projected/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-kube-api-access-jpv4j\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.985976 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.985882 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-oauth-serving-cert\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.985976 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.985903 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-trusted-ca-bundle\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.985976 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.985924 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-console-oauth-config\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.985976 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.985945 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-service-ca\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.986634 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.986609 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-service-ca\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.986634 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.986626 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-console-config\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.986823 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.986800 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-trusted-ca-bundle\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.986898 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.986800 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-oauth-serving-cert\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.988453 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.988426 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-console-serving-cert\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.988453 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.988447 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-console-oauth-config\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:16.998612 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:16.998594 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpv4j\" (UniqueName: \"kubernetes.io/projected/36d89a71-6bdc-4d23-b6a4-fe4e445e5939-kube-api-access-jpv4j\") pod \"console-6f48876877-7zw74\" (UID: \"36d89a71-6bdc-4d23-b6a4-fe4e445e5939\") " pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:17.158850 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:17.158773 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:17.310736 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:17.310711 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f48876877-7zw74"] Apr 20 15:03:17.313346 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:03:17.313316 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36d89a71_6bdc_4d23_b6a4_fe4e445e5939.slice/crio-cd40e06fa3e01833018e4fc892744f87826ee7d69c7e94e07da626e44d0ce644 WatchSource:0}: Error finding container cd40e06fa3e01833018e4fc892744f87826ee7d69c7e94e07da626e44d0ce644: Status 404 returned error can't find the container with id cd40e06fa3e01833018e4fc892744f87826ee7d69c7e94e07da626e44d0ce644 Apr 20 15:03:18.120596 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:18.120561 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f48876877-7zw74" event={"ID":"36d89a71-6bdc-4d23-b6a4-fe4e445e5939","Type":"ContainerStarted","Data":"c02d56ecd1f371849ee309167e9acceb9b2e8e54069f679656f29b122184ca06"} Apr 20 15:03:18.120596 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:18.120601 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f48876877-7zw74" event={"ID":"36d89a71-6bdc-4d23-b6a4-fe4e445e5939","Type":"ContainerStarted","Data":"cd40e06fa3e01833018e4fc892744f87826ee7d69c7e94e07da626e44d0ce644"} Apr 20 15:03:18.143911 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:18.143855 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f48876877-7zw74" podStartSLOduration=2.143840503 podStartE2EDuration="2.143840503s" podCreationTimestamp="2026-04-20 15:03:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:03:18.141971605 +0000 UTC 
m=+477.215247231" watchObservedRunningTime="2026-04-20 15:03:18.143840503 +0000 UTC m=+477.217116128" Apr 20 15:03:27.158872 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:27.158843 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:27.158872 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:27.158879 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:27.163574 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:27.163548 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:28.160102 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:28.160074 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f48876877-7zw74" Apr 20 15:03:28.213920 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:28.213885 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-844794986-75jsk"] Apr 20 15:03:31.321714 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:31.321683 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g7g7l"] Apr 20 15:03:31.326069 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:31.326051 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g7g7l" Apr 20 15:03:31.329161 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:31.329138 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 20 15:03:31.329517 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:31.329148 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 15:03:31.329922 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:31.329904 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-jwvf8\"" Apr 20 15:03:31.330341 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:31.330308 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 15:03:31.330777 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:31.330760 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 20 15:03:31.335774 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:31.335752 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g7g7l"] Apr 20 15:03:31.407782 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:31.407759 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksj8l\" (UniqueName: \"kubernetes.io/projected/fb55e236-55df-4e39-b642-53b0bc9d710c-kube-api-access-ksj8l\") pod \"kuadrant-console-plugin-6cb54b5c86-g7g7l\" (UID: \"fb55e236-55df-4e39-b642-53b0bc9d710c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g7g7l" Apr 20 15:03:31.407902 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:31.407796 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb55e236-55df-4e39-b642-53b0bc9d710c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-g7g7l\" (UID: \"fb55e236-55df-4e39-b642-53b0bc9d710c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g7g7l" Apr 20 15:03:31.407902 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:31.407828 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fb55e236-55df-4e39-b642-53b0bc9d710c-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-g7g7l\" (UID: \"fb55e236-55df-4e39-b642-53b0bc9d710c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g7g7l" Apr 20 15:03:31.509112 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:31.509078 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fb55e236-55df-4e39-b642-53b0bc9d710c-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-g7g7l\" (UID: \"fb55e236-55df-4e39-b642-53b0bc9d710c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g7g7l" Apr 20 15:03:31.509298 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:31.509276 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ksj8l\" (UniqueName: \"kubernetes.io/projected/fb55e236-55df-4e39-b642-53b0bc9d710c-kube-api-access-ksj8l\") pod \"kuadrant-console-plugin-6cb54b5c86-g7g7l\" (UID: \"fb55e236-55df-4e39-b642-53b0bc9d710c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g7g7l" Apr 20 15:03:31.509421 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:31.509325 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb55e236-55df-4e39-b642-53b0bc9d710c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-g7g7l\" (UID: \"fb55e236-55df-4e39-b642-53b0bc9d710c\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g7g7l" Apr 20 15:03:31.509753 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:31.509733 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/fb55e236-55df-4e39-b642-53b0bc9d710c-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-g7g7l\" (UID: \"fb55e236-55df-4e39-b642-53b0bc9d710c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g7g7l" Apr 20 15:03:31.511636 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:31.511610 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/fb55e236-55df-4e39-b642-53b0bc9d710c-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-g7g7l\" (UID: \"fb55e236-55df-4e39-b642-53b0bc9d710c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g7g7l" Apr 20 15:03:31.517037 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:31.517008 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksj8l\" (UniqueName: \"kubernetes.io/projected/fb55e236-55df-4e39-b642-53b0bc9d710c-kube-api-access-ksj8l\") pod \"kuadrant-console-plugin-6cb54b5c86-g7g7l\" (UID: \"fb55e236-55df-4e39-b642-53b0bc9d710c\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g7g7l" Apr 20 15:03:31.641304 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:31.641228 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g7g7l" Apr 20 15:03:31.758364 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:31.758307 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g7g7l"] Apr 20 15:03:31.760746 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:03:31.760719 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb55e236_55df_4e39_b642_53b0bc9d710c.slice/crio-edb48e0d7fb5164d5e0e32e08d26ce2e1ca569704550f6239b3ec119173ccdaa WatchSource:0}: Error finding container edb48e0d7fb5164d5e0e32e08d26ce2e1ca569704550f6239b3ec119173ccdaa: Status 404 returned error can't find the container with id edb48e0d7fb5164d5e0e32e08d26ce2e1ca569704550f6239b3ec119173ccdaa Apr 20 15:03:32.171518 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:32.171480 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g7g7l" event={"ID":"fb55e236-55df-4e39-b642-53b0bc9d710c","Type":"ContainerStarted","Data":"edb48e0d7fb5164d5e0e32e08d26ce2e1ca569704550f6239b3ec119173ccdaa"} Apr 20 15:03:53.233666 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:53.233592 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-844794986-75jsk" podUID="a549cdfd-6a71-4221-aab3-acad67d5bbc0" containerName="console" containerID="cri-o://df658dd3cbb2caf69837ed795c483f9923af8c92f3593007e7bc157e5103e507" gracePeriod=15 Apr 20 15:03:54.407940 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.407924 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-844794986-75jsk_a549cdfd-6a71-4221-aab3-acad67d5bbc0/console/0.log" Apr 20 15:03:54.408226 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.407983 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-844794986-75jsk" Apr 20 15:03:54.410239 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.410217 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-trusted-ca-bundle\") pod \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " Apr 20 15:03:54.410352 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.410243 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gqkg\" (UniqueName: \"kubernetes.io/projected/a549cdfd-6a71-4221-aab3-acad67d5bbc0-kube-api-access-7gqkg\") pod \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " Apr 20 15:03:54.410352 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.410267 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-console-config\") pod \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " Apr 20 15:03:54.410499 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.410399 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a549cdfd-6a71-4221-aab3-acad67d5bbc0-console-oauth-config\") pod \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " Apr 20 15:03:54.410499 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.410440 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a549cdfd-6a71-4221-aab3-acad67d5bbc0-console-serving-cert\") pod \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " Apr 20 15:03:54.410499 
ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.410488 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-service-ca\") pod \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " Apr 20 15:03:54.410653 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.410513 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-oauth-serving-cert\") pod \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\" (UID: \"a549cdfd-6a71-4221-aab3-acad67d5bbc0\") " Apr 20 15:03:54.410717 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.410644 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-console-config" (OuterVolumeSpecName: "console-config") pod "a549cdfd-6a71-4221-aab3-acad67d5bbc0" (UID: "a549cdfd-6a71-4221-aab3-acad67d5bbc0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:03:54.410717 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.410653 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a549cdfd-6a71-4221-aab3-acad67d5bbc0" (UID: "a549cdfd-6a71-4221-aab3-acad67d5bbc0"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:03:54.410829 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.410789 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-trusted-ca-bundle\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:03:54.410829 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.410807 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-console-config\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:03:54.411105 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.411063 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-service-ca" (OuterVolumeSpecName: "service-ca") pod "a549cdfd-6a71-4221-aab3-acad67d5bbc0" (UID: "a549cdfd-6a71-4221-aab3-acad67d5bbc0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:03:54.411225 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.411153 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a549cdfd-6a71-4221-aab3-acad67d5bbc0" (UID: "a549cdfd-6a71-4221-aab3-acad67d5bbc0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:03:54.412634 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.412597 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a549cdfd-6a71-4221-aab3-acad67d5bbc0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a549cdfd-6a71-4221-aab3-acad67d5bbc0" (UID: "a549cdfd-6a71-4221-aab3-acad67d5bbc0"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:03:54.412817 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.412796 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a549cdfd-6a71-4221-aab3-acad67d5bbc0-kube-api-access-7gqkg" (OuterVolumeSpecName: "kube-api-access-7gqkg") pod "a549cdfd-6a71-4221-aab3-acad67d5bbc0" (UID: "a549cdfd-6a71-4221-aab3-acad67d5bbc0"). InnerVolumeSpecName "kube-api-access-7gqkg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:03:54.412915 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.412887 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a549cdfd-6a71-4221-aab3-acad67d5bbc0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a549cdfd-6a71-4221-aab3-acad67d5bbc0" (UID: "a549cdfd-6a71-4221-aab3-acad67d5bbc0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:03:54.511757 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.511728 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a549cdfd-6a71-4221-aab3-acad67d5bbc0-console-oauth-config\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:03:54.511757 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.511753 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a549cdfd-6a71-4221-aab3-acad67d5bbc0-console-serving-cert\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:03:54.511757 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.511762 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-service-ca\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" 
Apr 20 15:03:54.511961 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.511774 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a549cdfd-6a71-4221-aab3-acad67d5bbc0-oauth-serving-cert\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:03:54.511961 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:54.511789 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7gqkg\" (UniqueName: \"kubernetes.io/projected/a549cdfd-6a71-4221-aab3-acad67d5bbc0-kube-api-access-7gqkg\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:03:55.263219 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:55.263192 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-844794986-75jsk_a549cdfd-6a71-4221-aab3-acad67d5bbc0/console/0.log" Apr 20 15:03:55.263419 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:55.263232 2574 generic.go:358] "Generic (PLEG): container finished" podID="a549cdfd-6a71-4221-aab3-acad67d5bbc0" containerID="df658dd3cbb2caf69837ed795c483f9923af8c92f3593007e7bc157e5103e507" exitCode=2 Apr 20 15:03:55.263419 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:55.263307 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-844794986-75jsk"
Apr 20 15:03:55.263419 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:55.263336 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-844794986-75jsk" event={"ID":"a549cdfd-6a71-4221-aab3-acad67d5bbc0","Type":"ContainerDied","Data":"df658dd3cbb2caf69837ed795c483f9923af8c92f3593007e7bc157e5103e507"}
Apr 20 15:03:55.263419 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:55.263397 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-844794986-75jsk" event={"ID":"a549cdfd-6a71-4221-aab3-acad67d5bbc0","Type":"ContainerDied","Data":"f0e6693a2dcf06370096d5bd8df7d87e62537219779814d5ebdcc3ac9dbe05ae"}
Apr 20 15:03:55.263419 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:55.263418 2574 scope.go:117] "RemoveContainer" containerID="df658dd3cbb2caf69837ed795c483f9923af8c92f3593007e7bc157e5103e507"
Apr 20 15:03:55.264757 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:55.264727 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g7g7l" event={"ID":"fb55e236-55df-4e39-b642-53b0bc9d710c","Type":"ContainerStarted","Data":"0768d3ffeaadff92fd4131e4aa0f1e8295aafc15c41b7097a717f3332ac634ba"}
Apr 20 15:03:55.278542 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:55.278525 2574 scope.go:117] "RemoveContainer" containerID="df658dd3cbb2caf69837ed795c483f9923af8c92f3593007e7bc157e5103e507"
Apr 20 15:03:55.278791 ip-10-0-130-249 kubenswrapper[2574]: E0420 15:03:55.278769 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df658dd3cbb2caf69837ed795c483f9923af8c92f3593007e7bc157e5103e507\": container with ID starting with df658dd3cbb2caf69837ed795c483f9923af8c92f3593007e7bc157e5103e507 not found: ID does not exist" containerID="df658dd3cbb2caf69837ed795c483f9923af8c92f3593007e7bc157e5103e507"
Apr 20 15:03:55.278880 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:55.278795 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df658dd3cbb2caf69837ed795c483f9923af8c92f3593007e7bc157e5103e507"} err="failed to get container status \"df658dd3cbb2caf69837ed795c483f9923af8c92f3593007e7bc157e5103e507\": rpc error: code = NotFound desc = could not find container \"df658dd3cbb2caf69837ed795c483f9923af8c92f3593007e7bc157e5103e507\": container with ID starting with df658dd3cbb2caf69837ed795c483f9923af8c92f3593007e7bc157e5103e507 not found: ID does not exist"
Apr 20 15:03:55.290801 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:55.290764 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-g7g7l" podStartSLOduration=1.7269699360000001 podStartE2EDuration="24.290752942s" podCreationTimestamp="2026-04-20 15:03:31 +0000 UTC" firstStartedPulling="2026-04-20 15:03:31.762407138 +0000 UTC m=+490.835682741" lastFinishedPulling="2026-04-20 15:03:54.326190144 +0000 UTC m=+513.399465747" observedRunningTime="2026-04-20 15:03:55.290384332 +0000 UTC m=+514.363659951" watchObservedRunningTime="2026-04-20 15:03:55.290752942 +0000 UTC m=+514.364028567"
Apr 20 15:03:55.312101 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:55.312079 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-844794986-75jsk"]
Apr 20 15:03:55.319557 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:55.319535 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-844794986-75jsk"]
Apr 20 15:03:55.453032 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:03:55.452998 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a549cdfd-6a71-4221-aab3-acad67d5bbc0" path="/var/lib/kubelet/pods/a549cdfd-6a71-4221-aab3-acad67d5bbc0/volumes"
Apr 20 15:04:14.439427 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:14.439393 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4pmtj"]
Apr 20 15:04:14.439848 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:14.439688 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a549cdfd-6a71-4221-aab3-acad67d5bbc0" containerName="console"
Apr 20 15:04:14.439848 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:14.439698 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="a549cdfd-6a71-4221-aab3-acad67d5bbc0" containerName="console"
Apr 20 15:04:14.439848 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:14.439759 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="a549cdfd-6a71-4221-aab3-acad67d5bbc0" containerName="console"
Apr 20 15:04:14.441876 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:14.441847 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-4pmtj"
Apr 20 15:04:14.445734 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:14.445711 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-x6mfj\""
Apr 20 15:04:14.455567 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:14.455544 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4pmtj"]
Apr 20 15:04:14.470417 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:14.470361 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk6h2\" (UniqueName: \"kubernetes.io/projected/3f542828-1df7-4000-9b2a-2834da69e304-kube-api-access-rk6h2\") pod \"authorino-f99f4b5cd-4pmtj\" (UID: \"3f542828-1df7-4000-9b2a-2834da69e304\") " pod="kuadrant-system/authorino-f99f4b5cd-4pmtj"
Apr 20 15:04:14.571722 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:14.571689 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rk6h2\" (UniqueName: \"kubernetes.io/projected/3f542828-1df7-4000-9b2a-2834da69e304-kube-api-access-rk6h2\") pod \"authorino-f99f4b5cd-4pmtj\" (UID: \"3f542828-1df7-4000-9b2a-2834da69e304\") " pod="kuadrant-system/authorino-f99f4b5cd-4pmtj"
Apr 20 15:04:14.583000 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:14.582972 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk6h2\" (UniqueName: \"kubernetes.io/projected/3f542828-1df7-4000-9b2a-2834da69e304-kube-api-access-rk6h2\") pod \"authorino-f99f4b5cd-4pmtj\" (UID: \"3f542828-1df7-4000-9b2a-2834da69e304\") " pod="kuadrant-system/authorino-f99f4b5cd-4pmtj"
Apr 20 15:04:14.591767 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:14.591744 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-wfdxh"]
Apr 20 15:04:14.593877 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:14.593862 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-wfdxh"
Apr 20 15:04:14.599257 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:14.599237 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-wfdxh"]
Apr 20 15:04:14.672514 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:14.672488 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs4md\" (UniqueName: \"kubernetes.io/projected/782781bb-98bf-420a-8d4e-b5704d0a3f09-kube-api-access-gs4md\") pod \"authorino-7498df8756-wfdxh\" (UID: \"782781bb-98bf-420a-8d4e-b5704d0a3f09\") " pod="kuadrant-system/authorino-7498df8756-wfdxh"
Apr 20 15:04:14.752273 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:14.752207 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-4pmtj"
Apr 20 15:04:14.773319 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:14.773294 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gs4md\" (UniqueName: \"kubernetes.io/projected/782781bb-98bf-420a-8d4e-b5704d0a3f09-kube-api-access-gs4md\") pod \"authorino-7498df8756-wfdxh\" (UID: \"782781bb-98bf-420a-8d4e-b5704d0a3f09\") " pod="kuadrant-system/authorino-7498df8756-wfdxh"
Apr 20 15:04:14.781475 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:14.781452 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs4md\" (UniqueName: \"kubernetes.io/projected/782781bb-98bf-420a-8d4e-b5704d0a3f09-kube-api-access-gs4md\") pod \"authorino-7498df8756-wfdxh\" (UID: \"782781bb-98bf-420a-8d4e-b5704d0a3f09\") " pod="kuadrant-system/authorino-7498df8756-wfdxh"
Apr 20 15:04:14.869693 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:14.869661 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4pmtj"]
Apr 20 15:04:14.872170 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:04:14.872141 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f542828_1df7_4000_9b2a_2834da69e304.slice/crio-a87a636955e5383c47b3cfddc877323ba32ac90e3ee2d736bd1faae92b7b9422 WatchSource:0}: Error finding container a87a636955e5383c47b3cfddc877323ba32ac90e3ee2d736bd1faae92b7b9422: Status 404 returned error can't find the container with id a87a636955e5383c47b3cfddc877323ba32ac90e3ee2d736bd1faae92b7b9422
Apr 20 15:04:14.904616 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:14.904593 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-wfdxh"
Apr 20 15:04:15.019123 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:15.019096 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-wfdxh"]
Apr 20 15:04:15.020763 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:04:15.020736 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod782781bb_98bf_420a_8d4e_b5704d0a3f09.slice/crio-870278c27ab7806d9b436ce49e7a4630c97036f6571ad7d1eebd822a2db0a77c WatchSource:0}: Error finding container 870278c27ab7806d9b436ce49e7a4630c97036f6571ad7d1eebd822a2db0a77c: Status 404 returned error can't find the container with id 870278c27ab7806d9b436ce49e7a4630c97036f6571ad7d1eebd822a2db0a77c
Apr 20 15:04:15.334548 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:15.334514 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-4pmtj" event={"ID":"3f542828-1df7-4000-9b2a-2834da69e304","Type":"ContainerStarted","Data":"a87a636955e5383c47b3cfddc877323ba32ac90e3ee2d736bd1faae92b7b9422"}
Apr 20 15:04:15.335505 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:15.335482 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-wfdxh" event={"ID":"782781bb-98bf-420a-8d4e-b5704d0a3f09","Type":"ContainerStarted","Data":"870278c27ab7806d9b436ce49e7a4630c97036f6571ad7d1eebd822a2db0a77c"}
Apr 20 15:04:18.350891 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:18.350850 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-4pmtj" event={"ID":"3f542828-1df7-4000-9b2a-2834da69e304","Type":"ContainerStarted","Data":"f3e00f1defdf728333a480cbf341cb3067592bba3bb58984cec35bdf3902db60"}
Apr 20 15:04:18.352142 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:18.352115 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-wfdxh" event={"ID":"782781bb-98bf-420a-8d4e-b5704d0a3f09","Type":"ContainerStarted","Data":"8032dae31f6791241fb98813319a94ba904990ee5752f743838c9d14971ff82d"}
Apr 20 15:04:18.367242 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:18.367197 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-4pmtj" podStartSLOduration=1.808458278 podStartE2EDuration="4.367185188s" podCreationTimestamp="2026-04-20 15:04:14 +0000 UTC" firstStartedPulling="2026-04-20 15:04:14.873505475 +0000 UTC m=+533.946781078" lastFinishedPulling="2026-04-20 15:04:17.432232376 +0000 UTC m=+536.505507988" observedRunningTime="2026-04-20 15:04:18.366658014 +0000 UTC m=+537.439933641" watchObservedRunningTime="2026-04-20 15:04:18.367185188 +0000 UTC m=+537.440460814"
Apr 20 15:04:18.382843 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:18.382798 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-wfdxh" podStartSLOduration=1.9616252809999999 podStartE2EDuration="4.382786804s" podCreationTimestamp="2026-04-20 15:04:14 +0000 UTC" firstStartedPulling="2026-04-20 15:04:15.022025453 +0000 UTC m=+534.095301055" lastFinishedPulling="2026-04-20 15:04:17.443186962 +0000 UTC m=+536.516462578" observedRunningTime="2026-04-20 15:04:18.381425822 +0000 UTC m=+537.454701446" watchObservedRunningTime="2026-04-20 15:04:18.382786804 +0000 UTC m=+537.456062429"
Apr 20 15:04:18.405748 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:18.405723 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4pmtj"]
Apr 20 15:04:20.360811 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:20.360770 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-4pmtj" podUID="3f542828-1df7-4000-9b2a-2834da69e304" containerName="authorino" containerID="cri-o://f3e00f1defdf728333a480cbf341cb3067592bba3bb58984cec35bdf3902db60" gracePeriod=30
Apr 20 15:04:20.593017 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:20.592989 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-4pmtj"
Apr 20 15:04:20.726963 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:20.726871 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk6h2\" (UniqueName: \"kubernetes.io/projected/3f542828-1df7-4000-9b2a-2834da69e304-kube-api-access-rk6h2\") pod \"3f542828-1df7-4000-9b2a-2834da69e304\" (UID: \"3f542828-1df7-4000-9b2a-2834da69e304\") "
Apr 20 15:04:20.728888 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:20.728866 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f542828-1df7-4000-9b2a-2834da69e304-kube-api-access-rk6h2" (OuterVolumeSpecName: "kube-api-access-rk6h2") pod "3f542828-1df7-4000-9b2a-2834da69e304" (UID: "3f542828-1df7-4000-9b2a-2834da69e304"). InnerVolumeSpecName "kube-api-access-rk6h2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 15:04:20.828052 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:20.828027 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rk6h2\" (UniqueName: \"kubernetes.io/projected/3f542828-1df7-4000-9b2a-2834da69e304-kube-api-access-rk6h2\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\""
Apr 20 15:04:21.365347 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:21.365316 2574 generic.go:358] "Generic (PLEG): container finished" podID="3f542828-1df7-4000-9b2a-2834da69e304" containerID="f3e00f1defdf728333a480cbf341cb3067592bba3bb58984cec35bdf3902db60" exitCode=0
Apr 20 15:04:21.365790 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:21.365363 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-4pmtj"
Apr 20 15:04:21.365790 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:21.365398 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-4pmtj" event={"ID":"3f542828-1df7-4000-9b2a-2834da69e304","Type":"ContainerDied","Data":"f3e00f1defdf728333a480cbf341cb3067592bba3bb58984cec35bdf3902db60"}
Apr 20 15:04:21.365790 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:21.365438 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-4pmtj" event={"ID":"3f542828-1df7-4000-9b2a-2834da69e304","Type":"ContainerDied","Data":"a87a636955e5383c47b3cfddc877323ba32ac90e3ee2d736bd1faae92b7b9422"}
Apr 20 15:04:21.365790 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:21.365462 2574 scope.go:117] "RemoveContainer" containerID="f3e00f1defdf728333a480cbf341cb3067592bba3bb58984cec35bdf3902db60"
Apr 20 15:04:21.374219 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:21.374203 2574 scope.go:117] "RemoveContainer" containerID="f3e00f1defdf728333a480cbf341cb3067592bba3bb58984cec35bdf3902db60"
Apr 20 15:04:21.374443 ip-10-0-130-249 kubenswrapper[2574]: E0420 15:04:21.374425 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3e00f1defdf728333a480cbf341cb3067592bba3bb58984cec35bdf3902db60\": container with ID starting with f3e00f1defdf728333a480cbf341cb3067592bba3bb58984cec35bdf3902db60 not found: ID does not exist" containerID="f3e00f1defdf728333a480cbf341cb3067592bba3bb58984cec35bdf3902db60"
Apr 20 15:04:21.374523 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:21.374450 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3e00f1defdf728333a480cbf341cb3067592bba3bb58984cec35bdf3902db60"} err="failed to get container status \"f3e00f1defdf728333a480cbf341cb3067592bba3bb58984cec35bdf3902db60\": rpc error: code = NotFound desc = could not find container \"f3e00f1defdf728333a480cbf341cb3067592bba3bb58984cec35bdf3902db60\": container with ID starting with f3e00f1defdf728333a480cbf341cb3067592bba3bb58984cec35bdf3902db60 not found: ID does not exist"
Apr 20 15:04:21.386214 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:21.386193 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4pmtj"]
Apr 20 15:04:21.389746 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:21.389727 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4pmtj"]
Apr 20 15:04:21.452430 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:21.452403 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f542828-1df7-4000-9b2a-2834da69e304" path="/var/lib/kubelet/pods/3f542828-1df7-4000-9b2a-2834da69e304/volumes"
Apr 20 15:04:51.740227 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:51.740192 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2"]
Apr 20 15:04:51.740700 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:51.740502 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3f542828-1df7-4000-9b2a-2834da69e304" containerName="authorino"
Apr 20 15:04:51.740700 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:51.740513 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f542828-1df7-4000-9b2a-2834da69e304" containerName="authorino"
Apr 20 15:04:51.740700 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:51.740582 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3f542828-1df7-4000-9b2a-2834da69e304" containerName="authorino"
Apr 20 15:04:51.742781 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:51.742766 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2"
Apr 20 15:04:51.745328 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:51.745297 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-ngpdb\""
Apr 20 15:04:51.745489 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:51.745333 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 20 15:04:51.745489 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:51.745354 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 20 15:04:51.751317 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:51.751290 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2"]
Apr 20 15:04:51.881647 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:51.881615 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4eb94a89-6161-49aa-9afd-b9b25ad03454-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2\" (UID: \"4eb94a89-6161-49aa-9afd-b9b25ad03454\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2"
Apr 20 15:04:51.881816 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:51.881666 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4eb94a89-6161-49aa-9afd-b9b25ad03454-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2\" (UID: \"4eb94a89-6161-49aa-9afd-b9b25ad03454\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2"
Apr 20 15:04:51.881816 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:51.881696 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spnjm\" (UniqueName: \"kubernetes.io/projected/4eb94a89-6161-49aa-9afd-b9b25ad03454-kube-api-access-spnjm\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2\" (UID: \"4eb94a89-6161-49aa-9afd-b9b25ad03454\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2"
Apr 20 15:04:51.982974 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:51.982939 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4eb94a89-6161-49aa-9afd-b9b25ad03454-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2\" (UID: \"4eb94a89-6161-49aa-9afd-b9b25ad03454\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2"
Apr 20 15:04:51.983136 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:51.982994 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4eb94a89-6161-49aa-9afd-b9b25ad03454-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2\" (UID: \"4eb94a89-6161-49aa-9afd-b9b25ad03454\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2"
Apr 20 15:04:51.983136 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:51.983029 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spnjm\" (UniqueName: \"kubernetes.io/projected/4eb94a89-6161-49aa-9afd-b9b25ad03454-kube-api-access-spnjm\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2\" (UID: \"4eb94a89-6161-49aa-9afd-b9b25ad03454\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2"
Apr 20 15:04:51.983340 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:51.983321 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4eb94a89-6161-49aa-9afd-b9b25ad03454-util\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2\" (UID: \"4eb94a89-6161-49aa-9afd-b9b25ad03454\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2"
Apr 20 15:04:51.983395 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:51.983352 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4eb94a89-6161-49aa-9afd-b9b25ad03454-bundle\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2\" (UID: \"4eb94a89-6161-49aa-9afd-b9b25ad03454\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2"
Apr 20 15:04:51.991353 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:51.991294 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spnjm\" (UniqueName: \"kubernetes.io/projected/4eb94a89-6161-49aa-9afd-b9b25ad03454-kube-api-access-spnjm\") pod \"7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2\" (UID: \"4eb94a89-6161-49aa-9afd-b9b25ad03454\") " pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2"
Apr 20 15:04:52.052910 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:52.052873 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2"
Apr 20 15:04:52.177006 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:52.176980 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2"]
Apr 20 15:04:52.179336 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:04:52.179305 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4eb94a89_6161_49aa_9afd_b9b25ad03454.slice/crio-efb71d8afeeacd1729e93a76cb66cec3ac4208093b6b0f16a65394ba91127bb2 WatchSource:0}: Error finding container efb71d8afeeacd1729e93a76cb66cec3ac4208093b6b0f16a65394ba91127bb2: Status 404 returned error can't find the container with id efb71d8afeeacd1729e93a76cb66cec3ac4208093b6b0f16a65394ba91127bb2
Apr 20 15:04:52.475140 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:52.475104 2574 generic.go:358] "Generic (PLEG): container finished" podID="4eb94a89-6161-49aa-9afd-b9b25ad03454" containerID="89f3c00defc9c08ed90f4faf4028c886ed6f8a7683a80f2d8bffddc59ac79cf2" exitCode=0
Apr 20 15:04:52.475306 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:52.475190 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2" event={"ID":"4eb94a89-6161-49aa-9afd-b9b25ad03454","Type":"ContainerDied","Data":"89f3c00defc9c08ed90f4faf4028c886ed6f8a7683a80f2d8bffddc59ac79cf2"}
Apr 20 15:04:52.475306 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:52.475221 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2" event={"ID":"4eb94a89-6161-49aa-9afd-b9b25ad03454","Type":"ContainerStarted","Data":"efb71d8afeeacd1729e93a76cb66cec3ac4208093b6b0f16a65394ba91127bb2"}
Apr 20 15:04:53.480163 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:53.480128 2574 generic.go:358] "Generic (PLEG): container finished" podID="4eb94a89-6161-49aa-9afd-b9b25ad03454" containerID="79a7cad7c80af8394748e9b694fc2cf3009efe504e2c072abacb93851e01300f" exitCode=0
Apr 20 15:04:53.480568 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:53.480217 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2" event={"ID":"4eb94a89-6161-49aa-9afd-b9b25ad03454","Type":"ContainerDied","Data":"79a7cad7c80af8394748e9b694fc2cf3009efe504e2c072abacb93851e01300f"}
Apr 20 15:04:54.486732 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:54.486697 2574 generic.go:358] "Generic (PLEG): container finished" podID="4eb94a89-6161-49aa-9afd-b9b25ad03454" containerID="7336f4afffbb46c4a7d1c20ca72741ca032c5dfe5ff96f651199387a51b60d21" exitCode=0
Apr 20 15:04:54.487166 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:54.486779 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2" event={"ID":"4eb94a89-6161-49aa-9afd-b9b25ad03454","Type":"ContainerDied","Data":"7336f4afffbb46c4a7d1c20ca72741ca032c5dfe5ff96f651199387a51b60d21"}
Apr 20 15:04:55.611260 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:55.611241 2574 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2"
Apr 20 15:04:55.712153 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:55.712123 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spnjm\" (UniqueName: \"kubernetes.io/projected/4eb94a89-6161-49aa-9afd-b9b25ad03454-kube-api-access-spnjm\") pod \"4eb94a89-6161-49aa-9afd-b9b25ad03454\" (UID: \"4eb94a89-6161-49aa-9afd-b9b25ad03454\") "
Apr 20 15:04:55.712288 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:55.712161 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4eb94a89-6161-49aa-9afd-b9b25ad03454-util\") pod \"4eb94a89-6161-49aa-9afd-b9b25ad03454\" (UID: \"4eb94a89-6161-49aa-9afd-b9b25ad03454\") "
Apr 20 15:04:55.712288 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:55.712182 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4eb94a89-6161-49aa-9afd-b9b25ad03454-bundle\") pod \"4eb94a89-6161-49aa-9afd-b9b25ad03454\" (UID: \"4eb94a89-6161-49aa-9afd-b9b25ad03454\") "
Apr 20 15:04:55.712727 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:55.712697 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eb94a89-6161-49aa-9afd-b9b25ad03454-bundle" (OuterVolumeSpecName: "bundle") pod "4eb94a89-6161-49aa-9afd-b9b25ad03454" (UID: "4eb94a89-6161-49aa-9afd-b9b25ad03454"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 15:04:55.714231 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:55.714209 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eb94a89-6161-49aa-9afd-b9b25ad03454-kube-api-access-spnjm" (OuterVolumeSpecName: "kube-api-access-spnjm") pod "4eb94a89-6161-49aa-9afd-b9b25ad03454" (UID: "4eb94a89-6161-49aa-9afd-b9b25ad03454"). InnerVolumeSpecName "kube-api-access-spnjm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 15:04:55.721013 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:55.720988 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eb94a89-6161-49aa-9afd-b9b25ad03454-util" (OuterVolumeSpecName: "util") pod "4eb94a89-6161-49aa-9afd-b9b25ad03454" (UID: "4eb94a89-6161-49aa-9afd-b9b25ad03454"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 15:04:55.813459 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:55.813434 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-spnjm\" (UniqueName: \"kubernetes.io/projected/4eb94a89-6161-49aa-9afd-b9b25ad03454-kube-api-access-spnjm\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\""
Apr 20 15:04:55.813459 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:55.813459 2574 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4eb94a89-6161-49aa-9afd-b9b25ad03454-util\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\""
Apr 20 15:04:55.813587 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:55.813469 2574 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4eb94a89-6161-49aa-9afd-b9b25ad03454-bundle\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\""
Apr 20 15:04:56.495814 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:56.495776 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2" event={"ID":"4eb94a89-6161-49aa-9afd-b9b25ad03454","Type":"ContainerDied","Data":"efb71d8afeeacd1729e93a76cb66cec3ac4208093b6b0f16a65394ba91127bb2"}
Apr 20 15:04:56.495814 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:56.495812 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efb71d8afeeacd1729e93a76cb66cec3ac4208093b6b0f16a65394ba91127bb2"
Apr 20 15:04:56.496166 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:04:56.495832 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7471e62b3b524e5f63095613ed1aa2d2aa2beb9d9bc7600d699dae1350ltkr2"
Apr 20 15:05:17.744881 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:17.744847 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 20 15:05:17.745394 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:17.745165 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4eb94a89-6161-49aa-9afd-b9b25ad03454" containerName="extract"
Apr 20 15:05:17.745394 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:17.745178 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb94a89-6161-49aa-9afd-b9b25ad03454" containerName="extract"
Apr 20 15:05:17.745394 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:17.745191 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4eb94a89-6161-49aa-9afd-b9b25ad03454" containerName="util"
Apr 20 15:05:17.745394 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:17.745197 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb94a89-6161-49aa-9afd-b9b25ad03454" containerName="util"
Apr 20 15:05:17.745394 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:17.745214 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4eb94a89-6161-49aa-9afd-b9b25ad03454" containerName="pull"
Apr 20 15:05:17.745394 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:17.745219 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eb94a89-6161-49aa-9afd-b9b25ad03454" containerName="pull"
Apr 20 15:05:17.745394 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:17.745273 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="4eb94a89-6161-49aa-9afd-b9b25ad03454" containerName="extract"
Apr 20 15:05:17.747067 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:17.747052 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 20 15:05:17.749730 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:17.749707 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\""
Apr 20 15:05:17.749730 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:17.749722 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-2tl7r\""
Apr 20 15:05:17.749915 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:17.749775 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\""
Apr 20 15:05:17.750763 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:17.750751 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\""
Apr 20 15:05:17.756551 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:17.756530 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 20 15:05:17.789547 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:17.789519 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ghhb\" (UniqueName: \"kubernetes.io/projected/faeff5e0-bbff-4190-8db9-d426c24a1afb-kube-api-access-7ghhb\") pod \"maas-keycloak-0\" (UID: \"faeff5e0-bbff-4190-8db9-d426c24a1afb\") " pod="keycloak-system/maas-keycloak-0"
Apr 20 15:05:17.890109 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:17.890075 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7ghhb\" (UniqueName: \"kubernetes.io/projected/faeff5e0-bbff-4190-8db9-d426c24a1afb-kube-api-access-7ghhb\") pod \"maas-keycloak-0\" (UID: \"faeff5e0-bbff-4190-8db9-d426c24a1afb\") " pod="keycloak-system/maas-keycloak-0"
Apr 20 15:05:17.899163 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:17.899134 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ghhb\" (UniqueName: \"kubernetes.io/projected/faeff5e0-bbff-4190-8db9-d426c24a1afb-kube-api-access-7ghhb\") pod \"maas-keycloak-0\" (UID: \"faeff5e0-bbff-4190-8db9-d426c24a1afb\") " pod="keycloak-system/maas-keycloak-0"
Apr 20 15:05:18.057894 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:18.057859 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0"
Apr 20 15:05:18.179044 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:18.179014 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"]
Apr 20 15:05:18.182454 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:05:18.182296 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfaeff5e0_bbff_4190_8db9_d426c24a1afb.slice/crio-07958ed0bf7e172258854f1e98595de8ab39eb5a4e89280416187cbcbebfa2c4 WatchSource:0}: Error finding container 07958ed0bf7e172258854f1e98595de8ab39eb5a4e89280416187cbcbebfa2c4: Status 404 returned error can't find the container with id 07958ed0bf7e172258854f1e98595de8ab39eb5a4e89280416187cbcbebfa2c4
Apr 20 15:05:18.576338 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:18.576306 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"faeff5e0-bbff-4190-8db9-d426c24a1afb","Type":"ContainerStarted","Data":"07958ed0bf7e172258854f1e98595de8ab39eb5a4e89280416187cbcbebfa2c4"}
Apr 20 15:05:22.136387 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:22.136324 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4z848_d54d05c4-b074-4189-b1dd-7ff476b824ec/console-operator/1.log"
Apr 20 15:05:22.136841 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:22.136400 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4z848_d54d05c4-b074-4189-b1dd-7ff476b824ec/console-operator/1.log"
Apr 20 15:05:23.072505 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:23.072360 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\""
Apr 20 15:05:23.602684 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:23.602644 2574 kubelet.go:2569] "SyncLoop (PLEG): event
for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"faeff5e0-bbff-4190-8db9-d426c24a1afb","Type":"ContainerStarted","Data":"db2accf8f151e007b26cbd561809146ca2244b5ffa094f0994264194e56547cc"} Apr 20 15:05:23.623075 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:23.623015 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=1.737348104 podStartE2EDuration="6.622995382s" podCreationTimestamp="2026-04-20 15:05:17 +0000 UTC" firstStartedPulling="2026-04-20 15:05:18.183563487 +0000 UTC m=+597.256839095" lastFinishedPulling="2026-04-20 15:05:23.069210765 +0000 UTC m=+602.142486373" observedRunningTime="2026-04-20 15:05:23.619515614 +0000 UTC m=+602.692791254" watchObservedRunningTime="2026-04-20 15:05:23.622995382 +0000 UTC m=+602.696271009" Apr 20 15:05:24.058692 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:24.058650 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0" Apr 20 15:05:24.060572 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:24.060506 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="faeff5e0-bbff-4190-8db9-d426c24a1afb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.45:9000/health/started\": dial tcp 10.134.0.45:9000: connect: connection refused" Apr 20 15:05:25.059207 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:25.059165 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="faeff5e0-bbff-4190-8db9-d426c24a1afb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.45:9000/health/started\": dial tcp 10.134.0.45:9000: connect: connection refused" Apr 20 15:05:26.059135 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:26.059087 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" 
podUID="faeff5e0-bbff-4190-8db9-d426c24a1afb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.45:9000/health/started\": dial tcp 10.134.0.45:9000: connect: connection refused" Apr 20 15:05:27.059090 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:27.059035 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="faeff5e0-bbff-4190-8db9-d426c24a1afb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.45:9000/health/started\": dial tcp 10.134.0.45:9000: connect: connection refused" Apr 20 15:05:28.058233 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:28.058191 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0" Apr 20 15:05:28.059603 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:28.059570 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="faeff5e0-bbff-4190-8db9-d426c24a1afb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.45:9000/health/started\": dial tcp 10.134.0.45:9000: connect: connection refused" Apr 20 15:05:29.059107 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:29.059047 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="faeff5e0-bbff-4190-8db9-d426c24a1afb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.45:9000/health/started\": dial tcp 10.134.0.45:9000: connect: connection refused" Apr 20 15:05:30.059143 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:30.059091 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="faeff5e0-bbff-4190-8db9-d426c24a1afb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.45:9000/health/started\": dial tcp 10.134.0.45:9000: connect: connection refused" Apr 20 15:05:31.059271 ip-10-0-130-249 kubenswrapper[2574]: I0420 
15:05:31.059213 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="faeff5e0-bbff-4190-8db9-d426c24a1afb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.45:9000/health/started\": dial tcp 10.134.0.45:9000: connect: connection refused" Apr 20 15:05:32.058273 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:32.058232 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="faeff5e0-bbff-4190-8db9-d426c24a1afb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.45:9000/health/started\": dial tcp 10.134.0.45:9000: connect: connection refused" Apr 20 15:05:33.059154 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:33.059107 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="faeff5e0-bbff-4190-8db9-d426c24a1afb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.45:9000/health/started\": dial tcp 10.134.0.45:9000: connect: connection refused" Apr 20 15:05:34.059173 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:34.059125 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="faeff5e0-bbff-4190-8db9-d426c24a1afb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.45:9000/health/started\": dial tcp 10.134.0.45:9000: connect: connection refused" Apr 20 15:05:35.058771 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:35.058730 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="faeff5e0-bbff-4190-8db9-d426c24a1afb" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.45:9000/health/started\": dial tcp 10.134.0.45:9000: connect: connection refused" Apr 20 15:05:36.172034 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:36.171996 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="keycloak-system/maas-keycloak-0" Apr 20 15:05:36.191538 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:36.191497 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="faeff5e0-bbff-4190-8db9-d426c24a1afb" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:05:46.179159 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:46.179076 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0" Apr 20 15:05:47.353067 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:47.353028 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-6bcf75bcf9-c2czd"] Apr 20 15:05:47.358648 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:47.358625 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6bcf75bcf9-c2czd" Apr 20 15:05:47.361557 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:47.361533 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 20 15:05:47.361667 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:47.361627 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-gn9x5\"" Apr 20 15:05:47.361913 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:47.361898 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 20 15:05:47.369081 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:47.369059 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6bcf75bcf9-c2czd"] Apr 20 15:05:47.455919 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:47.455888 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: 
\"kubernetes.io/secret/8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da-maas-api-tls\") pod \"maas-api-6bcf75bcf9-c2czd\" (UID: \"8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da\") " pod="opendatahub/maas-api-6bcf75bcf9-c2czd" Apr 20 15:05:47.456101 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:47.455974 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvwxl\" (UniqueName: \"kubernetes.io/projected/8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da-kube-api-access-cvwxl\") pod \"maas-api-6bcf75bcf9-c2czd\" (UID: \"8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da\") " pod="opendatahub/maas-api-6bcf75bcf9-c2czd" Apr 20 15:05:47.557067 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:47.557030 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da-maas-api-tls\") pod \"maas-api-6bcf75bcf9-c2czd\" (UID: \"8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da\") " pod="opendatahub/maas-api-6bcf75bcf9-c2czd" Apr 20 15:05:47.557253 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:47.557219 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvwxl\" (UniqueName: \"kubernetes.io/projected/8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da-kube-api-access-cvwxl\") pod \"maas-api-6bcf75bcf9-c2czd\" (UID: \"8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da\") " pod="opendatahub/maas-api-6bcf75bcf9-c2czd" Apr 20 15:05:47.559548 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:47.559526 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da-maas-api-tls\") pod \"maas-api-6bcf75bcf9-c2czd\" (UID: \"8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da\") " pod="opendatahub/maas-api-6bcf75bcf9-c2czd" Apr 20 15:05:47.564905 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:47.564883 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cvwxl\" (UniqueName: \"kubernetes.io/projected/8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da-kube-api-access-cvwxl\") pod \"maas-api-6bcf75bcf9-c2czd\" (UID: \"8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da\") " pod="opendatahub/maas-api-6bcf75bcf9-c2czd" Apr 20 15:05:47.669931 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:47.669837 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6bcf75bcf9-c2czd" Apr 20 15:05:47.792032 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:47.792007 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6bcf75bcf9-c2czd"] Apr 20 15:05:47.793570 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:05:47.793545 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c250dc9_e9ec_4dfb_9156_eb2b7dfee6da.slice/crio-5148b85d9da99105d4af200b351e92e791cc0b2232b350ae518b79825695a236 WatchSource:0}: Error finding container 5148b85d9da99105d4af200b351e92e791cc0b2232b350ae518b79825695a236: Status 404 returned error can't find the container with id 5148b85d9da99105d4af200b351e92e791cc0b2232b350ae518b79825695a236 Apr 20 15:05:47.794717 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:47.794699 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:05:48.104076 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.104045 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-p4nc8"] Apr 20 15:05:48.106999 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.106982 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-p4nc8" Apr 20 15:05:48.114446 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.114424 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-p4nc8"] Apr 20 15:05:48.262976 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.262945 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tvpx\" (UniqueName: \"kubernetes.io/projected/36c8e217-c8bd-42db-8544-166a3b9b9d7b-kube-api-access-2tvpx\") pod \"authorino-8b475cf9f-p4nc8\" (UID: \"36c8e217-c8bd-42db-8544-166a3b9b9d7b\") " pod="kuadrant-system/authorino-8b475cf9f-p4nc8" Apr 20 15:05:48.356152 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.356076 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-p4nc8"] Apr 20 15:05:48.356587 ip-10-0-130-249 kubenswrapper[2574]: E0420 15:05:48.356364 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-2tvpx], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-p4nc8" podUID="36c8e217-c8bd-42db-8544-166a3b9b9d7b" Apr 20 15:05:48.363766 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.363736 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2tvpx\" (UniqueName: \"kubernetes.io/projected/36c8e217-c8bd-42db-8544-166a3b9b9d7b-kube-api-access-2tvpx\") pod \"authorino-8b475cf9f-p4nc8\" (UID: \"36c8e217-c8bd-42db-8544-166a3b9b9d7b\") " pod="kuadrant-system/authorino-8b475cf9f-p4nc8" Apr 20 15:05:48.377031 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.377000 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tvpx\" (UniqueName: \"kubernetes.io/projected/36c8e217-c8bd-42db-8544-166a3b9b9d7b-kube-api-access-2tvpx\") pod \"authorino-8b475cf9f-p4nc8\" (UID: 
\"36c8e217-c8bd-42db-8544-166a3b9b9d7b\") " pod="kuadrant-system/authorino-8b475cf9f-p4nc8" Apr 20 15:05:48.382472 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.382447 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7695df8b6f-22glh"] Apr 20 15:05:48.385074 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.385051 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7695df8b6f-22glh" Apr 20 15:05:48.391361 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.391334 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7695df8b6f-22glh"] Apr 20 15:05:48.565162 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.565123 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrc2c\" (UniqueName: \"kubernetes.io/projected/f9fc77fc-c7f4-4f29-9680-8277eec2581e-kube-api-access-wrc2c\") pod \"authorino-7695df8b6f-22glh\" (UID: \"f9fc77fc-c7f4-4f29-9680-8277eec2581e\") " pod="kuadrant-system/authorino-7695df8b6f-22glh" Apr 20 15:05:48.622300 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.622224 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7695df8b6f-22glh"] Apr 20 15:05:48.622513 ip-10-0-130-249 kubenswrapper[2574]: E0420 15:05:48.622494 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-wrc2c], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-7695df8b6f-22glh" podUID="f9fc77fc-c7f4-4f29-9680-8277eec2581e" Apr 20 15:05:48.649712 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.649219 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-78f4797949-f64b4"] Apr 20 15:05:48.652298 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.652271 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-78f4797949-f64b4" Apr 20 15:05:48.655295 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.655272 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 20 15:05:48.660147 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.660111 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-78f4797949-f64b4"] Apr 20 15:05:48.666224 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.666198 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrc2c\" (UniqueName: \"kubernetes.io/projected/f9fc77fc-c7f4-4f29-9680-8277eec2581e-kube-api-access-wrc2c\") pod \"authorino-7695df8b6f-22glh\" (UID: \"f9fc77fc-c7f4-4f29-9680-8277eec2581e\") " pod="kuadrant-system/authorino-7695df8b6f-22glh" Apr 20 15:05:48.674882 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.674834 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrc2c\" (UniqueName: \"kubernetes.io/projected/f9fc77fc-c7f4-4f29-9680-8277eec2581e-kube-api-access-wrc2c\") pod \"authorino-7695df8b6f-22glh\" (UID: \"f9fc77fc-c7f4-4f29-9680-8277eec2581e\") " pod="kuadrant-system/authorino-7695df8b6f-22glh" Apr 20 15:05:48.713753 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.713696 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6bcf75bcf9-c2czd" event={"ID":"8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da","Type":"ContainerStarted","Data":"5148b85d9da99105d4af200b351e92e791cc0b2232b350ae518b79825695a236"} Apr 20 15:05:48.713753 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.713726 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-p4nc8" Apr 20 15:05:48.713946 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.713726 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7695df8b6f-22glh" Apr 20 15:05:48.719321 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.719300 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-p4nc8" Apr 20 15:05:48.723163 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.723145 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7695df8b6f-22glh" Apr 20 15:05:48.767464 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.767438 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47fgd\" (UniqueName: \"kubernetes.io/projected/830492f2-f456-4864-8b6b-cd72fc531958-kube-api-access-47fgd\") pod \"authorino-78f4797949-f64b4\" (UID: \"830492f2-f456-4864-8b6b-cd72fc531958\") " pod="kuadrant-system/authorino-78f4797949-f64b4" Apr 20 15:05:48.767592 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.767482 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/830492f2-f456-4864-8b6b-cd72fc531958-tls-cert\") pod \"authorino-78f4797949-f64b4\" (UID: \"830492f2-f456-4864-8b6b-cd72fc531958\") " pod="kuadrant-system/authorino-78f4797949-f64b4" Apr 20 15:05:48.868233 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.868205 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tvpx\" (UniqueName: \"kubernetes.io/projected/36c8e217-c8bd-42db-8544-166a3b9b9d7b-kube-api-access-2tvpx\") pod \"36c8e217-c8bd-42db-8544-166a3b9b9d7b\" (UID: \"36c8e217-c8bd-42db-8544-166a3b9b9d7b\") " Apr 20 15:05:48.868419 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.868326 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrc2c\" (UniqueName: 
\"kubernetes.io/projected/f9fc77fc-c7f4-4f29-9680-8277eec2581e-kube-api-access-wrc2c\") pod \"f9fc77fc-c7f4-4f29-9680-8277eec2581e\" (UID: \"f9fc77fc-c7f4-4f29-9680-8277eec2581e\") " Apr 20 15:05:48.868568 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.868547 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47fgd\" (UniqueName: \"kubernetes.io/projected/830492f2-f456-4864-8b6b-cd72fc531958-kube-api-access-47fgd\") pod \"authorino-78f4797949-f64b4\" (UID: \"830492f2-f456-4864-8b6b-cd72fc531958\") " pod="kuadrant-system/authorino-78f4797949-f64b4" Apr 20 15:05:48.868662 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.868594 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/830492f2-f456-4864-8b6b-cd72fc531958-tls-cert\") pod \"authorino-78f4797949-f64b4\" (UID: \"830492f2-f456-4864-8b6b-cd72fc531958\") " pod="kuadrant-system/authorino-78f4797949-f64b4" Apr 20 15:05:48.870787 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.870747 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9fc77fc-c7f4-4f29-9680-8277eec2581e-kube-api-access-wrc2c" (OuterVolumeSpecName: "kube-api-access-wrc2c") pod "f9fc77fc-c7f4-4f29-9680-8277eec2581e" (UID: "f9fc77fc-c7f4-4f29-9680-8277eec2581e"). InnerVolumeSpecName "kube-api-access-wrc2c". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:05:48.870899 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.870795 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c8e217-c8bd-42db-8544-166a3b9b9d7b-kube-api-access-2tvpx" (OuterVolumeSpecName: "kube-api-access-2tvpx") pod "36c8e217-c8bd-42db-8544-166a3b9b9d7b" (UID: "36c8e217-c8bd-42db-8544-166a3b9b9d7b"). InnerVolumeSpecName "kube-api-access-2tvpx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:05:48.871533 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.871510 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/830492f2-f456-4864-8b6b-cd72fc531958-tls-cert\") pod \"authorino-78f4797949-f64b4\" (UID: \"830492f2-f456-4864-8b6b-cd72fc531958\") " pod="kuadrant-system/authorino-78f4797949-f64b4" Apr 20 15:05:48.878277 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.878201 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47fgd\" (UniqueName: \"kubernetes.io/projected/830492f2-f456-4864-8b6b-cd72fc531958-kube-api-access-47fgd\") pod \"authorino-78f4797949-f64b4\" (UID: \"830492f2-f456-4864-8b6b-cd72fc531958\") " pod="kuadrant-system/authorino-78f4797949-f64b4" Apr 20 15:05:48.965997 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.965961 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-78f4797949-f64b4" Apr 20 15:05:48.970093 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.970059 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wrc2c\" (UniqueName: \"kubernetes.io/projected/f9fc77fc-c7f4-4f29-9680-8277eec2581e-kube-api-access-wrc2c\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:05:48.970093 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:48.970091 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2tvpx\" (UniqueName: \"kubernetes.io/projected/36c8e217-c8bd-42db-8544-166a3b9b9d7b-kube-api-access-2tvpx\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:05:49.717593 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:49.717521 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-p4nc8" Apr 20 15:05:49.718008 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:49.717521 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7695df8b6f-22glh" Apr 20 15:05:49.748177 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:49.748142 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-p4nc8"] Apr 20 15:05:49.751841 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:49.751812 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-p4nc8"] Apr 20 15:05:49.769330 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:49.769300 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7695df8b6f-22glh"] Apr 20 15:05:49.771662 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:49.771642 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7695df8b6f-22glh"] Apr 20 15:05:50.072920 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:50.072899 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-78f4797949-f64b4"] Apr 20 15:05:50.073786 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:05:50.073749 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod830492f2_f456_4864_8b6b_cd72fc531958.slice/crio-9cb3addaccdb03ebc059c34e87d5006aca80a0997dec27ff8847295c22355e5c WatchSource:0}: Error finding container 9cb3addaccdb03ebc059c34e87d5006aca80a0997dec27ff8847295c22355e5c: Status 404 returned error can't find the container with id 9cb3addaccdb03ebc059c34e87d5006aca80a0997dec27ff8847295c22355e5c Apr 20 15:05:50.722381 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:50.722281 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6bcf75bcf9-c2czd" 
event={"ID":"8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da","Type":"ContainerStarted","Data":"7bd751cbac45f92a2a86921cb3c194ae8e131958e92f12a6a47e14e86e262075"} Apr 20 15:05:50.722814 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:50.722421 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-6bcf75bcf9-c2czd" Apr 20 15:05:50.723768 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:50.723747 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-78f4797949-f64b4" event={"ID":"830492f2-f456-4864-8b6b-cd72fc531958","Type":"ContainerStarted","Data":"b90bce06484b31f9208e9f6e0ce66eaa41e7a4d261fd8f7353575db972c66556"} Apr 20 15:05:50.723835 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:50.723773 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-78f4797949-f64b4" event={"ID":"830492f2-f456-4864-8b6b-cd72fc531958","Type":"ContainerStarted","Data":"9cb3addaccdb03ebc059c34e87d5006aca80a0997dec27ff8847295c22355e5c"} Apr 20 15:05:50.739593 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:50.739553 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-6bcf75bcf9-c2czd" podStartSLOduration=1.547302262 podStartE2EDuration="3.739542548s" podCreationTimestamp="2026-04-20 15:05:47 +0000 UTC" firstStartedPulling="2026-04-20 15:05:47.794815291 +0000 UTC m=+626.868090893" lastFinishedPulling="2026-04-20 15:05:49.987055575 +0000 UTC m=+629.060331179" observedRunningTime="2026-04-20 15:05:50.737436493 +0000 UTC m=+629.810712119" watchObservedRunningTime="2026-04-20 15:05:50.739542548 +0000 UTC m=+629.812818219" Apr 20 15:05:50.751221 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:50.751178 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-78f4797949-f64b4" podStartSLOduration=2.422669885 podStartE2EDuration="2.751166648s" podCreationTimestamp="2026-04-20 15:05:48 +0000 UTC" 
firstStartedPulling="2026-04-20 15:05:50.074998968 +0000 UTC m=+629.148274571" lastFinishedPulling="2026-04-20 15:05:50.403495725 +0000 UTC m=+629.476771334" observedRunningTime="2026-04-20 15:05:50.75023104 +0000 UTC m=+629.823506665" watchObservedRunningTime="2026-04-20 15:05:50.751166648 +0000 UTC m=+629.824442272" Apr 20 15:05:50.777648 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:50.777620 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-wfdxh"] Apr 20 15:05:50.777822 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:50.777799 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-wfdxh" podUID="782781bb-98bf-420a-8d4e-b5704d0a3f09" containerName="authorino" containerID="cri-o://8032dae31f6791241fb98813319a94ba904990ee5752f743838c9d14971ff82d" gracePeriod=30 Apr 20 15:05:51.013832 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:51.013812 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-wfdxh" Apr 20 15:05:51.189559 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:51.189530 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs4md\" (UniqueName: \"kubernetes.io/projected/782781bb-98bf-420a-8d4e-b5704d0a3f09-kube-api-access-gs4md\") pod \"782781bb-98bf-420a-8d4e-b5704d0a3f09\" (UID: \"782781bb-98bf-420a-8d4e-b5704d0a3f09\") " Apr 20 15:05:51.191601 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:51.191569 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/782781bb-98bf-420a-8d4e-b5704d0a3f09-kube-api-access-gs4md" (OuterVolumeSpecName: "kube-api-access-gs4md") pod "782781bb-98bf-420a-8d4e-b5704d0a3f09" (UID: "782781bb-98bf-420a-8d4e-b5704d0a3f09"). InnerVolumeSpecName "kube-api-access-gs4md". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:05:51.291153 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:51.291078 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gs4md\" (UniqueName: \"kubernetes.io/projected/782781bb-98bf-420a-8d4e-b5704d0a3f09-kube-api-access-gs4md\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:05:51.448028 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:51.447993 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36c8e217-c8bd-42db-8544-166a3b9b9d7b" path="/var/lib/kubelet/pods/36c8e217-c8bd-42db-8544-166a3b9b9d7b/volumes" Apr 20 15:05:51.448356 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:51.448337 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9fc77fc-c7f4-4f29-9680-8277eec2581e" path="/var/lib/kubelet/pods/f9fc77fc-c7f4-4f29-9680-8277eec2581e/volumes" Apr 20 15:05:51.728509 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:51.728474 2574 generic.go:358] "Generic (PLEG): container finished" podID="782781bb-98bf-420a-8d4e-b5704d0a3f09" containerID="8032dae31f6791241fb98813319a94ba904990ee5752f743838c9d14971ff82d" exitCode=0 Apr 20 15:05:51.728963 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:51.728526 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-wfdxh" Apr 20 15:05:51.728963 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:51.728561 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-wfdxh" event={"ID":"782781bb-98bf-420a-8d4e-b5704d0a3f09","Type":"ContainerDied","Data":"8032dae31f6791241fb98813319a94ba904990ee5752f743838c9d14971ff82d"} Apr 20 15:05:51.728963 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:51.728593 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-wfdxh" event={"ID":"782781bb-98bf-420a-8d4e-b5704d0a3f09","Type":"ContainerDied","Data":"870278c27ab7806d9b436ce49e7a4630c97036f6571ad7d1eebd822a2db0a77c"} Apr 20 15:05:51.728963 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:51.728607 2574 scope.go:117] "RemoveContainer" containerID="8032dae31f6791241fb98813319a94ba904990ee5752f743838c9d14971ff82d" Apr 20 15:05:51.737817 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:51.737796 2574 scope.go:117] "RemoveContainer" containerID="8032dae31f6791241fb98813319a94ba904990ee5752f743838c9d14971ff82d" Apr 20 15:05:51.738082 ip-10-0-130-249 kubenswrapper[2574]: E0420 15:05:51.738050 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8032dae31f6791241fb98813319a94ba904990ee5752f743838c9d14971ff82d\": container with ID starting with 8032dae31f6791241fb98813319a94ba904990ee5752f743838c9d14971ff82d not found: ID does not exist" containerID="8032dae31f6791241fb98813319a94ba904990ee5752f743838c9d14971ff82d" Apr 20 15:05:51.738133 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:51.738093 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8032dae31f6791241fb98813319a94ba904990ee5752f743838c9d14971ff82d"} err="failed to get container status \"8032dae31f6791241fb98813319a94ba904990ee5752f743838c9d14971ff82d\": rpc error: code = 
NotFound desc = could not find container \"8032dae31f6791241fb98813319a94ba904990ee5752f743838c9d14971ff82d\": container with ID starting with 8032dae31f6791241fb98813319a94ba904990ee5752f743838c9d14971ff82d not found: ID does not exist" Apr 20 15:05:51.751991 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:51.751962 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-wfdxh"] Apr 20 15:05:51.758122 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:51.758097 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-wfdxh"] Apr 20 15:05:53.447384 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:53.447340 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="782781bb-98bf-420a-8d4e-b5704d0a3f09" path="/var/lib/kubelet/pods/782781bb-98bf-420a-8d4e-b5704d0a3f09/volumes" Apr 20 15:05:56.735230 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:56.735200 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-6bcf75bcf9-c2czd" Apr 20 15:05:58.396837 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:58.396793 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-6bcf75bcf9-c2czd"] Apr 20 15:05:58.397255 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:58.397012 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-6bcf75bcf9-c2czd" podUID="8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da" containerName="maas-api" containerID="cri-o://7bd751cbac45f92a2a86921cb3c194ae8e131958e92f12a6a47e14e86e262075" gracePeriod=30 Apr 20 15:05:58.640504 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:58.640477 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-6bcf75bcf9-c2czd" Apr 20 15:05:58.648296 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:58.648243 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da-maas-api-tls\") pod \"8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da\" (UID: \"8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da\") " Apr 20 15:05:58.648296 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:58.648278 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvwxl\" (UniqueName: \"kubernetes.io/projected/8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da-kube-api-access-cvwxl\") pod \"8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da\" (UID: \"8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da\") " Apr 20 15:05:58.650388 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:58.650347 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da" (UID: "8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:05:58.650479 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:58.650425 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da-kube-api-access-cvwxl" (OuterVolumeSpecName: "kube-api-access-cvwxl") pod "8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da" (UID: "8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da"). InnerVolumeSpecName "kube-api-access-cvwxl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:05:58.749588 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:58.749559 2574 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da-maas-api-tls\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:05:58.749588 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:58.749583 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cvwxl\" (UniqueName: \"kubernetes.io/projected/8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da-kube-api-access-cvwxl\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:05:58.757333 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:58.757305 2574 generic.go:358] "Generic (PLEG): container finished" podID="8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da" containerID="7bd751cbac45f92a2a86921cb3c194ae8e131958e92f12a6a47e14e86e262075" exitCode=0 Apr 20 15:05:58.757487 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:58.757387 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-6bcf75bcf9-c2czd" Apr 20 15:05:58.757487 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:58.757398 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6bcf75bcf9-c2czd" event={"ID":"8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da","Type":"ContainerDied","Data":"7bd751cbac45f92a2a86921cb3c194ae8e131958e92f12a6a47e14e86e262075"} Apr 20 15:05:58.757487 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:58.757440 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6bcf75bcf9-c2czd" event={"ID":"8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da","Type":"ContainerDied","Data":"5148b85d9da99105d4af200b351e92e791cc0b2232b350ae518b79825695a236"} Apr 20 15:05:58.757487 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:58.757457 2574 scope.go:117] "RemoveContainer" containerID="7bd751cbac45f92a2a86921cb3c194ae8e131958e92f12a6a47e14e86e262075" Apr 20 15:05:58.766472 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:58.766454 2574 scope.go:117] "RemoveContainer" containerID="7bd751cbac45f92a2a86921cb3c194ae8e131958e92f12a6a47e14e86e262075" Apr 20 15:05:58.766730 ip-10-0-130-249 kubenswrapper[2574]: E0420 15:05:58.766712 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bd751cbac45f92a2a86921cb3c194ae8e131958e92f12a6a47e14e86e262075\": container with ID starting with 7bd751cbac45f92a2a86921cb3c194ae8e131958e92f12a6a47e14e86e262075 not found: ID does not exist" containerID="7bd751cbac45f92a2a86921cb3c194ae8e131958e92f12a6a47e14e86e262075" Apr 20 15:05:58.766811 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:58.766736 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bd751cbac45f92a2a86921cb3c194ae8e131958e92f12a6a47e14e86e262075"} err="failed to get container status \"7bd751cbac45f92a2a86921cb3c194ae8e131958e92f12a6a47e14e86e262075\": rpc error: code = NotFound desc = 
could not find container \"7bd751cbac45f92a2a86921cb3c194ae8e131958e92f12a6a47e14e86e262075\": container with ID starting with 7bd751cbac45f92a2a86921cb3c194ae8e131958e92f12a6a47e14e86e262075 not found: ID does not exist" Apr 20 15:05:58.779240 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:58.779214 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-6bcf75bcf9-c2czd"] Apr 20 15:05:58.782935 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:58.782912 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-6bcf75bcf9-c2czd"] Apr 20 15:05:59.448084 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:05:59.448047 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da" path="/var/lib/kubelet/pods/8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da/volumes" Apr 20 15:06:03.644419 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:03.644382 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-7dff657567-f472w"] Apr 20 15:06:03.644861 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:03.644680 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="782781bb-98bf-420a-8d4e-b5704d0a3f09" containerName="authorino" Apr 20 15:06:03.644861 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:03.644690 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="782781bb-98bf-420a-8d4e-b5704d0a3f09" containerName="authorino" Apr 20 15:06:03.644861 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:03.644707 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da" containerName="maas-api" Apr 20 15:06:03.644861 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:03.644713 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da" containerName="maas-api" Apr 20 15:06:03.644861 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:03.644770 2574 
memory_manager.go:356] "RemoveStaleState removing state" podUID="8c250dc9-e9ec-4dfb-9156-eb2b7dfee6da" containerName="maas-api" Apr 20 15:06:03.644861 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:03.644779 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="782781bb-98bf-420a-8d4e-b5704d0a3f09" containerName="authorino" Apr 20 15:06:03.648885 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:03.648864 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7dff657567-f472w" Apr 20 15:06:03.651383 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:03.651354 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-vbgkq\"" Apr 20 15:06:03.656361 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:03.656327 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7dff657567-f472w"] Apr 20 15:06:03.688251 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:03.688219 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfsbj\" (UniqueName: \"kubernetes.io/projected/5e73bc52-3bcc-4604-a7db-a9a6717eaba2-kube-api-access-sfsbj\") pod \"maas-controller-7dff657567-f472w\" (UID: \"5e73bc52-3bcc-4604-a7db-a9a6717eaba2\") " pod="opendatahub/maas-controller-7dff657567-f472w" Apr 20 15:06:03.788635 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:03.788606 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sfsbj\" (UniqueName: \"kubernetes.io/projected/5e73bc52-3bcc-4604-a7db-a9a6717eaba2-kube-api-access-sfsbj\") pod \"maas-controller-7dff657567-f472w\" (UID: \"5e73bc52-3bcc-4604-a7db-a9a6717eaba2\") " pod="opendatahub/maas-controller-7dff657567-f472w" Apr 20 15:06:03.799207 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:03.799175 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sfsbj\" (UniqueName: \"kubernetes.io/projected/5e73bc52-3bcc-4604-a7db-a9a6717eaba2-kube-api-access-sfsbj\") pod \"maas-controller-7dff657567-f472w\" (UID: \"5e73bc52-3bcc-4604-a7db-a9a6717eaba2\") " pod="opendatahub/maas-controller-7dff657567-f472w" Apr 20 15:06:03.960118 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:03.960050 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-7dff657567-f472w" Apr 20 15:06:04.082857 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:04.082814 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-7dff657567-f472w"] Apr 20 15:06:04.084427 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:06:04.084404 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e73bc52_3bcc_4604_a7db_a9a6717eaba2.slice/crio-eb1532e0a0925eca2758a5bc338c3195e9e7921f2b8fc9a0e91096bd461d6ac0 WatchSource:0}: Error finding container eb1532e0a0925eca2758a5bc338c3195e9e7921f2b8fc9a0e91096bd461d6ac0: Status 404 returned error can't find the container with id eb1532e0a0925eca2758a5bc338c3195e9e7921f2b8fc9a0e91096bd461d6ac0 Apr 20 15:06:04.782806 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:04.782767 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7dff657567-f472w" event={"ID":"5e73bc52-3bcc-4604-a7db-a9a6717eaba2","Type":"ContainerStarted","Data":"eb1532e0a0925eca2758a5bc338c3195e9e7921f2b8fc9a0e91096bd461d6ac0"} Apr 20 15:06:05.787869 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:05.787828 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-7dff657567-f472w" event={"ID":"5e73bc52-3bcc-4604-a7db-a9a6717eaba2","Type":"ContainerStarted","Data":"f2a1897371bb012845919e24771df66a2740952bc77761d970f3f5d15321ee89"} Apr 20 15:06:05.788223 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:05.787877 2574 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-7dff657567-f472w" Apr 20 15:06:05.805286 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:05.805243 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-7dff657567-f472w" podStartSLOduration=1.600410441 podStartE2EDuration="2.805228914s" podCreationTimestamp="2026-04-20 15:06:03 +0000 UTC" firstStartedPulling="2026-04-20 15:06:04.086004124 +0000 UTC m=+643.159279730" lastFinishedPulling="2026-04-20 15:06:05.290822597 +0000 UTC m=+644.364098203" observedRunningTime="2026-04-20 15:06:05.803009171 +0000 UTC m=+644.876284819" watchObservedRunningTime="2026-04-20 15:06:05.805228914 +0000 UTC m=+644.878504538" Apr 20 15:06:16.797009 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:16.796974 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-7dff657567-f472w" Apr 20 15:06:17.719963 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:17.719933 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 20 15:06:17.720179 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:17.720152 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="keycloak-system/maas-keycloak-0" podUID="faeff5e0-bbff-4190-8db9-d426c24a1afb" containerName="keycloak" containerID="cri-o://db2accf8f151e007b26cbd561809146ca2244b5ffa094f0994264194e56547cc" gracePeriod=30 Apr 20 15:06:19.756438 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.756419 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 20 15:06:19.825315 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.825287 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ghhb\" (UniqueName: \"kubernetes.io/projected/faeff5e0-bbff-4190-8db9-d426c24a1afb-kube-api-access-7ghhb\") pod \"faeff5e0-bbff-4190-8db9-d426c24a1afb\" (UID: \"faeff5e0-bbff-4190-8db9-d426c24a1afb\") " Apr 20 15:06:19.827396 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.827351 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faeff5e0-bbff-4190-8db9-d426c24a1afb-kube-api-access-7ghhb" (OuterVolumeSpecName: "kube-api-access-7ghhb") pod "faeff5e0-bbff-4190-8db9-d426c24a1afb" (UID: "faeff5e0-bbff-4190-8db9-d426c24a1afb"). InnerVolumeSpecName "kube-api-access-7ghhb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:06:19.847566 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.847538 2574 generic.go:358] "Generic (PLEG): container finished" podID="faeff5e0-bbff-4190-8db9-d426c24a1afb" containerID="db2accf8f151e007b26cbd561809146ca2244b5ffa094f0994264194e56547cc" exitCode=143 Apr 20 15:06:19.847698 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.847619 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 20 15:06:19.847698 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.847626 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"faeff5e0-bbff-4190-8db9-d426c24a1afb","Type":"ContainerDied","Data":"db2accf8f151e007b26cbd561809146ca2244b5ffa094f0994264194e56547cc"} Apr 20 15:06:19.847698 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.847667 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"faeff5e0-bbff-4190-8db9-d426c24a1afb","Type":"ContainerDied","Data":"07958ed0bf7e172258854f1e98595de8ab39eb5a4e89280416187cbcbebfa2c4"} Apr 20 15:06:19.847698 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.847686 2574 scope.go:117] "RemoveContainer" containerID="db2accf8f151e007b26cbd561809146ca2244b5ffa094f0994264194e56547cc" Apr 20 15:06:19.857773 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.857755 2574 scope.go:117] "RemoveContainer" containerID="db2accf8f151e007b26cbd561809146ca2244b5ffa094f0994264194e56547cc" Apr 20 15:06:19.858013 ip-10-0-130-249 kubenswrapper[2574]: E0420 15:06:19.857993 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db2accf8f151e007b26cbd561809146ca2244b5ffa094f0994264194e56547cc\": container with ID starting with db2accf8f151e007b26cbd561809146ca2244b5ffa094f0994264194e56547cc not found: ID does not exist" containerID="db2accf8f151e007b26cbd561809146ca2244b5ffa094f0994264194e56547cc" Apr 20 15:06:19.858107 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.858018 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db2accf8f151e007b26cbd561809146ca2244b5ffa094f0994264194e56547cc"} err="failed to get container status \"db2accf8f151e007b26cbd561809146ca2244b5ffa094f0994264194e56547cc\": rpc error: code = NotFound desc = could not find 
container \"db2accf8f151e007b26cbd561809146ca2244b5ffa094f0994264194e56547cc\": container with ID starting with db2accf8f151e007b26cbd561809146ca2244b5ffa094f0994264194e56547cc not found: ID does not exist" Apr 20 15:06:19.870621 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.870596 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 20 15:06:19.872628 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.872606 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 20 15:06:19.892040 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.891989 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 20 15:06:19.892334 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.892321 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="faeff5e0-bbff-4190-8db9-d426c24a1afb" containerName="keycloak" Apr 20 15:06:19.892393 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.892336 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="faeff5e0-bbff-4190-8db9-d426c24a1afb" containerName="keycloak" Apr 20 15:06:19.892432 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.892403 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="faeff5e0-bbff-4190-8db9-d426c24a1afb" containerName="keycloak" Apr 20 15:06:19.896697 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.896681 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 20 15:06:19.899298 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.899280 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\"" Apr 20 15:06:19.899422 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.899282 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\"" Apr 20 15:06:19.899422 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.899312 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"keycloak-test-realms\"" Apr 20 15:06:19.899583 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.899567 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"maas-keycloak-initial-admin\"" Apr 20 15:06:19.899652 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.899606 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"default-dockercfg-2tl7r\"" Apr 20 15:06:19.905738 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.905719 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 20 15:06:19.926174 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:19.926153 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7ghhb\" (UniqueName: \"kubernetes.io/projected/faeff5e0-bbff-4190-8db9-d426c24a1afb-kube-api-access-7ghhb\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:06:20.027337 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:20.027307 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/d3c96496-afe7-42a4-9a0a-bb1186f412fa-test-realms\") pod \"maas-keycloak-0\" (UID: \"d3c96496-afe7-42a4-9a0a-bb1186f412fa\") " 
pod="keycloak-system/maas-keycloak-0" Apr 20 15:06:20.027486 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:20.027413 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt5n6\" (UniqueName: \"kubernetes.io/projected/d3c96496-afe7-42a4-9a0a-bb1186f412fa-kube-api-access-lt5n6\") pod \"maas-keycloak-0\" (UID: \"d3c96496-afe7-42a4-9a0a-bb1186f412fa\") " pod="keycloak-system/maas-keycloak-0" Apr 20 15:06:20.128615 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:20.128593 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt5n6\" (UniqueName: \"kubernetes.io/projected/d3c96496-afe7-42a4-9a0a-bb1186f412fa-kube-api-access-lt5n6\") pod \"maas-keycloak-0\" (UID: \"d3c96496-afe7-42a4-9a0a-bb1186f412fa\") " pod="keycloak-system/maas-keycloak-0" Apr 20 15:06:20.128708 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:20.128636 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/d3c96496-afe7-42a4-9a0a-bb1186f412fa-test-realms\") pod \"maas-keycloak-0\" (UID: \"d3c96496-afe7-42a4-9a0a-bb1186f412fa\") " pod="keycloak-system/maas-keycloak-0" Apr 20 15:06:20.129222 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:20.129206 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"test-realms\" (UniqueName: \"kubernetes.io/configmap/d3c96496-afe7-42a4-9a0a-bb1186f412fa-test-realms\") pod \"maas-keycloak-0\" (UID: \"d3c96496-afe7-42a4-9a0a-bb1186f412fa\") " pod="keycloak-system/maas-keycloak-0" Apr 20 15:06:20.137071 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:20.137045 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt5n6\" (UniqueName: \"kubernetes.io/projected/d3c96496-afe7-42a4-9a0a-bb1186f412fa-kube-api-access-lt5n6\") pod \"maas-keycloak-0\" (UID: \"d3c96496-afe7-42a4-9a0a-bb1186f412fa\") " 
pod="keycloak-system/maas-keycloak-0" Apr 20 15:06:20.207612 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:20.207551 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/maas-keycloak-0" Apr 20 15:06:20.325255 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:20.325224 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/maas-keycloak-0"] Apr 20 15:06:20.326933 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:06:20.326908 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3c96496_afe7_42a4_9a0a_bb1186f412fa.slice/crio-e4b198af2ebb3b9c5ac0c8bebed076c7e0dd8c46c110c92e07ba0b60bb0b3c17 WatchSource:0}: Error finding container e4b198af2ebb3b9c5ac0c8bebed076c7e0dd8c46c110c92e07ba0b60bb0b3c17: Status 404 returned error can't find the container with id e4b198af2ebb3b9c5ac0c8bebed076c7e0dd8c46c110c92e07ba0b60bb0b3c17 Apr 20 15:06:20.855426 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:20.855392 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"d3c96496-afe7-42a4-9a0a-bb1186f412fa","Type":"ContainerStarted","Data":"c57e8d75229330a328c8512c2a87b6bb724c3de7040af781f801ad5c6eaae4e9"} Apr 20 15:06:20.855426 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:20.855429 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/maas-keycloak-0" event={"ID":"d3c96496-afe7-42a4-9a0a-bb1186f412fa","Type":"ContainerStarted","Data":"e4b198af2ebb3b9c5ac0c8bebed076c7e0dd8c46c110c92e07ba0b60bb0b3c17"} Apr 20 15:06:20.874049 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:20.874003 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/maas-keycloak-0" podStartSLOduration=1.545227959 podStartE2EDuration="1.873988451s" podCreationTimestamp="2026-04-20 15:06:19 +0000 UTC" firstStartedPulling="2026-04-20 15:06:20.328659357 +0000 UTC 
m=+659.401934960" lastFinishedPulling="2026-04-20 15:06:20.657419846 +0000 UTC m=+659.730695452" observedRunningTime="2026-04-20 15:06:20.872079457 +0000 UTC m=+659.945355085" watchObservedRunningTime="2026-04-20 15:06:20.873988451 +0000 UTC m=+659.947264075" Apr 20 15:06:21.208302 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:21.208266 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="keycloak-system/maas-keycloak-0" Apr 20 15:06:21.210183 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:21.210148 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d3c96496-afe7-42a4-9a0a-bb1186f412fa" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.51:9000/health/started\": dial tcp 10.134.0.51:9000: connect: connection refused" Apr 20 15:06:21.448897 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:21.448854 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faeff5e0-bbff-4190-8db9-d426c24a1afb" path="/var/lib/kubelet/pods/faeff5e0-bbff-4190-8db9-d426c24a1afb/volumes" Apr 20 15:06:22.208465 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:22.208414 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d3c96496-afe7-42a4-9a0a-bb1186f412fa" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.51:9000/health/started\": dial tcp 10.134.0.51:9000: connect: connection refused" Apr 20 15:06:23.208748 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:23.208707 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d3c96496-afe7-42a4-9a0a-bb1186f412fa" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.51:9000/health/started\": dial tcp 10.134.0.51:9000: connect: connection refused" Apr 20 15:06:24.208678 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:24.208625 2574 prober.go:120] "Probe failed" 
probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d3c96496-afe7-42a4-9a0a-bb1186f412fa" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.51:9000/health/started\": dial tcp 10.134.0.51:9000: connect: connection refused" Apr 20 15:06:25.208646 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:25.208594 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d3c96496-afe7-42a4-9a0a-bb1186f412fa" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.51:9000/health/started\": dial tcp 10.134.0.51:9000: connect: connection refused" Apr 20 15:06:26.208798 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:26.208739 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d3c96496-afe7-42a4-9a0a-bb1186f412fa" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.51:9000/health/started\": dial tcp 10.134.0.51:9000: connect: connection refused" Apr 20 15:06:27.208932 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:27.208875 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d3c96496-afe7-42a4-9a0a-bb1186f412fa" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.51:9000/health/started\": dial tcp 10.134.0.51:9000: connect: connection refused" Apr 20 15:06:28.208579 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:28.208525 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d3c96496-afe7-42a4-9a0a-bb1186f412fa" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.51:9000/health/started\": dial tcp 10.134.0.51:9000: connect: connection refused" Apr 20 15:06:29.208342 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:29.208293 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" 
podUID="d3c96496-afe7-42a4-9a0a-bb1186f412fa" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.51:9000/health/started\": dial tcp 10.134.0.51:9000: connect: connection refused" Apr 20 15:06:30.207888 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:30.207838 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="keycloak-system/maas-keycloak-0" Apr 20 15:06:30.208227 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:30.208191 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d3c96496-afe7-42a4-9a0a-bb1186f412fa" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.51:9000/health/started\": dial tcp 10.134.0.51:9000: connect: connection refused" Apr 20 15:06:31.208486 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:31.208442 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d3c96496-afe7-42a4-9a0a-bb1186f412fa" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.51:9000/health/started\": dial tcp 10.134.0.51:9000: connect: connection refused" Apr 20 15:06:32.208284 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:32.208234 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d3c96496-afe7-42a4-9a0a-bb1186f412fa" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.51:9000/health/started\": dial tcp 10.134.0.51:9000: connect: connection refused" Apr 20 15:06:33.208253 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:33.208206 2574 prober.go:120] "Probe failed" probeType="Startup" pod="keycloak-system/maas-keycloak-0" podUID="d3c96496-afe7-42a4-9a0a-bb1186f412fa" containerName="keycloak" probeResult="failure" output="Get \"http://10.134.0.51:9000/health/started\": dial tcp 10.134.0.51:9000: connect: connection refused" Apr 20 15:06:34.339220 ip-10-0-130-249 kubenswrapper[2574]: I0420 
15:06:34.339172 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="keycloak-system/maas-keycloak-0" Apr 20 15:06:34.361158 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:34.361118 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="keycloak-system/maas-keycloak-0" podUID="d3c96496-afe7-42a4-9a0a-bb1186f412fa" containerName="keycloak" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 15:06:44.345805 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:44.345760 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="keycloak-system/maas-keycloak-0" Apr 20 15:06:55.032266 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:55.032227 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5b565bc487-8rnzz"] Apr 20 15:06:55.035846 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:55.035825 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5b565bc487-8rnzz" Apr 20 15:06:55.038314 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:55.038291 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"authorino-oidc-ca\"" Apr 20 15:06:55.042123 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:55.042102 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5b565bc487-8rnzz"] Apr 20 15:06:55.143528 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:55.143496 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f4414530-9f52-42db-abd1-c960b1592386-tls-cert\") pod \"authorino-5b565bc487-8rnzz\" (UID: \"f4414530-9f52-42db-abd1-c960b1592386\") " pod="kuadrant-system/authorino-5b565bc487-8rnzz" Apr 20 15:06:55.143714 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:55.143559 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/f4414530-9f52-42db-abd1-c960b1592386-oidc-ca\") pod \"authorino-5b565bc487-8rnzz\" (UID: \"f4414530-9f52-42db-abd1-c960b1592386\") " pod="kuadrant-system/authorino-5b565bc487-8rnzz" Apr 20 15:06:55.143714 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:55.143646 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mtz8\" (UniqueName: \"kubernetes.io/projected/f4414530-9f52-42db-abd1-c960b1592386-kube-api-access-7mtz8\") pod \"authorino-5b565bc487-8rnzz\" (UID: \"f4414530-9f52-42db-abd1-c960b1592386\") " pod="kuadrant-system/authorino-5b565bc487-8rnzz" Apr 20 15:06:55.244786 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:55.244748 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/f4414530-9f52-42db-abd1-c960b1592386-oidc-ca\") pod \"authorino-5b565bc487-8rnzz\" (UID: \"f4414530-9f52-42db-abd1-c960b1592386\") " pod="kuadrant-system/authorino-5b565bc487-8rnzz" Apr 20 15:06:55.244951 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:55.244809 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mtz8\" (UniqueName: \"kubernetes.io/projected/f4414530-9f52-42db-abd1-c960b1592386-kube-api-access-7mtz8\") pod \"authorino-5b565bc487-8rnzz\" (UID: \"f4414530-9f52-42db-abd1-c960b1592386\") " pod="kuadrant-system/authorino-5b565bc487-8rnzz" Apr 20 15:06:55.244951 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:55.244835 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f4414530-9f52-42db-abd1-c960b1592386-tls-cert\") pod \"authorino-5b565bc487-8rnzz\" (UID: \"f4414530-9f52-42db-abd1-c960b1592386\") " pod="kuadrant-system/authorino-5b565bc487-8rnzz" Apr 20 15:06:55.245450 ip-10-0-130-249 
kubenswrapper[2574]: I0420 15:06:55.245429 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/f4414530-9f52-42db-abd1-c960b1592386-oidc-ca\") pod \"authorino-5b565bc487-8rnzz\" (UID: \"f4414530-9f52-42db-abd1-c960b1592386\") " pod="kuadrant-system/authorino-5b565bc487-8rnzz" Apr 20 15:06:55.247248 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:55.247231 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f4414530-9f52-42db-abd1-c960b1592386-tls-cert\") pod \"authorino-5b565bc487-8rnzz\" (UID: \"f4414530-9f52-42db-abd1-c960b1592386\") " pod="kuadrant-system/authorino-5b565bc487-8rnzz" Apr 20 15:06:55.252197 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:55.252179 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mtz8\" (UniqueName: \"kubernetes.io/projected/f4414530-9f52-42db-abd1-c960b1592386-kube-api-access-7mtz8\") pod \"authorino-5b565bc487-8rnzz\" (UID: \"f4414530-9f52-42db-abd1-c960b1592386\") " pod="kuadrant-system/authorino-5b565bc487-8rnzz" Apr 20 15:06:55.345412 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:55.345356 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5b565bc487-8rnzz" Apr 20 15:06:55.464584 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:55.464558 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5b565bc487-8rnzz"] Apr 20 15:06:55.466020 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:06:55.465995 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4414530_9f52_42db_abd1_c960b1592386.slice/crio-956c35037fc7498c2581224130a6a501d38136227bf1b74ddbc33aede8ce79a7 WatchSource:0}: Error finding container 956c35037fc7498c2581224130a6a501d38136227bf1b74ddbc33aede8ce79a7: Status 404 returned error can't find the container with id 956c35037fc7498c2581224130a6a501d38136227bf1b74ddbc33aede8ce79a7 Apr 20 15:06:56.008043 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:56.007995 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5b565bc487-8rnzz" event={"ID":"f4414530-9f52-42db-abd1-c960b1592386","Type":"ContainerStarted","Data":"6bafa95ffc41e1685fa6f241e4920eeef83e9703983e43ca04e0da9fe4a88bf2"} Apr 20 15:06:56.008043 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:56.008046 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5b565bc487-8rnzz" event={"ID":"f4414530-9f52-42db-abd1-c960b1592386","Type":"ContainerStarted","Data":"956c35037fc7498c2581224130a6a501d38136227bf1b74ddbc33aede8ce79a7"} Apr 20 15:06:56.025316 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:56.025262 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-5b565bc487-8rnzz" podStartSLOduration=0.668394891 podStartE2EDuration="1.025244958s" podCreationTimestamp="2026-04-20 15:06:55 +0000 UTC" firstStartedPulling="2026-04-20 15:06:55.467193917 +0000 UTC m=+694.540469524" lastFinishedPulling="2026-04-20 15:06:55.824043987 +0000 UTC m=+694.897319591" 
observedRunningTime="2026-04-20 15:06:56.02281296 +0000 UTC m=+695.096088614" watchObservedRunningTime="2026-04-20 15:06:56.025244958 +0000 UTC m=+695.098520584" Apr 20 15:06:56.051691 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:56.051664 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-78f4797949-f64b4"] Apr 20 15:06:56.052412 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:56.052355 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-78f4797949-f64b4" podUID="830492f2-f456-4864-8b6b-cd72fc531958" containerName="authorino" containerID="cri-o://b90bce06484b31f9208e9f6e0ce66eaa41e7a4d261fd8f7353575db972c66556" gracePeriod=30 Apr 20 15:06:56.376085 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:56.376060 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-78f4797949-f64b4" Apr 20 15:06:56.456360 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:56.456259 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/830492f2-f456-4864-8b6b-cd72fc531958-tls-cert\") pod \"830492f2-f456-4864-8b6b-cd72fc531958\" (UID: \"830492f2-f456-4864-8b6b-cd72fc531958\") " Apr 20 15:06:56.456360 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:56.456353 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-47fgd\" (UniqueName: \"kubernetes.io/projected/830492f2-f456-4864-8b6b-cd72fc531958-kube-api-access-47fgd\") pod \"830492f2-f456-4864-8b6b-cd72fc531958\" (UID: \"830492f2-f456-4864-8b6b-cd72fc531958\") " Apr 20 15:06:56.459480 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:56.459441 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/830492f2-f456-4864-8b6b-cd72fc531958-kube-api-access-47fgd" (OuterVolumeSpecName: "kube-api-access-47fgd") pod 
"830492f2-f456-4864-8b6b-cd72fc531958" (UID: "830492f2-f456-4864-8b6b-cd72fc531958"). InnerVolumeSpecName "kube-api-access-47fgd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:06:56.467920 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:56.467895 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/830492f2-f456-4864-8b6b-cd72fc531958-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "830492f2-f456-4864-8b6b-cd72fc531958" (UID: "830492f2-f456-4864-8b6b-cd72fc531958"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 15:06:56.558073 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:56.558020 2574 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/830492f2-f456-4864-8b6b-cd72fc531958-tls-cert\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:06:56.558073 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:56.558066 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-47fgd\" (UniqueName: \"kubernetes.io/projected/830492f2-f456-4864-8b6b-cd72fc531958-kube-api-access-47fgd\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\"" Apr 20 15:06:57.013123 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:57.013087 2574 generic.go:358] "Generic (PLEG): container finished" podID="830492f2-f456-4864-8b6b-cd72fc531958" containerID="b90bce06484b31f9208e9f6e0ce66eaa41e7a4d261fd8f7353575db972c66556" exitCode=0 Apr 20 15:06:57.013317 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:57.013142 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-78f4797949-f64b4" Apr 20 15:06:57.013317 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:57.013179 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-78f4797949-f64b4" event={"ID":"830492f2-f456-4864-8b6b-cd72fc531958","Type":"ContainerDied","Data":"b90bce06484b31f9208e9f6e0ce66eaa41e7a4d261fd8f7353575db972c66556"} Apr 20 15:06:57.013317 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:57.013225 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-78f4797949-f64b4" event={"ID":"830492f2-f456-4864-8b6b-cd72fc531958","Type":"ContainerDied","Data":"9cb3addaccdb03ebc059c34e87d5006aca80a0997dec27ff8847295c22355e5c"} Apr 20 15:06:57.013317 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:57.013248 2574 scope.go:117] "RemoveContainer" containerID="b90bce06484b31f9208e9f6e0ce66eaa41e7a4d261fd8f7353575db972c66556" Apr 20 15:06:57.022431 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:57.022414 2574 scope.go:117] "RemoveContainer" containerID="b90bce06484b31f9208e9f6e0ce66eaa41e7a4d261fd8f7353575db972c66556" Apr 20 15:06:57.022689 ip-10-0-130-249 kubenswrapper[2574]: E0420 15:06:57.022658 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b90bce06484b31f9208e9f6e0ce66eaa41e7a4d261fd8f7353575db972c66556\": container with ID starting with b90bce06484b31f9208e9f6e0ce66eaa41e7a4d261fd8f7353575db972c66556 not found: ID does not exist" containerID="b90bce06484b31f9208e9f6e0ce66eaa41e7a4d261fd8f7353575db972c66556" Apr 20 15:06:57.022733 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:57.022691 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b90bce06484b31f9208e9f6e0ce66eaa41e7a4d261fd8f7353575db972c66556"} err="failed to get container status \"b90bce06484b31f9208e9f6e0ce66eaa41e7a4d261fd8f7353575db972c66556\": rpc error: code = 
NotFound desc = could not find container \"b90bce06484b31f9208e9f6e0ce66eaa41e7a4d261fd8f7353575db972c66556\": container with ID starting with b90bce06484b31f9208e9f6e0ce66eaa41e7a4d261fd8f7353575db972c66556 not found: ID does not exist" Apr 20 15:06:57.041570 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:57.041547 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-78f4797949-f64b4"] Apr 20 15:06:57.048170 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:57.048150 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-78f4797949-f64b4"] Apr 20 15:06:57.453447 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:06:57.453413 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="830492f2-f456-4864-8b6b-cd72fc531958" path="/var/lib/kubelet/pods/830492f2-f456-4864-8b6b-cd72fc531958/volumes" Apr 20 15:07:10.136108 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.136068 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt"] Apr 20 15:07:10.136525 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.136507 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="830492f2-f456-4864-8b6b-cd72fc531958" containerName="authorino" Apr 20 15:07:10.136599 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.136527 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="830492f2-f456-4864-8b6b-cd72fc531958" containerName="authorino" Apr 20 15:07:10.136652 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.136644 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="830492f2-f456-4864-8b6b-cd72fc531958" containerName="authorino" Apr 20 15:07:10.141307 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.141287 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.145207 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.145184 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 20 15:07:10.145343 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.145229 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 15:07:10.145343 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.145186 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 15:07:10.145343 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.145297 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-z9d6w\"" Apr 20 15:07:10.150328 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.150305 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt"] Apr 20 15:07:10.266952 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.266876 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85q7w\" (UniqueName: \"kubernetes.io/projected/547e69ed-99a1-4db4-ad66-258166e5f48c-kube-api-access-85q7w\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt\" (UID: \"547e69ed-99a1-4db4-ad66-258166e5f48c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.266952 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.266934 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/547e69ed-99a1-4db4-ad66-258166e5f48c-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt\" (UID: 
\"547e69ed-99a1-4db4-ad66-258166e5f48c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.267155 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.266966 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/547e69ed-99a1-4db4-ad66-258166e5f48c-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt\" (UID: \"547e69ed-99a1-4db4-ad66-258166e5f48c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.267155 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.267003 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/547e69ed-99a1-4db4-ad66-258166e5f48c-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt\" (UID: \"547e69ed-99a1-4db4-ad66-258166e5f48c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.267155 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.267075 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/547e69ed-99a1-4db4-ad66-258166e5f48c-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt\" (UID: \"547e69ed-99a1-4db4-ad66-258166e5f48c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.267155 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.267113 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/547e69ed-99a1-4db4-ad66-258166e5f48c-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt\" (UID: \"547e69ed-99a1-4db4-ad66-258166e5f48c\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.367452 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.367424 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/547e69ed-99a1-4db4-ad66-258166e5f48c-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt\" (UID: \"547e69ed-99a1-4db4-ad66-258166e5f48c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.367604 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.367457 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/547e69ed-99a1-4db4-ad66-258166e5f48c-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt\" (UID: \"547e69ed-99a1-4db4-ad66-258166e5f48c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.367604 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.367479 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/547e69ed-99a1-4db4-ad66-258166e5f48c-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt\" (UID: \"547e69ed-99a1-4db4-ad66-258166e5f48c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.367604 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.367541 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-85q7w\" (UniqueName: \"kubernetes.io/projected/547e69ed-99a1-4db4-ad66-258166e5f48c-kube-api-access-85q7w\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt\" (UID: \"547e69ed-99a1-4db4-ad66-258166e5f48c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.367604 
ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.367590 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/547e69ed-99a1-4db4-ad66-258166e5f48c-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt\" (UID: \"547e69ed-99a1-4db4-ad66-258166e5f48c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.367742 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.367616 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/547e69ed-99a1-4db4-ad66-258166e5f48c-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt\" (UID: \"547e69ed-99a1-4db4-ad66-258166e5f48c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.367952 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.367923 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/547e69ed-99a1-4db4-ad66-258166e5f48c-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt\" (UID: \"547e69ed-99a1-4db4-ad66-258166e5f48c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.368066 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.367958 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/547e69ed-99a1-4db4-ad66-258166e5f48c-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt\" (UID: \"547e69ed-99a1-4db4-ad66-258166e5f48c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.368066 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.368018 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"home\" (UniqueName: \"kubernetes.io/empty-dir/547e69ed-99a1-4db4-ad66-258166e5f48c-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt\" (UID: \"547e69ed-99a1-4db4-ad66-258166e5f48c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.369720 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.369698 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/547e69ed-99a1-4db4-ad66-258166e5f48c-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt\" (UID: \"547e69ed-99a1-4db4-ad66-258166e5f48c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.369964 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.369947 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/547e69ed-99a1-4db4-ad66-258166e5f48c-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt\" (UID: \"547e69ed-99a1-4db4-ad66-258166e5f48c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.375713 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.375694 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-85q7w\" (UniqueName: \"kubernetes.io/projected/547e69ed-99a1-4db4-ad66-258166e5f48c-kube-api-access-85q7w\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt\" (UID: \"547e69ed-99a1-4db4-ad66-258166e5f48c\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.452305 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.452239 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:10.580758 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:10.580717 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt"] Apr 20 15:07:10.581494 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:07:10.581468 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod547e69ed_99a1_4db4_ad66_258166e5f48c.slice/crio-28edf47305a754ae6ca19f423e23fda2d4d3ab32f6c5cedc91babe981ceecccc WatchSource:0}: Error finding container 28edf47305a754ae6ca19f423e23fda2d4d3ab32f6c5cedc91babe981ceecccc: Status 404 returned error can't find the container with id 28edf47305a754ae6ca19f423e23fda2d4d3ab32f6c5cedc91babe981ceecccc Apr 20 15:07:11.062430 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:11.062392 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" event={"ID":"547e69ed-99a1-4db4-ad66-258166e5f48c","Type":"ContainerStarted","Data":"28edf47305a754ae6ca19f423e23fda2d4d3ab32f6c5cedc91babe981ceecccc"} Apr 20 15:07:19.097306 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:19.097259 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" event={"ID":"547e69ed-99a1-4db4-ad66-258166e5f48c","Type":"ContainerStarted","Data":"2746a8fa405ec6a7891396634e4f4e7ff352add60325ac4175b069530b3e6fd4"} Apr 20 15:07:24.120003 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:24.119970 2574 generic.go:358] "Generic (PLEG): container finished" podID="547e69ed-99a1-4db4-ad66-258166e5f48c" containerID="2746a8fa405ec6a7891396634e4f4e7ff352add60325ac4175b069530b3e6fd4" exitCode=0 Apr 20 15:07:24.120387 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:24.120028 2574 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" event={"ID":"547e69ed-99a1-4db4-ad66-258166e5f48c","Type":"ContainerDied","Data":"2746a8fa405ec6a7891396634e4f4e7ff352add60325ac4175b069530b3e6fd4"} Apr 20 15:07:28.139206 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:28.139173 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" event={"ID":"547e69ed-99a1-4db4-ad66-258166e5f48c","Type":"ContainerStarted","Data":"129d595c620fa7e12a6d4c9c243b6a478b190715162595e12a6d2c6ec3ccd8a4"} Apr 20 15:07:28.139630 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:28.139407 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:28.161249 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:28.161205 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" podStartSLOduration=0.977468066 podStartE2EDuration="18.16119081s" podCreationTimestamp="2026-04-20 15:07:10 +0000 UTC" firstStartedPulling="2026-04-20 15:07:10.583430111 +0000 UTC m=+709.656705734" lastFinishedPulling="2026-04-20 15:07:27.767152875 +0000 UTC m=+726.840428478" observedRunningTime="2026-04-20 15:07:28.157968583 +0000 UTC m=+727.231244230" watchObservedRunningTime="2026-04-20 15:07:28.16119081 +0000 UTC m=+727.234466436" Apr 20 15:07:39.155313 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:39.155284 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt" Apr 20 15:07:41.340540 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.340506 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc"] Apr 20 15:07:41.503306 ip-10-0-130-249 kubenswrapper[2574]: I0420 
15:07:41.503268 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc"] Apr 20 15:07:41.503488 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.503343 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.506949 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.506903 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\"" Apr 20 15:07:41.537271 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.537238 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5c19070-be08-4a9c-81b4-b662dc63f84d-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc\" (UID: \"e5c19070-be08-4a9c-81b4-b662dc63f84d\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.537402 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.537291 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djcfw\" (UniqueName: \"kubernetes.io/projected/e5c19070-be08-4a9c-81b4-b662dc63f84d-kube-api-access-djcfw\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc\" (UID: \"e5c19070-be08-4a9c-81b4-b662dc63f84d\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.537402 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.537343 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5c19070-be08-4a9c-81b4-b662dc63f84d-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc\" (UID: 
\"e5c19070-be08-4a9c-81b4-b662dc63f84d\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.537500 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.537414 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5c19070-be08-4a9c-81b4-b662dc63f84d-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc\" (UID: \"e5c19070-be08-4a9c-81b4-b662dc63f84d\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.537500 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.537451 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5c19070-be08-4a9c-81b4-b662dc63f84d-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc\" (UID: \"e5c19070-be08-4a9c-81b4-b662dc63f84d\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.537500 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.537475 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5c19070-be08-4a9c-81b4-b662dc63f84d-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc\" (UID: \"e5c19070-be08-4a9c-81b4-b662dc63f84d\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.638230 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.638145 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5c19070-be08-4a9c-81b4-b662dc63f84d-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc\" (UID: \"e5c19070-be08-4a9c-81b4-b662dc63f84d\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.638230 ip-10-0-130-249 
kubenswrapper[2574]: I0420 15:07:41.638201 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5c19070-be08-4a9c-81b4-b662dc63f84d-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc\" (UID: \"e5c19070-be08-4a9c-81b4-b662dc63f84d\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.638465 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.638249 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5c19070-be08-4a9c-81b4-b662dc63f84d-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc\" (UID: \"e5c19070-be08-4a9c-81b4-b662dc63f84d\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.638465 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.638272 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djcfw\" (UniqueName: \"kubernetes.io/projected/e5c19070-be08-4a9c-81b4-b662dc63f84d-kube-api-access-djcfw\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc\" (UID: \"e5c19070-be08-4a9c-81b4-b662dc63f84d\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.638465 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.638294 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5c19070-be08-4a9c-81b4-b662dc63f84d-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc\" (UID: \"e5c19070-be08-4a9c-81b4-b662dc63f84d\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.638465 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.638325 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" 
(UniqueName: \"kubernetes.io/empty-dir/e5c19070-be08-4a9c-81b4-b662dc63f84d-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc\" (UID: \"e5c19070-be08-4a9c-81b4-b662dc63f84d\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.638685 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.638658 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5c19070-be08-4a9c-81b4-b662dc63f84d-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc\" (UID: \"e5c19070-be08-4a9c-81b4-b662dc63f84d\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.638738 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.638719 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/e5c19070-be08-4a9c-81b4-b662dc63f84d-home\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc\" (UID: \"e5c19070-be08-4a9c-81b4-b662dc63f84d\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.638783 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.638741 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/e5c19070-be08-4a9c-81b4-b662dc63f84d-model-cache\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc\" (UID: \"e5c19070-be08-4a9c-81b4-b662dc63f84d\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.640481 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.640453 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/e5c19070-be08-4a9c-81b4-b662dc63f84d-dshm\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc\" (UID: \"e5c19070-be08-4a9c-81b4-b662dc63f84d\") " 
pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.640801 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.640782 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e5c19070-be08-4a9c-81b4-b662dc63f84d-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc\" (UID: \"e5c19070-be08-4a9c-81b4-b662dc63f84d\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.645608 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.645589 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djcfw\" (UniqueName: \"kubernetes.io/projected/e5c19070-be08-4a9c-81b4-b662dc63f84d-kube-api-access-djcfw\") pod \"premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc\" (UID: \"e5c19070-be08-4a9c-81b4-b662dc63f84d\") " pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.813771 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.813735 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:41.938295 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:41.938270 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc"] Apr 20 15:07:41.939948 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:07:41.939921 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5c19070_be08_4a9c_81b4_b662dc63f84d.slice/crio-3c4d2b51f6807d7e5acb95ca5328729bc6d13d1a175bca372fb7b5e1e0046ffd WatchSource:0}: Error finding container 3c4d2b51f6807d7e5acb95ca5328729bc6d13d1a175bca372fb7b5e1e0046ffd: Status 404 returned error can't find the container with id 3c4d2b51f6807d7e5acb95ca5328729bc6d13d1a175bca372fb7b5e1e0046ffd Apr 20 15:07:42.200692 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:42.200600 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" event={"ID":"e5c19070-be08-4a9c-81b4-b662dc63f84d","Type":"ContainerStarted","Data":"3203cfd82f234c302931e3fb7793780ca7f0107dc1f77995817757ddd1283411"} Apr 20 15:07:42.200692 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:42.200648 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" event={"ID":"e5c19070-be08-4a9c-81b4-b662dc63f84d","Type":"ContainerStarted","Data":"3c4d2b51f6807d7e5acb95ca5328729bc6d13d1a175bca372fb7b5e1e0046ffd"} Apr 20 15:07:50.233222 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:50.233188 2574 generic.go:358] "Generic (PLEG): container finished" podID="e5c19070-be08-4a9c-81b4-b662dc63f84d" containerID="3203cfd82f234c302931e3fb7793780ca7f0107dc1f77995817757ddd1283411" exitCode=0 Apr 20 15:07:50.233569 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:50.233264 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" event={"ID":"e5c19070-be08-4a9c-81b4-b662dc63f84d","Type":"ContainerDied","Data":"3203cfd82f234c302931e3fb7793780ca7f0107dc1f77995817757ddd1283411"} Apr 20 15:07:51.239360 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:51.239329 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" event={"ID":"e5c19070-be08-4a9c-81b4-b662dc63f84d","Type":"ContainerStarted","Data":"1b6585b420e9ab7860c3db9e8a12e3757fb0807c24330ec515daa9c24d312e43"} Apr 20 15:07:51.239844 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:51.239553 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:07:51.257430 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:07:51.257362 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" podStartSLOduration=10.02798374 podStartE2EDuration="10.257351064s" podCreationTimestamp="2026-04-20 15:07:41 +0000 UTC" firstStartedPulling="2026-04-20 15:07:50.23388939 +0000 UTC m=+749.307164996" lastFinishedPulling="2026-04-20 15:07:50.463256714 +0000 UTC m=+749.536532320" observedRunningTime="2026-04-20 15:07:51.256159067 +0000 UTC m=+750.329434693" watchObservedRunningTime="2026-04-20 15:07:51.257351064 +0000 UTC m=+750.330626692" Apr 20 15:08:02.256741 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:02.256700 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc" Apr 20 15:08:03.661696 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.661663 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m"] Apr 20 15:08:03.704812 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.704783 2574 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m"] Apr 20 15:08:03.704955 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.704893 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:03.710636 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.710615 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 20 15:08:03.833472 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.833445 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bb87c3d7-fc9b-452c-8b14-d4e58dbc3139-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m\" (UID: \"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:03.833625 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.833489 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb87c3d7-fc9b-452c-8b14-d4e58dbc3139-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m\" (UID: \"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:03.833625 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.833526 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk7zd\" (UniqueName: \"kubernetes.io/projected/bb87c3d7-fc9b-452c-8b14-d4e58dbc3139-kube-api-access-hk7zd\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m\" (UID: \"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:03.833625 ip-10-0-130-249 
kubenswrapper[2574]: I0420 15:08:03.833600 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bb87c3d7-fc9b-452c-8b14-d4e58dbc3139-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m\" (UID: \"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:03.833831 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.833680 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bb87c3d7-fc9b-452c-8b14-d4e58dbc3139-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m\" (UID: \"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:03.833831 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.833747 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bb87c3d7-fc9b-452c-8b14-d4e58dbc3139-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m\" (UID: \"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:03.934831 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.934749 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bb87c3d7-fc9b-452c-8b14-d4e58dbc3139-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m\" (UID: \"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:03.934831 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.934784 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/bb87c3d7-fc9b-452c-8b14-d4e58dbc3139-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m\" (UID: \"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:03.934831 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.934807 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hk7zd\" (UniqueName: \"kubernetes.io/projected/bb87c3d7-fc9b-452c-8b14-d4e58dbc3139-kube-api-access-hk7zd\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m\" (UID: \"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:03.935091 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.934846 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/bb87c3d7-fc9b-452c-8b14-d4e58dbc3139-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m\" (UID: \"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:03.935091 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.934876 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bb87c3d7-fc9b-452c-8b14-d4e58dbc3139-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m\" (UID: \"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:03.935091 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.934906 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bb87c3d7-fc9b-452c-8b14-d4e58dbc3139-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m\" (UID: \"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139\") " 
pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:03.935253 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.935199 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/bb87c3d7-fc9b-452c-8b14-d4e58dbc3139-model-cache\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m\" (UID: \"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:03.935316 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.935267 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb87c3d7-fc9b-452c-8b14-d4e58dbc3139-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m\" (UID: \"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:03.935411 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.935328 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/bb87c3d7-fc9b-452c-8b14-d4e58dbc3139-home\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m\" (UID: \"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:03.937115 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.937097 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/bb87c3d7-fc9b-452c-8b14-d4e58dbc3139-dshm\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m\" (UID: \"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:03.937357 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.937342 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/bb87c3d7-fc9b-452c-8b14-d4e58dbc3139-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m\" (UID: \"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:03.942860 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:03.942841 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk7zd\" (UniqueName: \"kubernetes.io/projected/bb87c3d7-fc9b-452c-8b14-d4e58dbc3139-kube-api-access-hk7zd\") pod \"facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m\" (UID: \"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139\") " pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:04.015541 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:04.015512 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:04.145252 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:04.145224 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m"] Apr 20 15:08:04.146401 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:08:04.146338 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb87c3d7_fc9b_452c_8b14_d4e58dbc3139.slice/crio-c327c92339f8eed9ac8e2a4229cbd31b0f3ccf98a66f5056ebf0b683eabb66ef WatchSource:0}: Error finding container c327c92339f8eed9ac8e2a4229cbd31b0f3ccf98a66f5056ebf0b683eabb66ef: Status 404 returned error can't find the container with id c327c92339f8eed9ac8e2a4229cbd31b0f3ccf98a66f5056ebf0b683eabb66ef Apr 20 15:08:04.289534 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:04.289488 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" 
event={"ID":"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139","Type":"ContainerStarted","Data":"3777e263007617863816dc240c6c8f836a78c134b3e2d614059064a28879ffcd"} Apr 20 15:08:04.289534 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:04.289533 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" event={"ID":"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139","Type":"ContainerStarted","Data":"c327c92339f8eed9ac8e2a4229cbd31b0f3ccf98a66f5056ebf0b683eabb66ef"} Apr 20 15:08:10.315340 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:10.315301 2574 generic.go:358] "Generic (PLEG): container finished" podID="bb87c3d7-fc9b-452c-8b14-d4e58dbc3139" containerID="3777e263007617863816dc240c6c8f836a78c134b3e2d614059064a28879ffcd" exitCode=0 Apr 20 15:08:10.315754 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:10.315388 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" event={"ID":"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139","Type":"ContainerDied","Data":"3777e263007617863816dc240c6c8f836a78c134b3e2d614059064a28879ffcd"} Apr 20 15:08:11.320476 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:11.320441 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" event={"ID":"bb87c3d7-fc9b-452c-8b14-d4e58dbc3139","Type":"ContainerStarted","Data":"22239e8eeadbc6e22a1b59e34829a982d1b7aa078f4874f37a73036f1abb4072"} Apr 20 15:08:11.320854 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:11.320656 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:11.338782 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:11.338738 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" podStartSLOduration=8.090845724 podStartE2EDuration="8.338725445s" 
podCreationTimestamp="2026-04-20 15:08:03 +0000 UTC" firstStartedPulling="2026-04-20 15:08:10.316017841 +0000 UTC m=+769.389293444" lastFinishedPulling="2026-04-20 15:08:10.563897559 +0000 UTC m=+769.637173165" observedRunningTime="2026-04-20 15:08:11.336628841 +0000 UTC m=+770.409904467" watchObservedRunningTime="2026-04-20 15:08:11.338725445 +0000 UTC m=+770.412001136" Apr 20 15:08:22.336993 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:22.336961 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m" Apr 20 15:08:23.380165 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:23.380133 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-54ffc9547d-k2w4b"] Apr 20 15:08:23.383592 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:23.383575 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-54ffc9547d-k2w4b" Apr 20 15:08:23.389858 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:23.389829 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-54ffc9547d-k2w4b"] Apr 20 15:08:23.512348 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:23.512314 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmftr\" (UniqueName: \"kubernetes.io/projected/2f15cc1a-7858-4e6f-ae57-1deacf45a422-kube-api-access-vmftr\") pod \"authorino-54ffc9547d-k2w4b\" (UID: \"2f15cc1a-7858-4e6f-ae57-1deacf45a422\") " pod="kuadrant-system/authorino-54ffc9547d-k2w4b" Apr 20 15:08:23.512348 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:23.512351 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/2f15cc1a-7858-4e6f-ae57-1deacf45a422-tls-cert\") pod \"authorino-54ffc9547d-k2w4b\" (UID: \"2f15cc1a-7858-4e6f-ae57-1deacf45a422\") " 
pod="kuadrant-system/authorino-54ffc9547d-k2w4b" Apr 20 15:08:23.512563 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:23.512391 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/2f15cc1a-7858-4e6f-ae57-1deacf45a422-oidc-ca\") pod \"authorino-54ffc9547d-k2w4b\" (UID: \"2f15cc1a-7858-4e6f-ae57-1deacf45a422\") " pod="kuadrant-system/authorino-54ffc9547d-k2w4b" Apr 20 15:08:23.613588 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:23.613557 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmftr\" (UniqueName: \"kubernetes.io/projected/2f15cc1a-7858-4e6f-ae57-1deacf45a422-kube-api-access-vmftr\") pod \"authorino-54ffc9547d-k2w4b\" (UID: \"2f15cc1a-7858-4e6f-ae57-1deacf45a422\") " pod="kuadrant-system/authorino-54ffc9547d-k2w4b" Apr 20 15:08:23.613588 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:23.613592 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/2f15cc1a-7858-4e6f-ae57-1deacf45a422-tls-cert\") pod \"authorino-54ffc9547d-k2w4b\" (UID: \"2f15cc1a-7858-4e6f-ae57-1deacf45a422\") " pod="kuadrant-system/authorino-54ffc9547d-k2w4b" Apr 20 15:08:23.613812 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:23.613611 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/2f15cc1a-7858-4e6f-ae57-1deacf45a422-oidc-ca\") pod \"authorino-54ffc9547d-k2w4b\" (UID: \"2f15cc1a-7858-4e6f-ae57-1deacf45a422\") " pod="kuadrant-system/authorino-54ffc9547d-k2w4b" Apr 20 15:08:23.614137 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:23.614112 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/2f15cc1a-7858-4e6f-ae57-1deacf45a422-oidc-ca\") pod \"authorino-54ffc9547d-k2w4b\" (UID: 
\"2f15cc1a-7858-4e6f-ae57-1deacf45a422\") " pod="kuadrant-system/authorino-54ffc9547d-k2w4b" Apr 20 15:08:23.615878 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:23.615852 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/2f15cc1a-7858-4e6f-ae57-1deacf45a422-tls-cert\") pod \"authorino-54ffc9547d-k2w4b\" (UID: \"2f15cc1a-7858-4e6f-ae57-1deacf45a422\") " pod="kuadrant-system/authorino-54ffc9547d-k2w4b" Apr 20 15:08:23.620992 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:23.620971 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmftr\" (UniqueName: \"kubernetes.io/projected/2f15cc1a-7858-4e6f-ae57-1deacf45a422-kube-api-access-vmftr\") pod \"authorino-54ffc9547d-k2w4b\" (UID: \"2f15cc1a-7858-4e6f-ae57-1deacf45a422\") " pod="kuadrant-system/authorino-54ffc9547d-k2w4b" Apr 20 15:08:23.693905 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:23.693833 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-54ffc9547d-k2w4b" Apr 20 15:08:24.019993 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:24.019971 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-54ffc9547d-k2w4b"] Apr 20 15:08:24.021700 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:08:24.021673 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f15cc1a_7858_4e6f_ae57_1deacf45a422.slice/crio-6f57d32ab28ffa36e1d20f75b67e4a4120088ac66146b321a186c441e93a36eb WatchSource:0}: Error finding container 6f57d32ab28ffa36e1d20f75b67e4a4120088ac66146b321a186c441e93a36eb: Status 404 returned error can't find the container with id 6f57d32ab28ffa36e1d20f75b67e4a4120088ac66146b321a186c441e93a36eb Apr 20 15:08:24.372537 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:24.372497 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-54ffc9547d-k2w4b" event={"ID":"2f15cc1a-7858-4e6f-ae57-1deacf45a422","Type":"ContainerStarted","Data":"6f57d32ab28ffa36e1d20f75b67e4a4120088ac66146b321a186c441e93a36eb"} Apr 20 15:08:25.378052 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:25.378018 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-54ffc9547d-k2w4b" event={"ID":"2f15cc1a-7858-4e6f-ae57-1deacf45a422","Type":"ContainerStarted","Data":"c6853fbf7170c597b69f1362a3ca4038c0abe4afdbc367e9615d02e1ad2de72f"} Apr 20 15:08:25.396538 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:25.396483 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-54ffc9547d-k2w4b" podStartSLOduration=1.91914312 podStartE2EDuration="2.396464717s" podCreationTimestamp="2026-04-20 15:08:23 +0000 UTC" firstStartedPulling="2026-04-20 15:08:24.023001588 +0000 UTC m=+783.096277194" lastFinishedPulling="2026-04-20 15:08:24.500323172 +0000 UTC m=+783.573598791" 
observedRunningTime="2026-04-20 15:08:25.393124299 +0000 UTC m=+784.466399925" watchObservedRunningTime="2026-04-20 15:08:25.396464717 +0000 UTC m=+784.469740346" Apr 20 15:08:25.421839 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:25.421802 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5b565bc487-8rnzz"] Apr 20 15:08:25.422076 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:25.422052 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-5b565bc487-8rnzz" podUID="f4414530-9f52-42db-abd1-c960b1592386" containerName="authorino" containerID="cri-o://6bafa95ffc41e1685fa6f241e4920eeef83e9703983e43ca04e0da9fe4a88bf2" gracePeriod=30 Apr 20 15:08:25.671946 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:25.671923 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5b565bc487-8rnzz" Apr 20 15:08:25.733795 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:25.733761 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f4414530-9f52-42db-abd1-c960b1592386-tls-cert\") pod \"f4414530-9f52-42db-abd1-c960b1592386\" (UID: \"f4414530-9f52-42db-abd1-c960b1592386\") " Apr 20 15:08:25.733977 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:25.733810 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mtz8\" (UniqueName: \"kubernetes.io/projected/f4414530-9f52-42db-abd1-c960b1592386-kube-api-access-7mtz8\") pod \"f4414530-9f52-42db-abd1-c960b1592386\" (UID: \"f4414530-9f52-42db-abd1-c960b1592386\") " Apr 20 15:08:25.733977 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:25.733898 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/f4414530-9f52-42db-abd1-c960b1592386-oidc-ca\") pod \"f4414530-9f52-42db-abd1-c960b1592386\" 
(UID: \"f4414530-9f52-42db-abd1-c960b1592386\") "
Apr 20 15:08:25.735926 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:25.735889 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4414530-9f52-42db-abd1-c960b1592386-kube-api-access-7mtz8" (OuterVolumeSpecName: "kube-api-access-7mtz8") pod "f4414530-9f52-42db-abd1-c960b1592386" (UID: "f4414530-9f52-42db-abd1-c960b1592386"). InnerVolumeSpecName "kube-api-access-7mtz8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 15:08:25.738572 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:25.738547 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4414530-9f52-42db-abd1-c960b1592386-oidc-ca" (OuterVolumeSpecName: "oidc-ca") pod "f4414530-9f52-42db-abd1-c960b1592386" (UID: "f4414530-9f52-42db-abd1-c960b1592386"). InnerVolumeSpecName "oidc-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 15:08:25.743268 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:25.743245 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4414530-9f52-42db-abd1-c960b1592386-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "f4414530-9f52-42db-abd1-c960b1592386" (UID: "f4414530-9f52-42db-abd1-c960b1592386"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 15:08:25.835439 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:25.835410 2574 reconciler_common.go:299] "Volume detached for volume \"oidc-ca\" (UniqueName: \"kubernetes.io/configmap/f4414530-9f52-42db-abd1-c960b1592386-oidc-ca\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\""
Apr 20 15:08:25.835439 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:25.835434 2574 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/f4414530-9f52-42db-abd1-c960b1592386-tls-cert\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\""
Apr 20 15:08:25.835439 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:25.835444 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7mtz8\" (UniqueName: \"kubernetes.io/projected/f4414530-9f52-42db-abd1-c960b1592386-kube-api-access-7mtz8\") on node \"ip-10-0-130-249.ec2.internal\" DevicePath \"\""
Apr 20 15:08:26.382956 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:26.382914 2574 generic.go:358] "Generic (PLEG): container finished" podID="f4414530-9f52-42db-abd1-c960b1592386" containerID="6bafa95ffc41e1685fa6f241e4920eeef83e9703983e43ca04e0da9fe4a88bf2" exitCode=0
Apr 20 15:08:26.383460 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:26.382966 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5b565bc487-8rnzz"
Apr 20 15:08:26.383460 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:26.382998 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5b565bc487-8rnzz" event={"ID":"f4414530-9f52-42db-abd1-c960b1592386","Type":"ContainerDied","Data":"6bafa95ffc41e1685fa6f241e4920eeef83e9703983e43ca04e0da9fe4a88bf2"}
Apr 20 15:08:26.383460 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:26.383037 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-5b565bc487-8rnzz" event={"ID":"f4414530-9f52-42db-abd1-c960b1592386","Type":"ContainerDied","Data":"956c35037fc7498c2581224130a6a501d38136227bf1b74ddbc33aede8ce79a7"}
Apr 20 15:08:26.383460 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:26.383056 2574 scope.go:117] "RemoveContainer" containerID="6bafa95ffc41e1685fa6f241e4920eeef83e9703983e43ca04e0da9fe4a88bf2"
Apr 20 15:08:26.392313 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:26.392293 2574 scope.go:117] "RemoveContainer" containerID="6bafa95ffc41e1685fa6f241e4920eeef83e9703983e43ca04e0da9fe4a88bf2"
Apr 20 15:08:26.392609 ip-10-0-130-249 kubenswrapper[2574]: E0420 15:08:26.392591 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bafa95ffc41e1685fa6f241e4920eeef83e9703983e43ca04e0da9fe4a88bf2\": container with ID starting with 6bafa95ffc41e1685fa6f241e4920eeef83e9703983e43ca04e0da9fe4a88bf2 not found: ID does not exist" containerID="6bafa95ffc41e1685fa6f241e4920eeef83e9703983e43ca04e0da9fe4a88bf2"
Apr 20 15:08:26.392658 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:26.392617 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bafa95ffc41e1685fa6f241e4920eeef83e9703983e43ca04e0da9fe4a88bf2"} err="failed to get container status \"6bafa95ffc41e1685fa6f241e4920eeef83e9703983e43ca04e0da9fe4a88bf2\": rpc error: code = NotFound desc = could not find container \"6bafa95ffc41e1685fa6f241e4920eeef83e9703983e43ca04e0da9fe4a88bf2\": container with ID starting with 6bafa95ffc41e1685fa6f241e4920eeef83e9703983e43ca04e0da9fe4a88bf2 not found: ID does not exist"
Apr 20 15:08:26.404297 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:26.404271 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5b565bc487-8rnzz"]
Apr 20 15:08:26.406638 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:26.406621 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-5b565bc487-8rnzz"]
Apr 20 15:08:27.447059 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:08:27.447026 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4414530-9f52-42db-abd1-c960b1592386" path="/var/lib/kubelet/pods/f4414530-9f52-42db-abd1-c960b1592386/volumes"
Apr 20 15:10:22.169735 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:10:22.169701 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4z848_d54d05c4-b074-4189-b1dd-7ff476b824ec/console-operator/1.log"
Apr 20 15:10:22.171947 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:10:22.171922 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4z848_d54d05c4-b074-4189-b1dd-7ff476b824ec/console-operator/1.log"
Apr 20 15:15:22.202753 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:15:22.202721 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4z848_d54d05c4-b074-4189-b1dd-7ff476b824ec/console-operator/1.log"
Apr 20 15:15:22.205643 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:15:22.205620 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4z848_d54d05c4-b074-4189-b1dd-7ff476b824ec/console-operator/1.log"
Apr 20 15:20:22.233998 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:20:22.233971 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4z848_d54d05c4-b074-4189-b1dd-7ff476b824ec/console-operator/1.log"
Apr 20 15:20:22.239551 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:20:22.239526 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4z848_d54d05c4-b074-4189-b1dd-7ff476b824ec/console-operator/1.log"
Apr 20 15:25:22.266992 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:25:22.266961 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4z848_d54d05c4-b074-4189-b1dd-7ff476b824ec/console-operator/1.log"
Apr 20 15:25:22.274360 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:25:22.274336 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4z848_d54d05c4-b074-4189-b1dd-7ff476b824ec/console-operator/1.log"
Apr 20 15:29:13.633503 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:13.633473 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-54ffc9547d-k2w4b_2f15cc1a-7858-4e6f-ae57-1deacf45a422/authorino/0.log"
Apr 20 15:29:17.806413 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:17.806364 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-7dff657567-f472w_5e73bc52-3bcc-4604-a7db-a9a6717eaba2/manager/0.log"
Apr 20 15:29:18.277170 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:18.277096 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-99ff97f7d-zvp7j_a64a35a4-d683-4459-a09a-c300ce5b4faf/manager/0.log"
Apr 20 15:29:19.159610 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:19.159580 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj_c5c2466d-2df4-4d90-9496-8107320cda01/util/0.log"
Apr 20 15:29:19.165541 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:19.165519 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj_c5c2466d-2df4-4d90-9496-8107320cda01/pull/0.log"
Apr 20 15:29:19.171403 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:19.171384 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj_c5c2466d-2df4-4d90-9496-8107320cda01/extract/0.log"
Apr 20 15:29:19.278978 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:19.278953 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt_9be176a2-ed3f-43c0-9883-1dac2ee5912a/util/0.log"
Apr 20 15:29:19.284669 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:19.284647 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt_9be176a2-ed3f-43c0-9883-1dac2ee5912a/pull/0.log"
Apr 20 15:29:19.290327 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:19.290311 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt_9be176a2-ed3f-43c0-9883-1dac2ee5912a/extract/0.log"
Apr 20 15:29:19.397629 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:19.397603 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9_90d7cc5c-aebb-43f8-880a-22f7a4c687dd/extract/0.log"
Apr 20 15:29:19.402632 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:19.402613 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9_90d7cc5c-aebb-43f8-880a-22f7a4c687dd/util/0.log"
Apr 20 15:29:19.408660 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:19.408644 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9_90d7cc5c-aebb-43f8-880a-22f7a4c687dd/pull/0.log"
Apr 20 15:29:19.515180 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:19.515111 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn_5996cc60-3e9b-4037-bd19-59b04fe500a8/util/0.log"
Apr 20 15:29:19.521427 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:19.521409 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn_5996cc60-3e9b-4037-bd19-59b04fe500a8/pull/0.log"
Apr 20 15:29:19.527123 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:19.527105 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn_5996cc60-3e9b-4037-bd19-59b04fe500a8/extract/0.log"
Apr 20 15:29:19.643755 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:19.643731 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-54ffc9547d-k2w4b_2f15cc1a-7858-4e6f-ae57-1deacf45a422/authorino/0.log"
Apr 20 15:29:19.987925 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:19.987892 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-g7g7l_fb55e236-55df-4e39-b642-53b0bc9d710c/kuadrant-console-plugin/0.log"
Apr 20 15:29:21.117386 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:21.117344 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-85d448cc4f-qxj6q_42d57d48-3d62-4ec2-a13f-104344cd9cd1/kube-auth-proxy/0.log"
Apr 20 15:29:21.906904 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:21.906874 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt_547e69ed-99a1-4db4-ad66-258166e5f48c/main/0.log"
Apr 20 15:29:21.913085 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:21.913058 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dccs46dt_547e69ed-99a1-4db4-ad66-258166e5f48c/storage-initializer/0.log"
Apr 20 15:29:22.023623 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:22.023590 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m_bb87c3d7-fc9b-452c-8b14-d4e58dbc3139/storage-initializer/0.log"
Apr 20 15:29:22.030183 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:22.030164 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-7889cd8c78-rfc9m_bb87c3d7-fc9b-452c-8b14-d4e58dbc3139/main/0.log"
Apr 20 15:29:22.140849 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:22.140805 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc_e5c19070-be08-4a9c-81b4-b662dc63f84d/storage-initializer/0.log"
Apr 20 15:29:22.147114 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:22.147092 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-f5df4587b-q4nxc_e5c19070-be08-4a9c-81b4-b662dc63f84d/main/0.log"
Apr 20 15:29:28.964362 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:28.964332 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7pz5g_16fab89b-6034-44ce-9e43-24eea5f7402c/global-pull-secret-syncer/0.log"
Apr 20 15:29:29.116847 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:29.116819 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-plvj4_6658032c-bba0-4e90-8a55-840d8cdab9e3/konnectivity-agent/0.log"
Apr 20 15:29:29.172632 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:29.172608 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-249.ec2.internal_75684a0b9a7080a9984ae7578d5b190b/haproxy/0.log"
Apr 20 15:29:33.088902 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:33.088876 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj_c5c2466d-2df4-4d90-9496-8107320cda01/extract/0.log"
Apr 20 15:29:33.108836 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:33.108809 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj_c5c2466d-2df4-4d90-9496-8107320cda01/util/0.log"
Apr 20 15:29:33.130965 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:33.130947 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b75957qsj_c5c2466d-2df4-4d90-9496-8107320cda01/pull/0.log"
Apr 20 15:29:33.173635 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:33.173618 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt_9be176a2-ed3f-43c0-9883-1dac2ee5912a/extract/0.log"
Apr 20 15:29:33.194117 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:33.194097 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt_9be176a2-ed3f-43c0-9883-1dac2ee5912a/util/0.log"
Apr 20 15:29:33.213065 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:33.213050 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0np7dt_9be176a2-ed3f-43c0-9883-1dac2ee5912a/pull/0.log"
Apr 20 15:29:33.238514 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:33.238499 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9_90d7cc5c-aebb-43f8-880a-22f7a4c687dd/extract/0.log"
Apr 20 15:29:33.258642 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:33.258625 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9_90d7cc5c-aebb-43f8-880a-22f7a4c687dd/util/0.log"
Apr 20 15:29:33.277022 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:33.277005 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73ffpd9_90d7cc5c-aebb-43f8-880a-22f7a4c687dd/pull/0.log"
Apr 20 15:29:33.302855 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:33.302839 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn_5996cc60-3e9b-4037-bd19-59b04fe500a8/extract/0.log"
Apr 20 15:29:33.322720 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:33.322703 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn_5996cc60-3e9b-4037-bd19-59b04fe500a8/util/0.log"
Apr 20 15:29:33.341676 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:33.341600 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1d7rjn_5996cc60-3e9b-4037-bd19-59b04fe500a8/pull/0.log"
Apr 20 15:29:33.510834 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:33.510805 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-54ffc9547d-k2w4b_2f15cc1a-7858-4e6f-ae57-1deacf45a422/authorino/0.log"
Apr 20 15:29:33.605147 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:33.605078 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-g7g7l_fb55e236-55df-4e39-b642-53b0bc9d710c/kuadrant-console-plugin/0.log"
Apr 20 15:29:35.321967 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:35.321940 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-2whnr_27348b4c-3c6d-4f5c-aecc-ee4f7ea4eac8/cluster-monitoring-operator/0.log"
Apr 20 15:29:35.438103 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:35.438078 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-h6r27_1b00210f-5d58-409a-91ec-366aaeae3d8f/monitoring-plugin/0.log"
Apr 20 15:29:35.467268 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:35.467244 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n9dx7_f4717f11-2104-4860-9b31-d3171a0eacef/node-exporter/0.log"
Apr 20 15:29:35.488461 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:35.488442 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n9dx7_f4717f11-2104-4860-9b31-d3171a0eacef/kube-rbac-proxy/0.log"
Apr 20 15:29:35.509718 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:35.509699 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-n9dx7_f4717f11-2104-4860-9b31-d3171a0eacef/init-textfile/0.log"
Apr 20 15:29:37.259792 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:37.259760 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-l7992_9444ff6f-3ede-40a2-a63c-97c92b90d755/networking-console-plugin/0.log"
Apr 20 15:29:37.747759 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:37.747730 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4z848_d54d05c4-b074-4189-b1dd-7ff476b824ec/console-operator/1.log"
Apr 20 15:29:37.752948 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:37.752924 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-4z848_d54d05c4-b074-4189-b1dd-7ff476b824ec/console-operator/2.log"
Apr 20 15:29:37.843697 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:37.843666 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"]
Apr 20 15:29:37.844046 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:37.844032 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4414530-9f52-42db-abd1-c960b1592386" containerName="authorino"
Apr 20 15:29:37.844098 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:37.844047 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4414530-9f52-42db-abd1-c960b1592386" containerName="authorino"
Apr 20 15:29:37.844137 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:37.844128 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4414530-9f52-42db-abd1-c960b1592386" containerName="authorino"
Apr 20 15:29:37.847308 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:37.847293 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"
Apr 20 15:29:37.850021 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:37.849996 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8mfh8\"/\"openshift-service-ca.crt\""
Apr 20 15:29:37.850144 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:37.850055 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8mfh8\"/\"kube-root-ca.crt\""
Apr 20 15:29:37.851216 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:37.851201 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8mfh8\"/\"default-dockercfg-xm2g5\""
Apr 20 15:29:37.856949 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:37.856923 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"]
Apr 20 15:29:37.951334 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:37.951306 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b166a827-8e34-4db0-97e0-cf872c5efafc-podres\") pod \"perf-node-gather-daemonset-kzblh\" (UID: \"b166a827-8e34-4db0-97e0-cf872c5efafc\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"
Apr 20 15:29:37.951476 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:37.951355 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5tmx\" (UniqueName: \"kubernetes.io/projected/b166a827-8e34-4db0-97e0-cf872c5efafc-kube-api-access-b5tmx\") pod \"perf-node-gather-daemonset-kzblh\" (UID: \"b166a827-8e34-4db0-97e0-cf872c5efafc\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"
Apr 20 15:29:37.951476 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:37.951465 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b166a827-8e34-4db0-97e0-cf872c5efafc-proc\") pod \"perf-node-gather-daemonset-kzblh\" (UID: \"b166a827-8e34-4db0-97e0-cf872c5efafc\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"
Apr 20 15:29:37.951554 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:37.951490 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b166a827-8e34-4db0-97e0-cf872c5efafc-lib-modules\") pod \"perf-node-gather-daemonset-kzblh\" (UID: \"b166a827-8e34-4db0-97e0-cf872c5efafc\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"
Apr 20 15:29:37.951554 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:37.951513 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b166a827-8e34-4db0-97e0-cf872c5efafc-sys\") pod \"perf-node-gather-daemonset-kzblh\" (UID: \"b166a827-8e34-4db0-97e0-cf872c5efafc\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"
Apr 20 15:29:38.052348 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:38.052271 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b5tmx\" (UniqueName: \"kubernetes.io/projected/b166a827-8e34-4db0-97e0-cf872c5efafc-kube-api-access-b5tmx\") pod \"perf-node-gather-daemonset-kzblh\" (UID: \"b166a827-8e34-4db0-97e0-cf872c5efafc\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"
Apr 20 15:29:38.052508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:38.052349 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b166a827-8e34-4db0-97e0-cf872c5efafc-proc\") pod \"perf-node-gather-daemonset-kzblh\" (UID: \"b166a827-8e34-4db0-97e0-cf872c5efafc\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"
Apr 20 15:29:38.052508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:38.052397 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b166a827-8e34-4db0-97e0-cf872c5efafc-lib-modules\") pod \"perf-node-gather-daemonset-kzblh\" (UID: \"b166a827-8e34-4db0-97e0-cf872c5efafc\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"
Apr 20 15:29:38.052508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:38.052426 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b166a827-8e34-4db0-97e0-cf872c5efafc-sys\") pod \"perf-node-gather-daemonset-kzblh\" (UID: \"b166a827-8e34-4db0-97e0-cf872c5efafc\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"
Apr 20 15:29:38.052508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:38.052453 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b166a827-8e34-4db0-97e0-cf872c5efafc-proc\") pod \"perf-node-gather-daemonset-kzblh\" (UID: \"b166a827-8e34-4db0-97e0-cf872c5efafc\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"
Apr 20 15:29:38.052508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:38.052482 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b166a827-8e34-4db0-97e0-cf872c5efafc-podres\") pod \"perf-node-gather-daemonset-kzblh\" (UID: \"b166a827-8e34-4db0-97e0-cf872c5efafc\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"
Apr 20 15:29:38.052508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:38.052505 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b166a827-8e34-4db0-97e0-cf872c5efafc-sys\") pod \"perf-node-gather-daemonset-kzblh\" (UID: \"b166a827-8e34-4db0-97e0-cf872c5efafc\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"
Apr 20 15:29:38.052508 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:38.052507 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b166a827-8e34-4db0-97e0-cf872c5efafc-lib-modules\") pod \"perf-node-gather-daemonset-kzblh\" (UID: \"b166a827-8e34-4db0-97e0-cf872c5efafc\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"
Apr 20 15:29:38.052826 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:38.052625 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b166a827-8e34-4db0-97e0-cf872c5efafc-podres\") pod \"perf-node-gather-daemonset-kzblh\" (UID: \"b166a827-8e34-4db0-97e0-cf872c5efafc\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"
Apr 20 15:29:38.060664 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:38.060637 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5tmx\" (UniqueName: \"kubernetes.io/projected/b166a827-8e34-4db0-97e0-cf872c5efafc-kube-api-access-b5tmx\") pod \"perf-node-gather-daemonset-kzblh\" (UID: \"b166a827-8e34-4db0-97e0-cf872c5efafc\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"
Apr 20 15:29:38.157821 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:38.157795 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"
Apr 20 15:29:38.209898 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:38.209870 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f48876877-7zw74_36d89a71-6bdc-4d23-b6a4-fe4e445e5939/console/0.log"
Apr 20 15:29:38.283624 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:38.283600 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"]
Apr 20 15:29:38.288090 ip-10-0-130-249 kubenswrapper[2574]: W0420 15:29:38.288048 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb166a827_8e34_4db0_97e0_cf872c5efafc.slice/crio-626a20f75a3b39f5e1e052c105201bea616b8fd7fcba0d52be3926a16ece8d97 WatchSource:0}: Error finding container 626a20f75a3b39f5e1e052c105201bea616b8fd7fcba0d52be3926a16ece8d97: Status 404 returned error can't find the container with id 626a20f75a3b39f5e1e052c105201bea616b8fd7fcba0d52be3926a16ece8d97
Apr 20 15:29:38.290278 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:38.290258 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 15:29:39.177886 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:39.177854 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh" event={"ID":"b166a827-8e34-4db0-97e0-cf872c5efafc","Type":"ContainerStarted","Data":"4be18c9e1cecc5eebeff6def11a8ef15d09016a76f0ef9cf747fcff39e4bcfb7"}
Apr 20 15:29:39.177886 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:39.177885 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh" event={"ID":"b166a827-8e34-4db0-97e0-cf872c5efafc","Type":"ContainerStarted","Data":"626a20f75a3b39f5e1e052c105201bea616b8fd7fcba0d52be3926a16ece8d97"}
Apr 20 15:29:39.178106 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:39.177978 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"
Apr 20 15:29:39.194439 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:39.194400 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh" podStartSLOduration=2.194389148 podStartE2EDuration="2.194389148s" podCreationTimestamp="2026-04-20 15:29:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:29:39.192099347 +0000 UTC m=+2058.265374972" watchObservedRunningTime="2026-04-20 15:29:39.194389148 +0000 UTC m=+2058.267664773"
Apr 20 15:29:39.620291 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:39.620265 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pq8qx_057f0667-15cc-4883-a91d-c360de54e58f/dns/0.log"
Apr 20 15:29:39.639486 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:39.639465 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pq8qx_057f0667-15cc-4883-a91d-c360de54e58f/kube-rbac-proxy/0.log"
Apr 20 15:29:39.681197 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:39.681178 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mj558_772f88da-629b-4161-9ed5-8a916387c9bd/dns-node-resolver/0.log"
Apr 20 15:29:40.124253 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:40.124226 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-79bd4b4857-ktbsk_022f5820-b316-4891-a62b-9cdbcd1b964e/registry/0.log"
Apr 20 15:29:40.183318 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:40.183260 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-qxsfj_10081761-39cd-4657-8ccf-94426cfd0833/node-ca/0.log"
Apr 20 15:29:41.196773 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:41.196744 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-85d448cc4f-qxj6q_42d57d48-3d62-4ec2-a13f-104344cd9cd1/kube-auth-proxy/0.log"
Apr 20 15:29:41.846266 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:41.846226 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-d8wts_a001809b-d266-4d12-b9a2-d400942f2755/serve-healthcheck-canary/0.log"
Apr 20 15:29:42.272737 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:42.272626 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-xc2s2_1b93bf46-c126-4ef5-9add-d72c0cbb7dae/insights-operator/0.log"
Apr 20 15:29:42.273138 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:42.272810 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-xc2s2_1b93bf46-c126-4ef5-9add-d72c0cbb7dae/insights-operator/1.log"
Apr 20 15:29:42.413261 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:42.413226 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qsjv2_cfb26a94-539f-4773-8a3d-e8350d9e2367/kube-rbac-proxy/0.log"
Apr 20 15:29:42.432791 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:42.432763 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qsjv2_cfb26a94-539f-4773-8a3d-e8350d9e2367/exporter/0.log"
Apr 20 15:29:42.452238 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:42.452212 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qsjv2_cfb26a94-539f-4773-8a3d-e8350d9e2367/extractor/0.log"
Apr 20 15:29:44.474566 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:44.474539 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-7dff657567-f472w_5e73bc52-3bcc-4604-a7db-a9a6717eaba2/manager/0.log"
Apr 20 15:29:44.619261 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:44.619237 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-99ff97f7d-zvp7j_a64a35a4-d683-4459-a09a-c300ce5b4faf/manager/0.log"
Apr 20 15:29:45.191953 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:45.191921 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kzblh"
Apr 20 15:29:45.846030 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:45.845993 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6687ffb5c6-njnff_0135a21a-fcc4-4ab0-9372-650bfce6790c/manager/0.log"
Apr 20 15:29:45.890956 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:45.890909 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-57nfs_70145afa-8dad-481b-be4d-ced870b4f8c8/openshift-lws-operator/0.log"
Apr 20 15:29:50.195871 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:50.195799 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-wqkv4_045ed647-e291-4700-83d9-8516e6788286/migrator/0.log"
Apr 20 15:29:50.217539 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:50.217520 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-wqkv4_045ed647-e291-4700-83d9-8516e6788286/graceful-termination/0.log"
Apr 20 15:29:50.586533 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:50.586505 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-mv44d_332fde80-3942-477a-918e-84086221c09b/kube-storage-version-migrator-operator/1.log"
Apr 20 15:29:50.587539 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:50.587521 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-mv44d_332fde80-3942-477a-918e-84086221c09b/kube-storage-version-migrator-operator/0.log"
Apr 20 15:29:51.461728 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:51.461699 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blx8n_61708c39-4987-438d-b51f-59e8cd1a1e59/kube-multus-additional-cni-plugins/0.log"
Apr 20 15:29:51.480741 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:51.480722 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blx8n_61708c39-4987-438d-b51f-59e8cd1a1e59/egress-router-binary-copy/0.log"
Apr 20 15:29:51.501070 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:51.501051 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blx8n_61708c39-4987-438d-b51f-59e8cd1a1e59/cni-plugins/0.log"
Apr 20 15:29:51.519252 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:51.519233 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blx8n_61708c39-4987-438d-b51f-59e8cd1a1e59/bond-cni-plugin/0.log"
Apr 20 15:29:51.538909 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:51.538884 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blx8n_61708c39-4987-438d-b51f-59e8cd1a1e59/routeoverride-cni/0.log"
Apr 20 15:29:51.557662 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:51.557639 2574 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blx8n_61708c39-4987-438d-b51f-59e8cd1a1e59/whereabouts-cni-bincopy/0.log" Apr 20 15:29:51.576029 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:51.576009 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-blx8n_61708c39-4987-438d-b51f-59e8cd1a1e59/whereabouts-cni/0.log" Apr 20 15:29:51.953480 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:51.953425 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-zlwvt_09f1e2bc-4d9a-4838-b68f-01c2612ca3af/kube-multus/0.log" Apr 20 15:29:52.012023 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:52.011992 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-h24vc_90e2aae6-6b60-4b8e-a0ba-12474f425b1d/network-metrics-daemon/0.log" Apr 20 15:29:52.028988 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:52.028964 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-h24vc_90e2aae6-6b60-4b8e-a0ba-12474f425b1d/kube-rbac-proxy/0.log" Apr 20 15:29:52.864679 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:52.864652 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vmgd_e6562aeb-103a-4d96-b5d3-356a382186d6/ovn-controller/0.log" Apr 20 15:29:52.891125 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:52.891099 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vmgd_e6562aeb-103a-4d96-b5d3-356a382186d6/ovn-acl-logging/0.log" Apr 20 15:29:52.906688 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:52.906667 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vmgd_e6562aeb-103a-4d96-b5d3-356a382186d6/kube-rbac-proxy-node/0.log" Apr 20 15:29:52.924998 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:52.924981 2574 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vmgd_e6562aeb-103a-4d96-b5d3-356a382186d6/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 15:29:52.943830 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:52.943809 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vmgd_e6562aeb-103a-4d96-b5d3-356a382186d6/northd/0.log" Apr 20 15:29:52.961788 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:52.961768 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vmgd_e6562aeb-103a-4d96-b5d3-356a382186d6/nbdb/0.log" Apr 20 15:29:52.982521 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:52.982502 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vmgd_e6562aeb-103a-4d96-b5d3-356a382186d6/sbdb/0.log" Apr 20 15:29:53.084168 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:53.084141 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5vmgd_e6562aeb-103a-4d96-b5d3-356a382186d6/ovnkube-controller/0.log" Apr 20 15:29:54.693286 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:54.693209 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-4h869_0ac42d0e-8ffe-4cc9-866d-cb7075ee1fde/check-endpoints/0.log" Apr 20 15:29:54.736514 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:54.736488 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-j2mjp_0ca0f6c2-6280-464c-8916-90374e2c88b8/network-check-target-container/0.log" Apr 20 15:29:55.862520 ip-10-0-130-249 kubenswrapper[2574]: I0420 15:29:55.862495 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-xm99w_ea32ceac-045d-412e-95db-ec7a62502246/iptables-alerter/0.log"