Apr 20 07:50:15.967404 ip-10-0-133-161 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 07:50:16.412697 ip-10-0-133-161 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 07:50:16.412697 ip-10-0-133-161 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 07:50:16.412697 ip-10-0-133-161 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 07:50:16.412697 ip-10-0-133-161 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 07:50:16.412697 ip-10-0-133-161 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 07:50:16.414471 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.414385 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 07:50:16.423952 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.423926 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 07:50:16.423952 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.423946 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 07:50:16.423952 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.423950 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:50:16.423952 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.423953 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 07:50:16.423952 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.423958 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 07:50:16.423952 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.423961 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.423965 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.423968 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.423971 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.423974 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.423977 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.423979 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]:
W0420 07:50:16.423982 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.423985 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.423988 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.423990 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.423993 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.423995 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.423998 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424001 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424003 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424012 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424015 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424018 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424020 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 07:50:16.424197 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424023 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424026 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424028 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424031 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424034 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424036 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424039 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424042 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424044 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424047 2572 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImages Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424050 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424052 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424055 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424058 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424061 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424063 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424066 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424069 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424072 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424074 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 07:50:16.424688 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424077 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 07:50:16.425209 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424080 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 07:50:16.425209 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424082 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 07:50:16.425209 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424087 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 07:50:16.425209 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424092 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 07:50:16.425209 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424094 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 07:50:16.425209 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424097 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 07:50:16.425209 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424101 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 07:50:16.425209 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424104 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 07:50:16.425209 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424107 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 07:50:16.425209 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424109 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 07:50:16.425209 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424112 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 07:50:16.425209 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424115 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 07:50:16.425209 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424117 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 07:50:16.425209 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424120 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 07:50:16.425209 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424123 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 07:50:16.425209 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424125 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 07:50:16.425209 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424128 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 07:50:16.425209 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424132 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 07:50:16.425209 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424147 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424150 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424153 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424156 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424158 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424161 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424164 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424168 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424170 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424173 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424177 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424180 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424183 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424185 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424189 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424192 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424195 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424198 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424200 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424203 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 07:50:16.425660 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424206 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424209 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424634 2572 
feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424639 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424642 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424645 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424648 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424651 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424653 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424656 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424659 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424661 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424664 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424667 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424670 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424672 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424675 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424677 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424680 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424683 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 07:50:16.426128 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424686 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 07:50:16.426652 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424689 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 07:50:16.426652 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424692 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 07:50:16.426652 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424695 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 07:50:16.426652 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424697 2572 feature_gate.go:328] unrecognized feature gate: 
NetworkDiagnosticsConfig Apr 20 07:50:16.426652 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424700 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 07:50:16.426652 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424703 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 07:50:16.426652 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424706 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 07:50:16.426652 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424709 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 07:50:16.426652 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424711 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 07:50:16.426652 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424714 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 07:50:16.426652 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424716 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 07:50:16.426652 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424720 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 07:50:16.426652 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424724 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 20 07:50:16.426652 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424726 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 07:50:16.426652 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424729 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 07:50:16.426652 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424731 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 07:50:16.426652 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424734 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 07:50:16.426652 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424737 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 07:50:16.426652 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424741 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 07:50:16.427171 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424744 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 07:50:16.427171 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424747 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 07:50:16.427171 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424750 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 07:50:16.427171 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424752 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 07:50:16.427171 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424755 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 07:50:16.427171 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424758 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 07:50:16.427171 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424760 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 07:50:16.427171 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424763 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 07:50:16.427171 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424765 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 07:50:16.427171 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424768 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 07:50:16.427171 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424771 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 07:50:16.427171 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424775 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 07:50:16.427171 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424778 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 07:50:16.427171 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424780 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 07:50:16.427171 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424784 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 07:50:16.427171 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424786 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 07:50:16.427171 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424789 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 07:50:16.427171 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424791 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 07:50:16.427171 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424794 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424802 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424804 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424807 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 
07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424809 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424812 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424814 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424817 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424819 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424822 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424824 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424827 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424830 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424832 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424835 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424837 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424840 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424843 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424846 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424849 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 07:50:16.427640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424852 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424854 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424857 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424860 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424863 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424865 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424868 2572 
feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424871 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424873 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.424876 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426392 2572 flags.go:64] FLAG: --address="0.0.0.0" Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426403 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426409 2572 flags.go:64] FLAG: --anonymous-auth="true" Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426413 2572 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426419 2572 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426422 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426427 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426432 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426435 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426438 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426442 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426445 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 20 07:50:16.428124 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426448 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426451 2572 flags.go:64] FLAG: --cgroup-root="" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426454 2572 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426457 2572 flags.go:64] FLAG: --client-ca-file="" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426460 2572 flags.go:64] FLAG: --cloud-config="" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426463 2572 flags.go:64] FLAG: --cloud-provider="external" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426466 2572 flags.go:64] FLAG: --cluster-dns="[]" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426470 2572 flags.go:64] FLAG: --cluster-domain="" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426474 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426477 2572 flags.go:64] FLAG: 
--config-dir="" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426480 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426483 2572 flags.go:64] FLAG: --container-log-max-files="5" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426487 2572 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426490 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426494 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426498 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426501 2572 flags.go:64] FLAG: --contention-profiling="false" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426504 2572 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426507 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426511 2572 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426514 2572 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426518 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426521 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426524 2572 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426527 2572 flags.go:64] FLAG: --enable-load-reader="false" Apr 20 07:50:16.428725 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426530 2572 flags.go:64] FLAG: --enable-server="true" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426533 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426538 2572 flags.go:64] FLAG: --event-burst="100" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426541 2572 flags.go:64] FLAG: --event-qps="50" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426544 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426547 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426550 2572 flags.go:64] FLAG: --eviction-hard="" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426554 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426557 2572 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426560 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 20 07:50:16.429354 ip-10-0-133-161 
kubenswrapper[2572]: I0420 07:50:16.426563 2572 flags.go:64] FLAG: --eviction-soft="" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426567 2572 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426570 2572 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426573 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426576 2572 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426579 2572 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426582 2572 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426584 2572 flags.go:64] FLAG: --feature-gates="" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426589 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426592 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426595 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426599 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426603 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426606 2572 flags.go:64] FLAG: --help="false" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426609 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-133-161.ec2.internal" Apr 20 07:50:16.429354 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426612 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426615 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426618 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426622 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426626 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426628 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426631 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426634 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426638 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426641 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 
07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426644 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426647 2572 flags.go:64] FLAG: --kube-reserved="" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426650 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426652 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426656 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426659 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426662 2572 flags.go:64] FLAG: --lock-file="" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426665 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426668 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426671 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426677 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426680 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426683 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 07:50:16.429937 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426686 2572 flags.go:64] FLAG: --logging-format="text" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426688 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426692 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426695 2572 flags.go:64] FLAG: --manifest-url="" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426697 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426702 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426708 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426712 2572 flags.go:64] FLAG: --max-pods="110" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426715 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426718 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426721 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426724 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426727 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 
07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426730 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426733 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426741 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426744 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426747 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426751 2572 flags.go:64] FLAG: --pod-cidr="" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426754 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426759 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426762 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426765 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426767 2572 flags.go:64] FLAG: --port="10250" Apr 20 07:50:16.430520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426771 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426773 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ce591d160bef5b72" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426777 2572 flags.go:64] FLAG: --qos-reserved="" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426780 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426782 2572 flags.go:64] FLAG: --register-node="true" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426785 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426788 2572 flags.go:64] FLAG: --register-with-taints="" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426792 2572 flags.go:64] FLAG: --registry-burst="10" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426795 2572 flags.go:64] FLAG: --registry-qps="5" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426797 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426800 2572 flags.go:64] FLAG: --reserved-memory="" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426804 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426807 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426810 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426814 2572 flags.go:64] 
FLAG: --rotate-server-certificates="false" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426818 2572 flags.go:64] FLAG: --runonce="false" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426821 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426824 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426827 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426830 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426833 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426836 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426839 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426842 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426845 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426848 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 07:50:16.431152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426851 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426855 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426858 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426861 2572 flags.go:64] FLAG: --system-cgroups="" Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426864 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426869 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426872 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426874 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426879 2572 flags.go:64] FLAG: --tls-min-version="" Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426882 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426885 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426887 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426890 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426893 2572 flags.go:64] FLAG: --v="2" Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426897 2572 
flags.go:64] FLAG: --version="false" Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426902 2572 flags.go:64] FLAG: --vmodule="" Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426906 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.426909 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427011 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427015 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427022 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427026 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427029 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 07:50:16.431777 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427032 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 07:50:16.432362 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427035 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 07:50:16.432362 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427038 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 07:50:16.432362 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427053 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 07:50:16.432362 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427055 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 07:50:16.432362 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427058 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 07:50:16.432362 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427061 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 07:50:16.432362 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427063 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 07:50:16.432362 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427066 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 07:50:16.432362 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427069 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 07:50:16.432362 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427072 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 07:50:16.432362 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427075 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 07:50:16.432362 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427078 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 07:50:16.432362 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427080 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 07:50:16.432362 
ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427083 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 07:50:16.432362 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427086 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 07:50:16.432362 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427089 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 07:50:16.432362 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427092 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 07:50:16.432362 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427094 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 07:50:16.432362 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427096 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 07:50:16.432362 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427099 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427101 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427104 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427106 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427110 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427114 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427116 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427119 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427122 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427126 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427128 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427131 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427134 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427153 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427156 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427158 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427161 2572 
feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427165 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427167 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427170 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 07:50:16.432898 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427172 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427175 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427178 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427181 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427184 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427187 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427190 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427192 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427196 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427198 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427201 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427203 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427206 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427208 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427211 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427213 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427215 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427218 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427221 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter 
Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427223 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 07:50:16.433539 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427229 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427232 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427235 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427238 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427240 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427243 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427245 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427248 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427251 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427254 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427256 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427259 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427261 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427264 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427267 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427269 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427272 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427275 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427278 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427280 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 07:50:16.434039 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.427283 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 07:50:16.434552 ip-10-0-133-161 
kubenswrapper[2572]: I0420 07:50:16.428214 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 07:50:16.434552 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.434527 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 07:50:16.434552 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.434546 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 07:50:16.434640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434594 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 07:50:16.434640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434599 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 07:50:16.434640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434602 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 07:50:16.434640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434605 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 07:50:16.434640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434608 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 07:50:16.434640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434611 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 07:50:16.434640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434614 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 07:50:16.434640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434616 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 07:50:16.434640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434619 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 07:50:16.434640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434622 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 07:50:16.434640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434624 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 07:50:16.434640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434627 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 07:50:16.434640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434630 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 07:50:16.434640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434633 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 07:50:16.434640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434636 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 07:50:16.434640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434639 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 07:50:16.434640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434641 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 
07:50:16.434640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434644 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 07:50:16.434640 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434647 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 07:50:16.435110 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434651 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 07:50:16.435110 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434654 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 07:50:16.435110 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434656 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 07:50:16.435110 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434659 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 07:50:16.435110 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434663 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 07:50:16.435110 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434667 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 07:50:16.435110 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434671 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 07:50:16.435110 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434674 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 07:50:16.435110 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434677 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 07:50:16.435110 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434680 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 07:50:16.435110 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434682 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 07:50:16.435110 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434685 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 07:50:16.435110 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434688 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 07:50:16.435110 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434691 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 07:50:16.435110 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434694 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 07:50:16.435110 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434696 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 07:50:16.435110 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434699 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 07:50:16.435110 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434702 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 07:50:16.435110 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434704 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 07:50:16.435599 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434707 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 07:50:16.435599 ip-10-0-133-161 
kubenswrapper[2572]: W0420 07:50:16.434709 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 07:50:16.435599 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434712 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 07:50:16.435599 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434714 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 07:50:16.435599 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434717 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 07:50:16.435599 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434719 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 07:50:16.435599 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434722 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 07:50:16.435599 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434724 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 07:50:16.435599 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434726 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 07:50:16.435599 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434729 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 07:50:16.435599 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434732 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 07:50:16.435599 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434735 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 07:50:16.435599 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434737 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 07:50:16.435599 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434740 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 07:50:16.435599 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434743 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 07:50:16.435599 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434746 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 07:50:16.435599 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434749 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 07:50:16.435599 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434751 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 07:50:16.435599 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434754 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 20 07:50:16.435599 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434756 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434759 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434762 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434764 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434767 2572 feature_gate.go:328] 
unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434770 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434772 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434775 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434777 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434780 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434783 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434785 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434788 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434790 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434793 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434795 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434798 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434800 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434802 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434805 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 07:50:16.436079 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434807 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 07:50:16.436590 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434810 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 07:50:16.436590 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434813 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 07:50:16.436590 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434815 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 07:50:16.436590 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434818 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 07:50:16.436590 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434821 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 07:50:16.436590 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434825 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Apr 20 07:50:16.436590 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434828 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 07:50:16.436590 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434832 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 07:50:16.436590 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.434837 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 07:50:16.436590 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434933 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 07:50:16.436590 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434937 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 07:50:16.436590 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434940 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 07:50:16.436590 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434943 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 07:50:16.436590 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434946 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 07:50:16.436590 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434949 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 07:50:16.436590 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434952 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 07:50:16.436983 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434955 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 07:50:16.436983 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434957 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 07:50:16.436983 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434960 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 07:50:16.436983 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434963 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 07:50:16.436983 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434966 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 07:50:16.436983 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434968 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 07:50:16.436983 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434971 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 07:50:16.436983 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434974 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 07:50:16.436983 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434976 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 07:50:16.436983 ip-10-0-133-161 kubenswrapper[2572]: 
W0420 07:50:16.434978 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 07:50:16.436983 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434981 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 07:50:16.436983 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434984 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 07:50:16.436983 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434986 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 07:50:16.436983 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434989 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 07:50:16.436983 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434991 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 07:50:16.436983 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434994 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 07:50:16.436983 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434996 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 07:50:16.436983 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.434999 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 07:50:16.436983 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435001 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 07:50:16.437456 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435004 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 07:50:16.437456 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435007 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 07:50:16.437456 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435009 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 07:50:16.437456 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435012 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 07:50:16.437456 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435015 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 07:50:16.437456 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435018 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 07:50:16.437456 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435021 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 07:50:16.437456 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435023 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 07:50:16.437456 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435026 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 07:50:16.437456 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435028 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 07:50:16.437456 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435031 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 07:50:16.437456 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435033 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 07:50:16.437456 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435036 2572 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 07:50:16.437456 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435049 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 07:50:16.437456 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435052 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 07:50:16.437456 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435055 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 07:50:16.437456 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435059 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 07:50:16.437456 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435063 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 07:50:16.437456 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435066 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435069 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435072 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435075 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435079 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435081 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435084 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435087 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435089 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435092 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435094 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435097 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435100 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435103 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435105 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435108 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435111 2572 feature_gate.go:328] unrecognized feature gate: 
HighlyAvailableArbiter Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435114 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435117 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435120 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 07:50:16.437958 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435122 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435125 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435128 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435130 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435133 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435135 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435152 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435156 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435159 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435161 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435164 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435166 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435170 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435173 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435175 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435178 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435180 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435183 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435186 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: 
W0420 07:50:16.435188 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 07:50:16.438514 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435191 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 07:50:16.438991 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:16.435194 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 07:50:16.438991 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.435200 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 07:50:16.438991 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.435967 2572 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 07:50:16.438991 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.437975 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 07:50:16.439245 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.439234 2572 server.go:1019] "Starting client certificate rotation" Apr 20 07:50:16.439352 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.439334 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 07:50:16.439407 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.439379 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 07:50:16.465044 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.465021 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 07:50:16.469157 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.469114 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 07:50:16.484093 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.484037 2572 log.go:25] "Validated CRI v1 runtime API" Apr 20 07:50:16.489706 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.489686 2572 log.go:25] "Validated CRI v1 image API" Apr 20 07:50:16.491090 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.491074 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 07:50:16.496906 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.496880 2572 fs.go:135] Filesystem UUIDs: map[41c34584-107c-456d-bf88-d56a11435915:/dev/nvme0n1p4 4750bd9c-39ab-494a-ae9b-5dec866f4015:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 20 07:50:16.496977 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.496906 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 
minor:33 fsType:overlay blockSize:0}] Apr 20 07:50:16.497877 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.497859 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 07:50:16.502411 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.502302 2572 manager.go:217] Machine: {Timestamp:2026-04-20 07:50:16.501068176 +0000 UTC m=+0.414698229 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3104148 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28fe9ee7f1e87b64bfe1eac97e4a57 SystemUUID:ec28fe9e-e7f1-e87b-64bf-e1eac97e4a57 BootID:32d75023-b85a-4332-a840-43864b2fdfc9 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:cd:49:06:12:8b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:cd:49:06:12:8b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:9e:23:23:e3:7e:76 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 20 07:50:16.502411 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.502410 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
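[editor's note] The entries above show client certificate rotation starting and the CertificateSigningRequest informer caches being populated; the kubelet's bootstrap CSR shows up a little further down as csr-c5cwt, first approved and then issued. For reference, a minimal client-go sketch of how one might list CSRs and their issuance state by hand (the kubeconfig path is a placeholder, not taken from this log):

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Placeholder admin kubeconfig; the kubelet itself uses its bootstrap
	// credentials, this is only for inspecting CSRs from the outside.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
	if err != nil {
		panic(err)
	}
	client, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	csrs, err := client.CertificatesV1().CertificateSigningRequests().List(context.TODO(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for _, csr := range csrs.Items {
		issued := len(csr.Status.Certificate) > 0
		fmt.Printf("%s requestor=%s issued=%v conditions=%d\n",
			csr.Name, csr.Spec.Username, issued, len(csr.Status.Conditions))
	}
}

[end note]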
Apr 20 07:50:16.502541 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.502530 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 20 07:50:16.504570 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.504544 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 20 07:50:16.504729 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.504573 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-161.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 07:50:16.504779 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.504739 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 07:50:16.504779 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.504748 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 07:50:16.504779 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.504766 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 07:50:16.504860 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.504784 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 07:50:16.506179 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.506169 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 20 07:50:16.506293 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.506284 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 07:50:16.508882 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.508861 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 20 07:50:16.508882 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.508885 2572 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 20 07:50:16.508950 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.508898 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 07:50:16.508950 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.508909 2572 kubelet.go:397] "Adding apiserver pod source" Apr 20 07:50:16.508950 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.508918 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 07:50:16.510098 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.510086 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 07:50:16.510154 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.510109 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 07:50:16.513237 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.513215 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 07:50:16.513804 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.513785 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-c5cwt" Apr 20 07:50:16.517196 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.517125 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 07:50:16.519273 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.519256 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 07:50:16.519273 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.519276 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 07:50:16.519378 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.519282 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 07:50:16.519378 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.519289 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 07:50:16.519378 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.519295 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 07:50:16.519378 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.519301 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 07:50:16.519378 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.519307 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 07:50:16.519378 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.519313 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 07:50:16.519378 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.519319 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 07:50:16.519378 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.519325 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 07:50:16.519378 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.519333 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 07:50:16.519378 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.519342 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 07:50:16.519674 ip-10-0-133-161 
kubenswrapper[2572]: E0420 07:50:16.519578 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-161.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 07:50:16.519717 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:16.519699 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 07:50:16.520289 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.520280 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 07:50:16.520289 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.520290 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 20 07:50:16.520638 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.520624 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-c5cwt" Apr 20 07:50:16.524038 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.524023 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 07:50:16.524104 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.524061 2572 server.go:1295] "Started kubelet" Apr 20 07:50:16.524218 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.524173 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 07:50:16.524298 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.524172 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 07:50:16.524342 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.524324 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 07:50:16.524985 ip-10-0-133-161 systemd[1]: Started Kubernetes Kubelet. 
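[editor's note] The container_manager_linux entry a little above dumps the node config as one JSON blob (system-reserved resources, hard-eviction thresholds, cgroup driver, and so on). A small sketch that pulls the interesting fields back out of that blob; the structs below are a hand-trimmed mirror of the logged JSON, not the kubelet's own types, and the excerpt keeps only two of the five thresholds:

package main

import (
	"encoding/json"
	"fmt"
)

// Trimmed mirror of the nodeConfig JSON printed by the kubelet above;
// only the fields inspected here are declared.
type hardEviction struct {
	Signal   string
	Operator string
	Value    struct {
		Quantity   *string
		Percentage float64
	}
}

type nodeConfig struct {
	NodeName               string
	CgroupDriver           string
	SystemReserved         map[string]string
	HardEvictionThresholds []hardEviction
}

func main() {
	// Shortened excerpt of the JSON from the log line.
	raw := `{"NodeName":"ip-10-0-133-161.ec2.internal","CgroupDriver":"systemd",
	  "SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},
	  "HardEvictionThresholds":[
	    {"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0}},
	    {"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1}}]}`

	var cfg nodeConfig
	if err := json.Unmarshal([]byte(raw), &cfg); err != nil {
		panic(err)
	}
	fmt.Println("system reserved:", cfg.SystemReserved)
	for _, t := range cfg.HardEvictionThresholds {
		if t.Value.Quantity != nil {
			fmt.Printf("%s %s %s\n", t.Signal, t.Operator, *t.Value.Quantity)
		} else {
			fmt.Printf("%s %s %.0f%%\n", t.Signal, t.Operator, t.Value.Percentage*100)
		}
	}
}

[end note]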
Apr 20 07:50:16.525536 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.525493 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 20 07:50:16.525880 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.525866 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 07:50:16.531114 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.531074 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 07:50:16.531694 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.531676 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 07:50:16.533952 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.532585 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 07:50:16.533952 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:16.532966 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found" Apr 20 07:50:16.534632 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.534323 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:50:16.534731 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.534717 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 07:50:16.534818 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.534801 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 07:50:16.534910 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.534874 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 20 07:50:16.534910 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.534885 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 20 07:50:16.535125 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.535106 2572 factory.go:55] Registering systemd factory Apr 20 07:50:16.535270 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.535188 2572 factory.go:223] Registration of the systemd container factory successfully Apr 20 07:50:16.535699 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.535682 2572 factory.go:153] Registering CRI-O factory Apr 20 07:50:16.535699 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.535700 2572 factory.go:223] Registration of the crio container factory successfully Apr 20 07:50:16.535821 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.535760 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 07:50:16.535821 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.535779 2572 factory.go:103] Registering Raw factory Apr 20 07:50:16.535821 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.535795 2572 manager.go:1196] Started watching for new ooms in manager Apr 20 07:50:16.536378 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.536364 2572 manager.go:319] Starting recovery of all containers Apr 20 07:50:16.536476 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:16.536457 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 20 07:50:16.536656 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.536642 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-161.ec2.internal" not found Apr 20 07:50:16.536754 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:16.536731 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-161.ec2.internal\" not found" node="ip-10-0-133-161.ec2.internal" Apr 20 07:50:16.546091 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.545926 2572 manager.go:324] Recovery completed Apr 20 07:50:16.551181 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.551166 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:50:16.551882 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.551864 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-161.ec2.internal" not found Apr 20 07:50:16.555181 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.555164 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:50:16.555260 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.555192 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:50:16.555260 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.555203 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:50:16.555734 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.555719 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 20 07:50:16.555734 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.555733 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 20 07:50:16.555857 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.555752 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 20 07:50:16.558170 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.558157 2572 policy_none.go:49] "None policy: Start" Apr 20 07:50:16.558242 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.558175 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 20 07:50:16.558242 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.558188 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 20 07:50:16.602769 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.602751 2572 manager.go:341] "Starting Device Plugin manager" Apr 20 07:50:16.624072 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:16.602788 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 20 07:50:16.624072 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.602801 2572 server.go:85] "Starting device plugin registration server" Apr 20 07:50:16.624072 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.603085 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 20 07:50:16.624072 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.603096 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 20 07:50:16.624072 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.603260 2572 plugin_watcher.go:51] "Plugin Watcher Start" 
path="/var/lib/kubelet/plugins_registry" Apr 20 07:50:16.624072 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.603338 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 20 07:50:16.624072 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.603347 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 20 07:50:16.624072 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:16.603814 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 20 07:50:16.624072 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:16.603861 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-161.ec2.internal\" not found" Apr 20 07:50:16.624072 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.613774 2572 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-161.ec2.internal" not found Apr 20 07:50:16.639469 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.639439 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 20 07:50:16.640700 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.640679 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 20 07:50:16.640797 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.640705 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 20 07:50:16.640797 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.640724 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 20 07:50:16.640797 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.640730 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 07:50:16.640797 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:16.640760 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 07:50:16.642832 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.642812 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:50:16.703667 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.703632 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:50:16.704753 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.704737 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:50:16.704837 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.704766 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:50:16.704837 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.704778 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:50:16.704837 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.704805 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-161.ec2.internal" Apr 20 07:50:16.714942 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.714926 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-161.ec2.internal" Apr 20 07:50:16.714992 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:16.714948 2572 
kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-161.ec2.internal\": node \"ip-10-0-133-161.ec2.internal\" not found" Apr 20 07:50:16.734846 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:16.734791 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found" Apr 20 07:50:16.740965 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.740935 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal"] Apr 20 07:50:16.741042 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.741024 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:50:16.742466 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.742448 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:50:16.742539 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.742474 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:50:16.742539 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.742484 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:50:16.744664 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.744650 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:50:16.744830 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.744815 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" Apr 20 07:50:16.744896 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.744850 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:50:16.745340 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.745324 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:50:16.745431 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.745350 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:50:16.745431 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.745360 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:50:16.745431 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.745398 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:50:16.745431 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.745421 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:50:16.745614 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.745435 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:50:16.748204 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.748189 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal" Apr 20 07:50:16.748280 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.748217 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 07:50:16.749597 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.749582 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientMemory" Apr 20 07:50:16.749675 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.749607 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 07:50:16.749675 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.749618 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeHasSufficientPID" Apr 20 07:50:16.780821 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:16.780798 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-161.ec2.internal\" not found" node="ip-10-0-133-161.ec2.internal" Apr 20 07:50:16.785846 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:16.785830 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-161.ec2.internal\" not found" node="ip-10-0-133-161.ec2.internal" Apr 20 07:50:16.835192 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:16.835173 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found" Apr 20 07:50:16.836315 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.836299 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/17556b0047f1a6a15f4b7d5854560826-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal\" (UID: \"17556b0047f1a6a15f4b7d5854560826\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" Apr 20 07:50:16.836363 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.836325 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17556b0047f1a6a15f4b7d5854560826-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal\" (UID: \"17556b0047f1a6a15f4b7d5854560826\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" Apr 20 07:50:16.836363 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.836352 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/71c8bd7a095056144ad8091ca3c68103-config\") pod \"kube-apiserver-proxy-ip-10-0-133-161.ec2.internal\" (UID: \"71c8bd7a095056144ad8091ca3c68103\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal" Apr 20 07:50:16.935748 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:16.935707 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found" Apr 20 07:50:16.936849 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.936830 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/17556b0047f1a6a15f4b7d5854560826-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal\" (UID: \"17556b0047f1a6a15f4b7d5854560826\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" Apr 20 07:50:16.936910 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.936858 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17556b0047f1a6a15f4b7d5854560826-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal\" (UID: \"17556b0047f1a6a15f4b7d5854560826\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" Apr 20 07:50:16.936910 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.936879 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/71c8bd7a095056144ad8091ca3c68103-config\") pod \"kube-apiserver-proxy-ip-10-0-133-161.ec2.internal\" (UID: \"71c8bd7a095056144ad8091ca3c68103\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal" Apr 20 07:50:16.936991 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.936925 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/17556b0047f1a6a15f4b7d5854560826-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal\" (UID: \"17556b0047f1a6a15f4b7d5854560826\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" Apr 20 07:50:16.936991 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.936925 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/17556b0047f1a6a15f4b7d5854560826-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal\" (UID: \"17556b0047f1a6a15f4b7d5854560826\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" Apr 20 07:50:16.936991 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:16.936941 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/71c8bd7a095056144ad8091ca3c68103-config\") pod \"kube-apiserver-proxy-ip-10-0-133-161.ec2.internal\" (UID: \"71c8bd7a095056144ad8091ca3c68103\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal" Apr 20 07:50:17.036528 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:17.036451 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found" Apr 20 07:50:17.082719 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:17.082698 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" Apr 20 07:50:17.088286 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:17.088259 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal" Apr 20 07:50:17.136831 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:17.136787 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found" Apr 20 07:50:17.237336 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:17.237298 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found" Apr 20 07:50:17.337830 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:17.337768 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found" Apr 20 07:50:17.438277 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:17.438248 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found" Apr 20 07:50:17.438277 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:17.438272 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 07:50:17.438868 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:17.438417 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 07:50:17.438868 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:17.438426 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 07:50:17.522972 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:17.522929 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 07:45:16 +0000 UTC" deadline="2028-01-29 07:24:19.10754288 +0000 UTC" Apr 20 07:50:17.522972 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:17.522964 2572 
certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15575h34m1.58458299s" Apr 20 07:50:17.532108 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:17.532083 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 07:50:17.539047 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:17.539018 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found" Apr 20 07:50:17.550364 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:17.550344 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 07:50:17.571216 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:17.571185 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jz479" Apr 20 07:50:17.578098 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:17.578076 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-jz479" Apr 20 07:50:17.607218 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:17.607187 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71c8bd7a095056144ad8091ca3c68103.slice/crio-be11d20f3199f12e7075292184aac513aa5fa56966c2061e8a5eda0a5089346d WatchSource:0}: Error finding container be11d20f3199f12e7075292184aac513aa5fa56966c2061e8a5eda0a5089346d: Status 404 returned error can't find the container with id be11d20f3199f12e7075292184aac513aa5fa56966c2061e8a5eda0a5089346d Apr 20 07:50:17.607389 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:17.607376 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17556b0047f1a6a15f4b7d5854560826.slice/crio-1f2c8694c3b1e16d582803cf789653fc5cb3d12a06fdf41b592e67d00ba96062 WatchSource:0}: Error finding container 1f2c8694c3b1e16d582803cf789653fc5cb3d12a06fdf41b592e67d00ba96062: Status 404 returned error can't find the container with id 1f2c8694c3b1e16d582803cf789653fc5cb3d12a06fdf41b592e67d00ba96062 Apr 20 07:50:17.611587 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:17.611572 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 07:50:17.639265 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:17.639238 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found" Apr 20 07:50:17.644216 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:17.644164 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" event={"ID":"17556b0047f1a6a15f4b7d5854560826","Type":"ContainerStarted","Data":"1f2c8694c3b1e16d582803cf789653fc5cb3d12a06fdf41b592e67d00ba96062"} Apr 20 07:50:17.645123 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:17.645099 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal" event={"ID":"71c8bd7a095056144ad8091ca3c68103","Type":"ContainerStarted","Data":"be11d20f3199f12e7075292184aac513aa5fa56966c2061e8a5eda0a5089346d"} Apr 20 07:50:17.706493 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:17.706468 2572 reflector.go:430] 
"Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:50:17.739989 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:17.739963 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found" Apr 20 07:50:17.840534 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:17.840503 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found" Apr 20 07:50:17.941039 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:17.940957 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found" Apr 20 07:50:18.041724 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:18.041694 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-161.ec2.internal\" not found" Apr 20 07:50:18.098933 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.098888 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:50:18.132102 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.132069 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" Apr 20 07:50:18.142895 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.142763 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 07:50:18.143822 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.143633 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal" Apr 20 07:50:18.151341 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.151321 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 07:50:18.510502 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.510470 2572 apiserver.go:52] "Watching apiserver" Apr 20 07:50:18.518433 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.518407 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 07:50:18.518832 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.518797 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl","openshift-image-registry/node-ca-fcfwc","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal","openshift-multus/multus-additional-cni-plugins-lfk6z","openshift-multus/network-metrics-daemon-brq5h","openshift-network-operator/iptables-alerter-ng5xl","openshift-cluster-node-tuning-operator/tuned-vnqcw","openshift-dns/node-resolver-45gfj","openshift-multus/multus-xsjgg","openshift-network-diagnostics/network-check-target-qr7pr","openshift-ovn-kubernetes/ovnkube-node-55zsn","kube-system/konnectivity-agent-l8wjp"] Apr 20 07:50:18.521869 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.521848 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.524568 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.524539 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 07:50:18.524568 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.524565 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 07:50:18.524752 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.524588 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-5sfpv\"" Apr 20 07:50:18.524752 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.524549 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 07:50:18.524971 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.524954 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 07:50:18.526122 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.526103 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.528289 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.528272 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fcfwc" Apr 20 07:50:18.528485 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.528468 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-nhvx4\"" Apr 20 07:50:18.528623 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.528609 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 07:50:18.528684 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.528630 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 07:50:18.528739 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.528682 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 07:50:18.530557 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.530537 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 07:50:18.530656 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.530576 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:18.530709 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:18.530666 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-brq5h" podUID="07219834-44d6-42ab-9058-aed46274d1a8" Apr 20 07:50:18.530760 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.530733 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 07:50:18.530810 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.530791 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 07:50:18.530916 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.530901 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5fsx7\"" Apr 20 07:50:18.533197 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.533003 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.535587 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.535202 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-ng5xl" Apr 20 07:50:18.535587 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.535381 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-4grwr\"" Apr 20 07:50:18.535798 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.535705 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 07:50:18.535798 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.535705 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 07:50:18.537420 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.537397 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.537855 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.537832 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 07:50:18.538076 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.538061 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-tx4bb\"" Apr 20 07:50:18.538200 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.538187 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:50:18.538295 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.538275 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 07:50:18.539532 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.539513 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:50:18.539622 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.539586 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-45gfj" Apr 20 07:50:18.539622 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.539616 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 07:50:18.540446 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.540430 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-ssw4t\"" Apr 20 07:50:18.542122 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.542102 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:18.542226 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:18.542207 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qr7pr" podUID="3e84a5ff-c8e6-4c91-95b6-66697b65f3e6" Apr 20 07:50:18.543075 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.543055 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 07:50:18.543178 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.543107 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-m258s\"" Apr 20 07:50:18.543577 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.543561 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 07:50:18.543848 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.543831 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.545413 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.545392 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-host-var-lib-cni-multus\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.545500 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.545439 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60a64748-7449-441c-8327-211d296e5ef6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.545500 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.545476 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/893e108a-cd48-4c06-80c6-167a8ad53ac2-system-cni-dir\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.545607 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.545541 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/893e108a-cd48-4c06-80c6-167a8ad53ac2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.545657 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.545606 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/60a64748-7449-441c-8327-211d296e5ef6-registration-dir\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.546151 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546107 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-l8wjp" Apr 20 07:50:18.546240 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546171 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-cnibin\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.546293 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546248 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4g2p\" (UniqueName: \"kubernetes.io/projected/60a64748-7449-441c-8327-211d296e5ef6-kube-api-access-n4g2p\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.546714 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546383 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/893e108a-cd48-4c06-80c6-167a8ad53ac2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.546714 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546429 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-host-var-lib-cni-bin\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.546714 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546477 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-hostroot\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.546714 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546498 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 07:50:18.546714 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546510 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/edf19122-ee32-4e12-a720-45239728231d-multus-daemon-config\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.546714 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546540 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f-serviceca\") pod \"node-ca-fcfwc\" (UID: \"4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f\") " pod="openshift-image-registry/node-ca-fcfwc" Apr 20 07:50:18.546714 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546593 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwmv9\" (UniqueName: \"kubernetes.io/projected/4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f-kube-api-access-xwmv9\") pod \"node-ca-fcfwc\" (UID: 
\"4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f\") " pod="openshift-image-registry/node-ca-fcfwc" Apr 20 07:50:18.546714 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546680 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/893e108a-cd48-4c06-80c6-167a8ad53ac2-os-release\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.547109 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546723 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-multus-socket-dir-parent\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.547109 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546756 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-host-run-k8s-cni-cncf-io\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.547109 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546780 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 07:50:18.547109 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546802 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/60a64748-7449-441c-8327-211d296e5ef6-sys-fs\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.547109 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546842 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/893e108a-cd48-4c06-80c6-167a8ad53ac2-cni-binary-copy\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.547109 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546874 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/893e108a-cd48-4c06-80c6-167a8ad53ac2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.547109 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546908 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/60a64748-7449-441c-8327-211d296e5ef6-socket-dir\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.547109 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546947 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-os-release\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.547109 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.546990 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/edf19122-ee32-4e12-a720-45239728231d-cni-binary-copy\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.547109 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547070 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-host-run-netns\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.547109 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547113 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9szq\" (UniqueName: \"kubernetes.io/projected/edf19122-ee32-4e12-a720-45239728231d-kube-api-access-w9szq\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.547731 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547209 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zglkz\" (UniqueName: \"kubernetes.io/projected/893e108a-cd48-4c06-80c6-167a8ad53ac2-kube-api-access-zglkz\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.547731 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547256 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2fdddb28-e042-4361-94c8-ed537e5237f2-iptables-alerter-script\") pod \"iptables-alerter-ng5xl\" (UID: \"2fdddb28-e042-4361-94c8-ed537e5237f2\") " pod="openshift-network-operator/iptables-alerter-ng5xl" Apr 20 07:50:18.547731 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547279 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-system-cni-dir\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.547731 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547297 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-multus-cni-dir\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.547731 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547319 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-multus-conf-dir\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " 
pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.547731 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547338 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/60a64748-7449-441c-8327-211d296e5ef6-device-dir\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.547731 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547356 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/893e108a-cd48-4c06-80c6-167a8ad53ac2-cnibin\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.547731 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547383 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z7p6\" (UniqueName: \"kubernetes.io/projected/2fdddb28-e042-4361-94c8-ed537e5237f2-kube-api-access-4z7p6\") pod \"iptables-alerter-ng5xl\" (UID: \"2fdddb28-e042-4361-94c8-ed537e5237f2\") " pod="openshift-network-operator/iptables-alerter-ng5xl" Apr 20 07:50:18.547731 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547407 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-host-var-lib-kubelet\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.547731 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547424 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-host-run-multus-certs\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.547731 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547441 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-etc-kubernetes\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.547731 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547459 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f-host\") pod \"node-ca-fcfwc\" (UID: \"4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f\") " pod="openshift-image-registry/node-ca-fcfwc" Apr 20 07:50:18.547731 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547483 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/60a64748-7449-441c-8327-211d296e5ef6-etc-selinux\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.547731 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547503 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs\") pod \"network-metrics-daemon-brq5h\" (UID: \"07219834-44d6-42ab-9058-aed46274d1a8\") " pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:18.547731 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547526 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nr662\" (UniqueName: \"kubernetes.io/projected/07219834-44d6-42ab-9058-aed46274d1a8-kube-api-access-nr662\") pod \"network-metrics-daemon-brq5h\" (UID: \"07219834-44d6-42ab-9058-aed46274d1a8\") " pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:18.547731 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547558 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2fdddb28-e042-4361-94c8-ed537e5237f2-host-slash\") pod \"iptables-alerter-ng5xl\" (UID: \"2fdddb28-e042-4361-94c8-ed537e5237f2\") " pod="openshift-network-operator/iptables-alerter-ng5xl" Apr 20 07:50:18.547731 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547562 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 07:50:18.548644 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547658 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 07:50:18.548644 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547814 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 07:50:18.548644 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.547939 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sj9vt\"" Apr 20 07:50:18.548644 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.548306 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 07:50:18.548872 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.548852 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 07:50:18.548950 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.548937 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 07:50:18.549274 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.549154 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-j887h\"" Apr 20 07:50:18.579390 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.579363 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 07:45:17 +0000 UTC" deadline="2027-12-19 12:49:54.071393874 +0000 UTC" Apr 20 07:50:18.579512 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.579494 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14596h59m35.491903218s" Apr 20 07:50:18.602464 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.602442 2572 reflector.go:430] "Caches populated" 
type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:50:18.633703 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.633677 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 07:50:18.644852 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.644829 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 07:50:18.648071 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648050 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-etc-sysctl-conf\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.648186 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648077 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef399f99-581e-48a8-a30b-557c95337f8e-tmp\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.648186 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648101 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-log-socket\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.648186 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648116 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvd8z\" (UniqueName: \"kubernetes.io/projected/f989907c-6b39-4f73-8cb2-9fb3915c446d-kube-api-access-hvd8z\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.648186 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648181 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs\") pod \"network-metrics-daemon-brq5h\" (UID: \"07219834-44d6-42ab-9058-aed46274d1a8\") " pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:18.648332 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nr662\" (UniqueName: \"kubernetes.io/projected/07219834-44d6-42ab-9058-aed46274d1a8-kube-api-access-nr662\") pod \"network-metrics-daemon-brq5h\" (UID: \"07219834-44d6-42ab-9058-aed46274d1a8\") " pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:18.648332 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648222 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2fdddb28-e042-4361-94c8-ed537e5237f2-host-slash\") pod \"iptables-alerter-ng5xl\" (UID: \"2fdddb28-e042-4361-94c8-ed537e5237f2\") " pod="openshift-network-operator/iptables-alerter-ng5xl" Apr 20 07:50:18.648332 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648246 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-host-var-lib-cni-multus\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.648332 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648272 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60a64748-7449-441c-8327-211d296e5ef6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.648332 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648286 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2fdddb28-e042-4361-94c8-ed537e5237f2-host-slash\") pod \"iptables-alerter-ng5xl\" (UID: \"2fdddb28-e042-4361-94c8-ed537e5237f2\") " pod="openshift-network-operator/iptables-alerter-ng5xl" Apr 20 07:50:18.648332 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648295 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/893e108a-cd48-4c06-80c6-167a8ad53ac2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.648332 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648303 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-host-var-lib-cni-multus\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.648332 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648320 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n4g2p\" (UniqueName: \"kubernetes.io/projected/60a64748-7449-441c-8327-211d296e5ef6-kube-api-access-n4g2p\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.648664 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648349 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/893e108a-cd48-4c06-80c6-167a8ad53ac2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.648664 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648365 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/60a64748-7449-441c-8327-211d296e5ef6-kubelet-dir\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.648664 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648381 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-lib-modules\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " 
pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.648664 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:18.648385 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:18.648664 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648406 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrrmr\" (UniqueName: \"kubernetes.io/projected/38e938ac-334a-46a9-bd54-099927b87530-kube-api-access-mrrmr\") pod \"node-resolver-45gfj\" (UID: \"38e938ac-334a-46a9-bd54-099927b87530\") " pod="openshift-dns/node-resolver-45gfj" Apr 20 07:50:18.648664 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648430 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-run-systemd\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.648664 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:18.648451 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs podName:07219834-44d6-42ab-9058-aed46274d1a8 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:19.148426781 +0000 UTC m=+3.062056801 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs") pod "network-metrics-daemon-brq5h" (UID: "07219834-44d6-42ab-9058-aed46274d1a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:18.648664 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648495 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-host-var-lib-cni-bin\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.648664 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648499 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/893e108a-cd48-4c06-80c6-167a8ad53ac2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.648664 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648542 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/edf19122-ee32-4e12-a720-45239728231d-multus-daemon-config\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.648664 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648565 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-host-var-lib-cni-bin\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.648664 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648600 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f-serviceca\") pod \"node-ca-fcfwc\" (UID: \"4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f\") " pod="openshift-image-registry/node-ca-fcfwc" Apr 20 07:50:18.648664 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648648 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwmv9\" (UniqueName: \"kubernetes.io/projected/4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f-kube-api-access-xwmv9\") pod \"node-ca-fcfwc\" (UID: \"4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f\") " pod="openshift-image-registry/node-ca-fcfwc" Apr 20 07:50:18.649254 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648672 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/38e938ac-334a-46a9-bd54-099927b87530-hosts-file\") pod \"node-resolver-45gfj\" (UID: \"38e938ac-334a-46a9-bd54-099927b87530\") " pod="openshift-dns/node-resolver-45gfj" Apr 20 07:50:18.649254 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648696 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-multus-socket-dir-parent\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.649254 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648723 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-host-run-k8s-cni-cncf-io\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.649254 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648769 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-host-run-k8s-cni-cncf-io\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.649254 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648775 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-multus-socket-dir-parent\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.649254 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648799 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/60a64748-7449-441c-8327-211d296e5ef6-sys-fs\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.649254 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648824 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/893e108a-cd48-4c06-80c6-167a8ad53ac2-cni-binary-copy\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.649254 
ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648853 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a7425036-5ff0-42c9-9f51-3d27f67f9232-konnectivity-ca\") pod \"konnectivity-agent-l8wjp\" (UID: \"a7425036-5ff0-42c9-9f51-3d27f67f9232\") " pod="kube-system/konnectivity-agent-l8wjp" Apr 20 07:50:18.649254 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648880 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/60a64748-7449-441c-8327-211d296e5ef6-socket-dir\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.649254 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648886 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/60a64748-7449-441c-8327-211d296e5ef6-sys-fs\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.649254 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648927 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-host-run-netns\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.649254 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648955 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w9szq\" (UniqueName: \"kubernetes.io/projected/edf19122-ee32-4e12-a720-45239728231d-kube-api-access-w9szq\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.649254 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.648983 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjlw9\" (UniqueName: \"kubernetes.io/projected/ef399f99-581e-48a8-a30b-557c95337f8e-kube-api-access-xjlw9\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.649254 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649008 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2fdddb28-e042-4361-94c8-ed537e5237f2-iptables-alerter-script\") pod \"iptables-alerter-ng5xl\" (UID: \"2fdddb28-e042-4361-94c8-ed537e5237f2\") " pod="openshift-network-operator/iptables-alerter-ng5xl" Apr 20 07:50:18.649254 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649017 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f-serviceca\") pod \"node-ca-fcfwc\" (UID: \"4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f\") " pod="openshift-image-registry/node-ca-fcfwc" Apr 20 07:50:18.649254 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649018 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/893e108a-cd48-4c06-80c6-167a8ad53ac2-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.649254 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649053 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/60a64748-7449-441c-8327-211d296e5ef6-socket-dir\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.649811 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649030 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-system-cni-dir\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.649811 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649073 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-host-run-netns\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.649811 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649105 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-multus-cni-dir\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.649811 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649113 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/edf19122-ee32-4e12-a720-45239728231d-multus-daemon-config\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.649811 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649133 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/60a64748-7449-441c-8327-211d296e5ef6-device-dir\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.649811 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649178 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4z7p6\" (UniqueName: \"kubernetes.io/projected/2fdddb28-e042-4361-94c8-ed537e5237f2-kube-api-access-4z7p6\") pod \"iptables-alerter-ng5xl\" (UID: \"2fdddb28-e042-4361-94c8-ed537e5237f2\") " pod="openshift-network-operator/iptables-alerter-ng5xl" Apr 20 07:50:18.649811 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649202 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-host-var-lib-kubelet\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.649811 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649219 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/60a64748-7449-441c-8327-211d296e5ef6-device-dir\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.649811 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649228 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-host-run-multus-certs\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.649811 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649164 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-system-cni-dir\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.649811 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649238 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-multus-cni-dir\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.649811 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649251 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f-host\") pod \"node-ca-fcfwc\" (UID: \"4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f\") " pod="openshift-image-registry/node-ca-fcfwc" Apr 20 07:50:18.649811 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649279 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-kubelet\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.649811 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649302 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f-host\") pod \"node-ca-fcfwc\" (UID: \"4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f\") " pod="openshift-image-registry/node-ca-fcfwc" Apr 20 07:50:18.649811 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649303 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-cni-bin\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.649811 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649306 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-host-var-lib-kubelet\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.649811 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649342 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-host-run-multus-certs\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.649811 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649373 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/893e108a-cd48-4c06-80c6-167a8ad53ac2-cni-binary-copy\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.650584 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649381 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/60a64748-7449-441c-8327-211d296e5ef6-etc-selinux\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.650584 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649417 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-run-netns\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.650584 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649448 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-run-ovn-kubernetes\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.650584 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649496 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.650584 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649519 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/60a64748-7449-441c-8327-211d296e5ef6-etc-selinux\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.650584 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649528 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f989907c-6b39-4f73-8cb2-9fb3915c446d-ovnkube-config\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.650584 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649560 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/893e108a-cd48-4c06-80c6-167a8ad53ac2-system-cni-dir\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: 
\"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.650584 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649589 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx9xg\" (UniqueName: \"kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg\") pod \"network-check-target-qr7pr\" (UID: \"3e84a5ff-c8e6-4c91-95b6-66697b65f3e6\") " pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:18.650584 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649594 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/893e108a-cd48-4c06-80c6-167a8ad53ac2-system-cni-dir\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.650584 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649594 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2fdddb28-e042-4361-94c8-ed537e5237f2-iptables-alerter-script\") pod \"iptables-alerter-ng5xl\" (UID: \"2fdddb28-e042-4361-94c8-ed537e5237f2\") " pod="openshift-network-operator/iptables-alerter-ng5xl" Apr 20 07:50:18.650584 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649618 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-cni-netd\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.650584 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649644 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/60a64748-7449-441c-8327-211d296e5ef6-registration-dir\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.650584 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649665 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-cnibin\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.650584 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649696 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/60a64748-7449-441c-8327-211d296e5ef6-registration-dir\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.650584 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649699 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-var-lib-kubelet\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.650584 ip-10-0-133-161 kubenswrapper[2572]: I0420 
07:50:18.649727 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-run-ovn\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.651226 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649759 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-cnibin\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.651226 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649790 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-hostroot\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.651226 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649807 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/893e108a-cd48-4c06-80c6-167a8ad53ac2-os-release\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.651226 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649822 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-node-log\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.651226 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649838 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ef399f99-581e-48a8-a30b-557c95337f8e-etc-tuned\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.651226 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649867 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/893e108a-cd48-4c06-80c6-167a8ad53ac2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.651226 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649891 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-hostroot\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.651226 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649903 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/893e108a-cd48-4c06-80c6-167a8ad53ac2-os-release\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" 
Apr 20 07:50:18.651226 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649920 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-etc-modprobe-d\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.651226 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649937 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-etc-systemd\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.651226 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649960 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-run\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.651226 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.649990 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/38e938ac-334a-46a9-bd54-099927b87530-tmp-dir\") pod \"node-resolver-45gfj\" (UID: \"38e938ac-334a-46a9-bd54-099927b87530\") " pod="openshift-dns/node-resolver-45gfj" Apr 20 07:50:18.651226 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650011 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-systemd-units\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.651226 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650025 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f989907c-6b39-4f73-8cb2-9fb3915c446d-ovnkube-script-lib\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.651226 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650040 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-os-release\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.651226 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650065 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/edf19122-ee32-4e12-a720-45239728231d-cni-binary-copy\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.651226 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650120 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-os-release\") pod \"multus-xsjgg\" (UID: 
\"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.651977 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650176 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zglkz\" (UniqueName: \"kubernetes.io/projected/893e108a-cd48-4c06-80c6-167a8ad53ac2-kube-api-access-zglkz\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.651977 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650209 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-etc-sysconfig\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.651977 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650234 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-sys\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.651977 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650277 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-etc-openvswitch\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.651977 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650320 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-host\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.651977 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650351 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-multus-conf-dir\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.651977 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650380 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/893e108a-cd48-4c06-80c6-167a8ad53ac2-cnibin\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.651977 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650420 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-multus-conf-dir\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.651977 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650422 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/893e108a-cd48-4c06-80c6-167a8ad53ac2-cnibin\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.651977 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650451 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/893e108a-cd48-4c06-80c6-167a8ad53ac2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.651977 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650458 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-etc-kubernetes\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.651977 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650502 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-etc-sysctl-d\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.651977 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650519 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-run-openvswitch\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.651977 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650797 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f989907c-6b39-4f73-8cb2-9fb3915c446d-env-overrides\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.651977 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650852 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f989907c-6b39-4f73-8cb2-9fb3915c446d-ovn-node-metrics-cert\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.651977 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650891 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-etc-kubernetes\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.651977 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650929 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-slash\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.652698 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.650991 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edf19122-ee32-4e12-a720-45239728231d-etc-kubernetes\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.652698 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.651058 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-var-lib-openvswitch\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.652698 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.651098 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a7425036-5ff0-42c9-9f51-3d27f67f9232-agent-certs\") pod \"konnectivity-agent-l8wjp\" (UID: \"a7425036-5ff0-42c9-9f51-3d27f67f9232\") " pod="kube-system/konnectivity-agent-l8wjp" Apr 20 07:50:18.652698 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.651312 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/edf19122-ee32-4e12-a720-45239728231d-cni-binary-copy\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.657544 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.657513 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 07:50:18.661671 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.661603 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwmv9\" (UniqueName: \"kubernetes.io/projected/4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f-kube-api-access-xwmv9\") pod \"node-ca-fcfwc\" (UID: \"4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f\") " pod="openshift-image-registry/node-ca-fcfwc" Apr 20 07:50:18.661671 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.661629 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zglkz\" (UniqueName: \"kubernetes.io/projected/893e108a-cd48-4c06-80c6-167a8ad53ac2-kube-api-access-zglkz\") pod \"multus-additional-cni-plugins-lfk6z\" (UID: \"893e108a-cd48-4c06-80c6-167a8ad53ac2\") " pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.661671 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.661638 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nr662\" (UniqueName: \"kubernetes.io/projected/07219834-44d6-42ab-9058-aed46274d1a8-kube-api-access-nr662\") pod \"network-metrics-daemon-brq5h\" (UID: \"07219834-44d6-42ab-9058-aed46274d1a8\") " pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:18.661882 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.661723 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9szq\" (UniqueName: \"kubernetes.io/projected/edf19122-ee32-4e12-a720-45239728231d-kube-api-access-w9szq\") pod \"multus-xsjgg\" (UID: \"edf19122-ee32-4e12-a720-45239728231d\") " pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.661882 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.661770 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4g2p\" (UniqueName: \"kubernetes.io/projected/60a64748-7449-441c-8327-211d296e5ef6-kube-api-access-n4g2p\") pod \"aws-ebs-csi-driver-node-rl8rl\" (UID: \"60a64748-7449-441c-8327-211d296e5ef6\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.661882 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.661780 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z7p6\" (UniqueName: \"kubernetes.io/projected/2fdddb28-e042-4361-94c8-ed537e5237f2-kube-api-access-4z7p6\") pod \"iptables-alerter-ng5xl\" (UID: \"2fdddb28-e042-4361-94c8-ed537e5237f2\") " pod="openshift-network-operator/iptables-alerter-ng5xl" Apr 20 07:50:18.751605 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751567 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-kubelet\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.751605 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751611 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-cni-bin\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.751837 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751636 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-run-netns\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.751837 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751659 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-run-ovn-kubernetes\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.751837 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751683 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.751837 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751693 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-kubelet\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.751837 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751706 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f989907c-6b39-4f73-8cb2-9fb3915c446d-ovnkube-config\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.751837 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751756 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx9xg\" (UniqueName: \"kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg\") pod \"network-check-target-qr7pr\" (UID: \"3e84a5ff-c8e6-4c91-95b6-66697b65f3e6\") " pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:18.751837 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751781 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-cni-netd\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.751837 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751807 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-var-lib-kubelet\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.751837 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751825 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-run-ovn\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752290 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751842 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-node-log\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752290 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751859 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ef399f99-581e-48a8-a30b-557c95337f8e-etc-tuned\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.752290 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751874 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-etc-modprobe-d\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.752290 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751890 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-etc-systemd\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.752290 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751908 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-run\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.752290 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751928 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/38e938ac-334a-46a9-bd54-099927b87530-tmp-dir\") pod \"node-resolver-45gfj\" (UID: \"38e938ac-334a-46a9-bd54-099927b87530\") " pod="openshift-dns/node-resolver-45gfj" Apr 20 07:50:18.752290 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751953 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-systemd-units\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752290 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751976 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f989907c-6b39-4f73-8cb2-9fb3915c446d-ovnkube-script-lib\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752290 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.751993 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-etc-sysconfig\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " 
pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.752290 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752024 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-sys\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.752290 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752064 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-etc-openvswitch\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752290 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752075 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752290 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752149 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-host\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.752290 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752155 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-cni-netd\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752290 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752188 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-var-lib-kubelet\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.752290 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752198 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-cni-bin\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752290 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752236 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-etc-openvswitch\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752290 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752239 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-run-ovn\") pod \"ovnkube-node-55zsn\" (UID: 
\"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752262 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-run-netns\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752274 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-sys\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.752948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752302 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-run-ovn-kubernetes\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752321 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-node-log\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752089 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-host\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.752948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752348 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-etc-kubernetes\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.752948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752364 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-etc-sysctl-d\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.752948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752379 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-run-openvswitch\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752399 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f989907c-6b39-4f73-8cb2-9fb3915c446d-env-overrides\") pod \"ovnkube-node-55zsn\" (UID: 
\"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752429 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f989907c-6b39-4f73-8cb2-9fb3915c446d-ovn-node-metrics-cert\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752455 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-slash\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752471 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-etc-systemd\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.752948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752479 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-var-lib-openvswitch\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752471 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-etc-kubernetes\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.752948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752478 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-systemd-units\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752509 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a7425036-5ff0-42c9-9f51-3d27f67f9232-agent-certs\") pod \"konnectivity-agent-l8wjp\" (UID: \"a7425036-5ff0-42c9-9f51-3d27f67f9232\") " pod="kube-system/konnectivity-agent-l8wjp" Apr 20 07:50:18.752948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752533 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-run-openvswitch\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.752948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752535 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-etc-sysctl-conf\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.753727 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752565 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef399f99-581e-48a8-a30b-557c95337f8e-tmp\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.753727 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752588 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-log-socket\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.753727 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752613 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hvd8z\" (UniqueName: \"kubernetes.io/projected/f989907c-6b39-4f73-8cb2-9fb3915c446d-kube-api-access-hvd8z\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.753727 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752656 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-lib-modules\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.753727 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752659 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/38e938ac-334a-46a9-bd54-099927b87530-tmp-dir\") pod \"node-resolver-45gfj\" (UID: \"38e938ac-334a-46a9-bd54-099927b87530\") " pod="openshift-dns/node-resolver-45gfj" Apr 20 07:50:18.753727 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752682 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrrmr\" (UniqueName: \"kubernetes.io/projected/38e938ac-334a-46a9-bd54-099927b87530-kube-api-access-mrrmr\") pod \"node-resolver-45gfj\" (UID: \"38e938ac-334a-46a9-bd54-099927b87530\") " pod="openshift-dns/node-resolver-45gfj" Apr 20 07:50:18.753727 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752706 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-run-systemd\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.753727 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752735 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/38e938ac-334a-46a9-bd54-099927b87530-hosts-file\") pod \"node-resolver-45gfj\" (UID: \"38e938ac-334a-46a9-bd54-099927b87530\") " pod="openshift-dns/node-resolver-45gfj" Apr 20 07:50:18.753727 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752745 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-run\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.753727 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752754 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-host-slash\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.753727 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752765 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a7425036-5ff0-42c9-9f51-3d27f67f9232-konnectivity-ca\") pod \"konnectivity-agent-l8wjp\" (UID: \"a7425036-5ff0-42c9-9f51-3d27f67f9232\") " pod="kube-system/konnectivity-agent-l8wjp" Apr 20 07:50:18.753727 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752794 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjlw9\" (UniqueName: \"kubernetes.io/projected/ef399f99-581e-48a8-a30b-557c95337f8e-kube-api-access-xjlw9\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.753727 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752799 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-var-lib-openvswitch\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.753727 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752809 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-etc-sysconfig\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.753727 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752658 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-etc-sysctl-conf\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.753727 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752841 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-lib-modules\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.753727 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752860 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-log-socket\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.753727 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752930 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" 
(UniqueName: \"kubernetes.io/host-path/38e938ac-334a-46a9-bd54-099927b87530-hosts-file\") pod \"node-resolver-45gfj\" (UID: \"38e938ac-334a-46a9-bd54-099927b87530\") " pod="openshift-dns/node-resolver-45gfj" Apr 20 07:50:18.754576 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.752963 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f989907c-6b39-4f73-8cb2-9fb3915c446d-run-systemd\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.754576 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.753016 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f989907c-6b39-4f73-8cb2-9fb3915c446d-ovnkube-config\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.754576 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.753068 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-etc-sysctl-d\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.754576 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.753209 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f989907c-6b39-4f73-8cb2-9fb3915c446d-env-overrides\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.754576 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.753309 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ef399f99-581e-48a8-a30b-557c95337f8e-etc-modprobe-d\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.754576 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.753871 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f989907c-6b39-4f73-8cb2-9fb3915c446d-ovnkube-script-lib\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.754849 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.754702 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a7425036-5ff0-42c9-9f51-3d27f67f9232-konnectivity-ca\") pod \"konnectivity-agent-l8wjp\" (UID: \"a7425036-5ff0-42c9-9f51-3d27f67f9232\") " pod="kube-system/konnectivity-agent-l8wjp" Apr 20 07:50:18.754849 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.754801 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ef399f99-581e-48a8-a30b-557c95337f8e-etc-tuned\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.755341 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.755317 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/ef399f99-581e-48a8-a30b-557c95337f8e-tmp\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.755912 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.755889 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f989907c-6b39-4f73-8cb2-9fb3915c446d-ovn-node-metrics-cert\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.756004 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.755982 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a7425036-5ff0-42c9-9f51-3d27f67f9232-agent-certs\") pod \"konnectivity-agent-l8wjp\" (UID: \"a7425036-5ff0-42c9-9f51-3d27f67f9232\") " pod="kube-system/konnectivity-agent-l8wjp" Apr 20 07:50:18.757947 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:18.757924 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:50:18.758046 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:18.757952 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:50:18.758046 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:18.757965 2572 projected.go:194] Error preparing data for projected volume kube-api-access-mx9xg for pod openshift-network-diagnostics/network-check-target-qr7pr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:18.758046 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:18.758030 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg podName:3e84a5ff-c8e6-4c91-95b6-66697b65f3e6 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:19.25801183 +0000 UTC m=+3.171641864 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mx9xg" (UniqueName: "kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg") pod "network-check-target-qr7pr" (UID: "3e84a5ff-c8e6-4c91-95b6-66697b65f3e6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:18.760720 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.760666 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrrmr\" (UniqueName: \"kubernetes.io/projected/38e938ac-334a-46a9-bd54-099927b87530-kube-api-access-mrrmr\") pod \"node-resolver-45gfj\" (UID: \"38e938ac-334a-46a9-bd54-099927b87530\") " pod="openshift-dns/node-resolver-45gfj" Apr 20 07:50:18.761446 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.761422 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjlw9\" (UniqueName: \"kubernetes.io/projected/ef399f99-581e-48a8-a30b-557c95337f8e-kube-api-access-xjlw9\") pod \"tuned-vnqcw\" (UID: \"ef399f99-581e-48a8-a30b-557c95337f8e\") " pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.761907 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.761890 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvd8z\" (UniqueName: \"kubernetes.io/projected/f989907c-6b39-4f73-8cb2-9fb3915c446d-kube-api-access-hvd8z\") pod \"ovnkube-node-55zsn\" (UID: \"f989907c-6b39-4f73-8cb2-9fb3915c446d\") " pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.832325 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.832287 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xsjgg" Apr 20 07:50:18.839258 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.839232 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" Apr 20 07:50:18.850909 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.850885 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-fcfwc" Apr 20 07:50:18.856634 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.856293 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lfk6z" Apr 20 07:50:18.862999 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.862978 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-ng5xl" Apr 20 07:50:18.870537 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.870520 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" Apr 20 07:50:18.878029 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.878003 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-45gfj" Apr 20 07:50:18.883602 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.883584 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:18.889231 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:18.889205 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-l8wjp" Apr 20 07:50:19.155391 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:19.155298 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs\") pod \"network-metrics-daemon-brq5h\" (UID: \"07219834-44d6-42ab-9058-aed46274d1a8\") " pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:19.155565 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:19.155465 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:19.155565 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:19.155547 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs podName:07219834-44d6-42ab-9058-aed46274d1a8 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:20.155526939 +0000 UTC m=+4.069156961 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs") pod "network-metrics-daemon-brq5h" (UID: "07219834-44d6-42ab-9058-aed46274d1a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:19.235282 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:19.235246 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fdddb28_e042_4361_94c8_ed537e5237f2.slice/crio-2b399010ea0076385a49b5c9634eca3d364976767299f196c645fc10a21ae4f0 WatchSource:0}: Error finding container 2b399010ea0076385a49b5c9634eca3d364976767299f196c645fc10a21ae4f0: Status 404 returned error can't find the container with id 2b399010ea0076385a49b5c9634eca3d364976767299f196c645fc10a21ae4f0 Apr 20 07:50:19.237397 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:19.237357 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60a64748_7449_441c_8327_211d296e5ef6.slice/crio-5e538caf0e7c5680a3075b22c0a22bddc99fd12d4bd039d489f4365b225391db WatchSource:0}: Error finding container 5e538caf0e7c5680a3075b22c0a22bddc99fd12d4bd039d489f4365b225391db: Status 404 returned error can't find the container with id 5e538caf0e7c5680a3075b22c0a22bddc99fd12d4bd039d489f4365b225391db Apr 20 07:50:19.238557 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:19.238531 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod893e108a_cd48_4c06_80c6_167a8ad53ac2.slice/crio-353f953f69e9914f240659f500f11167abb962166c5a81837899806d8a1ec3d3 WatchSource:0}: Error finding container 353f953f69e9914f240659f500f11167abb962166c5a81837899806d8a1ec3d3: Status 404 returned error can't find the container with id 353f953f69e9914f240659f500f11167abb962166c5a81837899806d8a1ec3d3 Apr 20 07:50:19.240457 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:19.240407 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf989907c_6b39_4f73_8cb2_9fb3915c446d.slice/crio-df2f33c445e01858322fd506bd9c46c217dc324fef1e8054cc2f42523d4cfe94 WatchSource:0}: Error finding container df2f33c445e01858322fd506bd9c46c217dc324fef1e8054cc2f42523d4cfe94: Status 404 returned error can't find the container with 
id df2f33c445e01858322fd506bd9c46c217dc324fef1e8054cc2f42523d4cfe94 Apr 20 07:50:19.242231 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:19.242209 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4462eeb2_94f5_4c1c_bc8e_62fe9f10c78f.slice/crio-dfa14f71b4b0bd7bda40a101599225afce8fbeac01c72f89e6946c957370c224 WatchSource:0}: Error finding container dfa14f71b4b0bd7bda40a101599225afce8fbeac01c72f89e6946c957370c224: Status 404 returned error can't find the container with id dfa14f71b4b0bd7bda40a101599225afce8fbeac01c72f89e6946c957370c224 Apr 20 07:50:19.243694 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:19.243668 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7425036_5ff0_42c9_9f51_3d27f67f9232.slice/crio-3b8ff00650b38c3df59afd51c30922ba93bc6730b15bd8bbfb40e2600a10b90c WatchSource:0}: Error finding container 3b8ff00650b38c3df59afd51c30922ba93bc6730b15bd8bbfb40e2600a10b90c: Status 404 returned error can't find the container with id 3b8ff00650b38c3df59afd51c30922ba93bc6730b15bd8bbfb40e2600a10b90c Apr 20 07:50:19.357256 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:19.357073 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx9xg\" (UniqueName: \"kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg\") pod \"network-check-target-qr7pr\" (UID: \"3e84a5ff-c8e6-4c91-95b6-66697b65f3e6\") " pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:19.357434 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:19.357267 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:50:19.357434 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:19.357288 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:50:19.357434 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:19.357298 2572 projected.go:194] Error preparing data for projected volume kube-api-access-mx9xg for pod openshift-network-diagnostics/network-check-target-qr7pr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:19.357434 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:19.357352 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg podName:3e84a5ff-c8e6-4c91-95b6-66697b65f3e6 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:20.357338138 +0000 UTC m=+4.270968162 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mx9xg" (UniqueName: "kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg") pod "network-check-target-qr7pr" (UID: "3e84a5ff-c8e6-4c91-95b6-66697b65f3e6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:19.580333 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:19.580214 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 07:45:17 +0000 UTC" deadline="2028-01-13 20:25:02.255391742 +0000 UTC" Apr 20 07:50:19.580333 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:19.580254 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15204h34m42.675140831s" Apr 20 07:50:19.642014 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:19.641947 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:19.642211 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:19.642096 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-brq5h" podUID="07219834-44d6-42ab-9058-aed46274d1a8" Apr 20 07:50:19.655706 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:19.655667 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal" event={"ID":"71c8bd7a095056144ad8091ca3c68103","Type":"ContainerStarted","Data":"c33c66e269a9ddaa35734a7bbcd176bd09b41b3f18fb56f267639e5327d2dbec"} Apr 20 07:50:19.671863 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:19.671574 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-45gfj" event={"ID":"38e938ac-334a-46a9-bd54-099927b87530","Type":"ContainerStarted","Data":"b586c8760faa8e01feacbb863d2ab5bbef89b4c5d39f192e8b942e4fd1a0814d"} Apr 20 07:50:19.680874 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:19.680842 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" event={"ID":"ef399f99-581e-48a8-a30b-557c95337f8e","Type":"ContainerStarted","Data":"ec450274868d52ea6ee9b40acb6ddb2b4581ee41771bb523a8f23ee32c59845b"} Apr 20 07:50:19.682548 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:19.682519 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fcfwc" event={"ID":"4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f","Type":"ContainerStarted","Data":"dfa14f71b4b0bd7bda40a101599225afce8fbeac01c72f89e6946c957370c224"} Apr 20 07:50:19.694194 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:19.694164 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-ng5xl" event={"ID":"2fdddb28-e042-4361-94c8-ed537e5237f2","Type":"ContainerStarted","Data":"2b399010ea0076385a49b5c9634eca3d364976767299f196c645fc10a21ae4f0"} Apr 20 07:50:19.703840 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:19.703810 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xsjgg" 
event={"ID":"edf19122-ee32-4e12-a720-45239728231d","Type":"ContainerStarted","Data":"5dc36f622a308a74773f9217d44aefbd856802e61e243e176c8c820866b2ce99"} Apr 20 07:50:19.709628 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:19.709598 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l8wjp" event={"ID":"a7425036-5ff0-42c9-9f51-3d27f67f9232","Type":"ContainerStarted","Data":"3b8ff00650b38c3df59afd51c30922ba93bc6730b15bd8bbfb40e2600a10b90c"} Apr 20 07:50:19.718217 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:19.718190 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" event={"ID":"f989907c-6b39-4f73-8cb2-9fb3915c446d","Type":"ContainerStarted","Data":"df2f33c445e01858322fd506bd9c46c217dc324fef1e8054cc2f42523d4cfe94"} Apr 20 07:50:19.725984 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:19.725957 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" event={"ID":"60a64748-7449-441c-8327-211d296e5ef6","Type":"ContainerStarted","Data":"5e538caf0e7c5680a3075b22c0a22bddc99fd12d4bd039d489f4365b225391db"} Apr 20 07:50:19.730315 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:19.730289 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lfk6z" event={"ID":"893e108a-cd48-4c06-80c6-167a8ad53ac2","Type":"ContainerStarted","Data":"353f953f69e9914f240659f500f11167abb962166c5a81837899806d8a1ec3d3"} Apr 20 07:50:20.174617 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:20.174577 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs\") pod \"network-metrics-daemon-brq5h\" (UID: \"07219834-44d6-42ab-9058-aed46274d1a8\") " pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:20.174792 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:20.174724 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:20.174792 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:20.174786 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs podName:07219834-44d6-42ab-9058-aed46274d1a8 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:22.174767899 +0000 UTC m=+6.088397937 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs") pod "network-metrics-daemon-brq5h" (UID: "07219834-44d6-42ab-9058-aed46274d1a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:20.376069 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:20.375984 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx9xg\" (UniqueName: \"kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg\") pod \"network-check-target-qr7pr\" (UID: \"3e84a5ff-c8e6-4c91-95b6-66697b65f3e6\") " pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:20.376261 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:20.376188 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:50:20.376261 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:20.376209 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:50:20.376261 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:20.376221 2572 projected.go:194] Error preparing data for projected volume kube-api-access-mx9xg for pod openshift-network-diagnostics/network-check-target-qr7pr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:20.376409 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:20.376282 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg podName:3e84a5ff-c8e6-4c91-95b6-66697b65f3e6 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:22.376263285 +0000 UTC m=+6.289893312 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-mx9xg" (UniqueName: "kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg") pod "network-check-target-qr7pr" (UID: "3e84a5ff-c8e6-4c91-95b6-66697b65f3e6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:20.643449 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:20.643378 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:20.643859 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:20.643513 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qr7pr" podUID="3e84a5ff-c8e6-4c91-95b6-66697b65f3e6" Apr 20 07:50:20.771364 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:20.771293 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" event={"ID":"17556b0047f1a6a15f4b7d5854560826","Type":"ContainerDied","Data":"14eef9a487db6c497f125a18ab6d2cfad7b96b966a6890a33a06554bd423beea"} Apr 20 07:50:20.772423 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:20.771858 2572 generic.go:358] "Generic (PLEG): container finished" podID="17556b0047f1a6a15f4b7d5854560826" containerID="14eef9a487db6c497f125a18ab6d2cfad7b96b966a6890a33a06554bd423beea" exitCode=0 Apr 20 07:50:20.785542 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:20.784177 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-161.ec2.internal" podStartSLOduration=2.784158195 podStartE2EDuration="2.784158195s" podCreationTimestamp="2026-04-20 07:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:50:19.680030897 +0000 UTC m=+3.593660943" watchObservedRunningTime="2026-04-20 07:50:20.784158195 +0000 UTC m=+4.697788239" Apr 20 07:50:21.641837 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:21.641793 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:21.642033 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:21.641956 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-brq5h" podUID="07219834-44d6-42ab-9058-aed46274d1a8" Apr 20 07:50:21.780510 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:21.780474 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" event={"ID":"17556b0047f1a6a15f4b7d5854560826","Type":"ContainerStarted","Data":"d5336462fef6f35e3c26dc0459a18152ffd3340d19d4e43eaf158387df6a5ef1"} Apr 20 07:50:22.190771 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:22.190730 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs\") pod \"network-metrics-daemon-brq5h\" (UID: \"07219834-44d6-42ab-9058-aed46274d1a8\") " pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:22.190963 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:22.190909 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:22.191065 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:22.190976 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs podName:07219834-44d6-42ab-9058-aed46274d1a8 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:26.190956245 +0000 UTC m=+10.104586270 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs") pod "network-metrics-daemon-brq5h" (UID: "07219834-44d6-42ab-9058-aed46274d1a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:22.392051 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:22.391997 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx9xg\" (UniqueName: \"kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg\") pod \"network-check-target-qr7pr\" (UID: \"3e84a5ff-c8e6-4c91-95b6-66697b65f3e6\") " pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:22.392233 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:22.392161 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:50:22.392233 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:22.392177 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:50:22.392233 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:22.392186 2572 projected.go:194] Error preparing data for projected volume kube-api-access-mx9xg for pod openshift-network-diagnostics/network-check-target-qr7pr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:22.392233 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:22.392232 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg podName:3e84a5ff-c8e6-4c91-95b6-66697b65f3e6 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:26.392218634 +0000 UTC m=+10.305848655 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-mx9xg" (UniqueName: "kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg") pod "network-check-target-qr7pr" (UID: "3e84a5ff-c8e6-4c91-95b6-66697b65f3e6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:22.642001 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:22.641921 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:22.642165 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:22.642042 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qr7pr" podUID="3e84a5ff-c8e6-4c91-95b6-66697b65f3e6" Apr 20 07:50:23.641968 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:23.641932 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:23.642502 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:23.642070 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-brq5h" podUID="07219834-44d6-42ab-9058-aed46274d1a8" Apr 20 07:50:24.644863 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:24.644831 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:24.645355 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:24.644948 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qr7pr" podUID="3e84a5ff-c8e6-4c91-95b6-66697b65f3e6" Apr 20 07:50:25.641437 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:25.641401 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:25.641612 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:25.641557 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-brq5h" podUID="07219834-44d6-42ab-9058-aed46274d1a8" Apr 20 07:50:26.227724 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:26.227683 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs\") pod \"network-metrics-daemon-brq5h\" (UID: \"07219834-44d6-42ab-9058-aed46274d1a8\") " pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:26.228193 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:26.227801 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:26.228193 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:26.227852 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs podName:07219834-44d6-42ab-9058-aed46274d1a8 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:34.227838521 +0000 UTC m=+18.141468545 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs") pod "network-metrics-daemon-brq5h" (UID: "07219834-44d6-42ab-9058-aed46274d1a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:26.429848 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:26.429802 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx9xg\" (UniqueName: \"kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg\") pod \"network-check-target-qr7pr\" (UID: \"3e84a5ff-c8e6-4c91-95b6-66697b65f3e6\") " pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:26.430011 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:26.429982 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:50:26.430011 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:26.430006 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:50:26.430124 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:26.430020 2572 projected.go:194] Error preparing data for projected volume kube-api-access-mx9xg for pod openshift-network-diagnostics/network-check-target-qr7pr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:26.430124 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:26.430084 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg podName:3e84a5ff-c8e6-4c91-95b6-66697b65f3e6 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:34.430065928 +0000 UTC m=+18.343695962 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-mx9xg" (UniqueName: "kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg") pod "network-check-target-qr7pr" (UID: "3e84a5ff-c8e6-4c91-95b6-66697b65f3e6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:26.644872 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:26.644841 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:26.645013 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:26.644962 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qr7pr" podUID="3e84a5ff-c8e6-4c91-95b6-66697b65f3e6" Apr 20 07:50:27.641307 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:27.641266 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:27.641772 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:27.641416 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-brq5h" podUID="07219834-44d6-42ab-9058-aed46274d1a8" Apr 20 07:50:28.644432 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:28.644354 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:28.644822 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:28.644451 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qr7pr" podUID="3e84a5ff-c8e6-4c91-95b6-66697b65f3e6" Apr 20 07:50:29.641271 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:29.641233 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:29.641429 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:29.641357 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-brq5h" podUID="07219834-44d6-42ab-9058-aed46274d1a8" Apr 20 07:50:30.644123 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:30.644096 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:30.644608 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:30.644216 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qr7pr" podUID="3e84a5ff-c8e6-4c91-95b6-66697b65f3e6" Apr 20 07:50:31.641082 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:31.641049 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:31.641347 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:31.641176 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-brq5h" podUID="07219834-44d6-42ab-9058-aed46274d1a8" Apr 20 07:50:32.641040 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:32.641003 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:32.641521 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:32.641135 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qr7pr" podUID="3e84a5ff-c8e6-4c91-95b6-66697b65f3e6" Apr 20 07:50:33.641158 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:33.641116 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:33.641644 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:33.641268 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-brq5h" podUID="07219834-44d6-42ab-9058-aed46274d1a8" Apr 20 07:50:34.288677 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:34.288643 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs\") pod \"network-metrics-daemon-brq5h\" (UID: \"07219834-44d6-42ab-9058-aed46274d1a8\") " pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:34.288848 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:34.288767 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:34.288848 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:34.288833 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs podName:07219834-44d6-42ab-9058-aed46274d1a8 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:50.288817142 +0000 UTC m=+34.202447163 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs") pod "network-metrics-daemon-brq5h" (UID: "07219834-44d6-42ab-9058-aed46274d1a8") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:50:34.489657 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:34.489620 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx9xg\" (UniqueName: \"kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg\") pod \"network-check-target-qr7pr\" (UID: \"3e84a5ff-c8e6-4c91-95b6-66697b65f3e6\") " pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:34.489813 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:34.489786 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:50:34.489813 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:34.489813 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:50:34.489884 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:34.489828 2572 projected.go:194] Error preparing data for projected volume kube-api-access-mx9xg for pod openshift-network-diagnostics/network-check-target-qr7pr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:34.489924 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:34.489890 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg podName:3e84a5ff-c8e6-4c91-95b6-66697b65f3e6 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:50.489873336 +0000 UTC m=+34.403503380 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-mx9xg" (UniqueName: "kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg") pod "network-check-target-qr7pr" (UID: "3e84a5ff-c8e6-4c91-95b6-66697b65f3e6") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:50:34.641037 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:34.640963 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:34.641222 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:34.641091 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qr7pr" podUID="3e84a5ff-c8e6-4c91-95b6-66697b65f3e6" Apr 20 07:50:35.641860 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:35.641826 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:35.642260 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:35.641944 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-brq5h" podUID="07219834-44d6-42ab-9058-aed46274d1a8" Apr 20 07:50:36.642087 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:36.641892 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:36.642515 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:36.642211 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qr7pr" podUID="3e84a5ff-c8e6-4c91-95b6-66697b65f3e6" Apr 20 07:50:36.807999 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:36.807971 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lfk6z" event={"ID":"893e108a-cd48-4c06-80c6-167a8ad53ac2","Type":"ContainerStarted","Data":"8d0f3951cd892cb23d368732455d53ab8ca3968d6b2405fb165479e1e4970159"} Apr 20 07:50:36.809093 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:36.809061 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-45gfj" event={"ID":"38e938ac-334a-46a9-bd54-099927b87530","Type":"ContainerStarted","Data":"552c3595770a971cfa7a2fbf479553ffa05a502067bc428e2895b05e16e71cb3"} Apr 20 07:50:36.810276 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:36.810257 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" event={"ID":"ef399f99-581e-48a8-a30b-557c95337f8e","Type":"ContainerStarted","Data":"005bf81d6664f9ba3916e3ad0aa7c2de18422942c4b3934cf8ec2022ab8d5f0e"} Apr 20 07:50:36.811489 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:36.811455 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-fcfwc" event={"ID":"4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f","Type":"ContainerStarted","Data":"15009fc59f12bcadee113675b0a28d324bba631f454993c4812b5c0cb6762c1d"} Apr 20 07:50:36.812709 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:36.812690 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xsjgg" event={"ID":"edf19122-ee32-4e12-a720-45239728231d","Type":"ContainerStarted","Data":"7d6629837429bc190d1d50344185f40dda66f27da67d52b96ac6cbadd3f48cba"} Apr 20 07:50:36.813866 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:36.813845 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l8wjp" event={"ID":"a7425036-5ff0-42c9-9f51-3d27f67f9232","Type":"ContainerStarted","Data":"d1ebe8ee5f8017e89bfac1d3e0b09f84def57d0be683700c52a1da6f0aff3541"} Apr 20 07:50:36.815312 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:36.815296 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 07:50:36.815580 
ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:36.815560 2572 generic.go:358] "Generic (PLEG): container finished" podID="f989907c-6b39-4f73-8cb2-9fb3915c446d" containerID="63f33c0d92d30a20771be62ddabc5e6228667a5a55df1d3fd6db264e674eb1df" exitCode=1 Apr 20 07:50:36.815660 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:36.815613 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" event={"ID":"f989907c-6b39-4f73-8cb2-9fb3915c446d","Type":"ContainerStarted","Data":"e258cc5d2b82779bbd2723987ea90678ffa27a294bcf2a9b8fc18e1bc52aac23"} Apr 20 07:50:36.815660 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:36.815630 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" event={"ID":"f989907c-6b39-4f73-8cb2-9fb3915c446d","Type":"ContainerDied","Data":"63f33c0d92d30a20771be62ddabc5e6228667a5a55df1d3fd6db264e674eb1df"} Apr 20 07:50:36.815660 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:36.815642 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" event={"ID":"f989907c-6b39-4f73-8cb2-9fb3915c446d","Type":"ContainerStarted","Data":"cafe7ec81d6a3f31a244646ba2d032ff693ed1bf527d890624ea0fb7f2b6595e"} Apr 20 07:50:36.816640 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:36.816623 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" event={"ID":"60a64748-7449-441c-8327-211d296e5ef6","Type":"ContainerStarted","Data":"f2ce4a3e6d7079ff066da5a8f55ce716ac4e98d9f48eaa711689c49954591267"} Apr 20 07:50:36.835711 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:36.835672 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-161.ec2.internal" podStartSLOduration=18.835660057 podStartE2EDuration="18.835660057s" podCreationTimestamp="2026-04-20 07:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:50:21.794288004 +0000 UTC m=+5.707918049" watchObservedRunningTime="2026-04-20 07:50:36.835660057 +0000 UTC m=+20.749290135" Apr 20 07:50:36.851671 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:36.851589 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-l8wjp" podStartSLOduration=3.831767111 podStartE2EDuration="20.8515738s" podCreationTimestamp="2026-04-20 07:50:16 +0000 UTC" firstStartedPulling="2026-04-20 07:50:19.248154905 +0000 UTC m=+3.161784939" lastFinishedPulling="2026-04-20 07:50:36.267961595 +0000 UTC m=+20.181591628" observedRunningTime="2026-04-20 07:50:36.851430701 +0000 UTC m=+20.765060745" watchObservedRunningTime="2026-04-20 07:50:36.8515738 +0000 UTC m=+20.765204201" Apr 20 07:50:36.882888 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:36.882742 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-vnqcw" podStartSLOduration=3.835818152 podStartE2EDuration="20.882727463s" podCreationTimestamp="2026-04-20 07:50:16 +0000 UTC" firstStartedPulling="2026-04-20 07:50:19.248091492 +0000 UTC m=+3.161721516" lastFinishedPulling="2026-04-20 07:50:36.295000803 +0000 UTC m=+20.208630827" observedRunningTime="2026-04-20 07:50:36.882189492 +0000 UTC m=+20.795819536" watchObservedRunningTime="2026-04-20 07:50:36.882727463 +0000 UTC m=+20.796357506" Apr 20 07:50:36.883018 
ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:36.882967 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xsjgg" podStartSLOduration=3.8253565739999997 podStartE2EDuration="20.882961577s" podCreationTimestamp="2026-04-20 07:50:16 +0000 UTC" firstStartedPulling="2026-04-20 07:50:19.24777188 +0000 UTC m=+3.161401913" lastFinishedPulling="2026-04-20 07:50:36.305376895 +0000 UTC m=+20.219006916" observedRunningTime="2026-04-20 07:50:36.867962431 +0000 UTC m=+20.781592473" watchObservedRunningTime="2026-04-20 07:50:36.882961577 +0000 UTC m=+20.796591620" Apr 20 07:50:36.899893 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:36.899849 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-45gfj" podStartSLOduration=3.855021765 podStartE2EDuration="20.899836787s" podCreationTimestamp="2026-04-20 07:50:16 +0000 UTC" firstStartedPulling="2026-04-20 07:50:19.247856782 +0000 UTC m=+3.161486802" lastFinishedPulling="2026-04-20 07:50:36.292671789 +0000 UTC m=+20.206301824" observedRunningTime="2026-04-20 07:50:36.899627837 +0000 UTC m=+20.813257885" watchObservedRunningTime="2026-04-20 07:50:36.899836787 +0000 UTC m=+20.813466829" Apr 20 07:50:36.916076 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:36.916032 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-fcfwc" podStartSLOduration=11.874867257 podStartE2EDuration="20.916016656s" podCreationTimestamp="2026-04-20 07:50:16 +0000 UTC" firstStartedPulling="2026-04-20 07:50:19.243882855 +0000 UTC m=+3.157512880" lastFinishedPulling="2026-04-20 07:50:28.285032241 +0000 UTC m=+12.198662279" observedRunningTime="2026-04-20 07:50:36.915396568 +0000 UTC m=+20.829026612" watchObservedRunningTime="2026-04-20 07:50:36.916016656 +0000 UTC m=+20.829646697" Apr 20 07:50:37.642010 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:37.641941 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:37.642161 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:37.642056 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-brq5h" podUID="07219834-44d6-42ab-9058-aed46274d1a8" Apr 20 07:50:37.820133 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:37.820107 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 07:50:37.820532 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:37.820504 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" event={"ID":"f989907c-6b39-4f73-8cb2-9fb3915c446d","Type":"ContainerStarted","Data":"27f07a42322a6f9ea504997a5281b398fa57d412818b6e7b15251a7b8a367943"} Apr 20 07:50:37.820631 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:37.820544 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" event={"ID":"f989907c-6b39-4f73-8cb2-9fb3915c446d","Type":"ContainerStarted","Data":"52f71b5a52d14dc799f4cd265faa2c96bd9c149e2c1ec551d4590d729f6b9acf"} Apr 20 07:50:37.820631 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:37.820557 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" event={"ID":"f989907c-6b39-4f73-8cb2-9fb3915c446d","Type":"ContainerStarted","Data":"214baf055efde6e976d87c4a7c1907d97ecf6a99efff0b1f10f375b72835423b"} Apr 20 07:50:37.821710 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:37.821687 2572 generic.go:358] "Generic (PLEG): container finished" podID="893e108a-cd48-4c06-80c6-167a8ad53ac2" containerID="8d0f3951cd892cb23d368732455d53ab8ca3968d6b2405fb165479e1e4970159" exitCode=0 Apr 20 07:50:37.821836 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:37.821751 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lfk6z" event={"ID":"893e108a-cd48-4c06-80c6-167a8ad53ac2","Type":"ContainerDied","Data":"8d0f3951cd892cb23d368732455d53ab8ca3968d6b2405fb165479e1e4970159"} Apr 20 07:50:37.823008 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:37.822975 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-ng5xl" event={"ID":"2fdddb28-e042-4361-94c8-ed537e5237f2","Type":"ContainerStarted","Data":"9c888d69876a14456e3fba331379556e6c2be500ea7c38c91667f91eca5740ea"} Apr 20 07:50:37.855054 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:37.855011 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-ng5xl" podStartSLOduration=4.825637589 podStartE2EDuration="21.854992162s" podCreationTimestamp="2026-04-20 07:50:16 +0000 UTC" firstStartedPulling="2026-04-20 07:50:19.238640331 +0000 UTC m=+3.152270373" lastFinishedPulling="2026-04-20 07:50:36.267994918 +0000 UTC m=+20.181624946" observedRunningTime="2026-04-20 07:50:37.854868464 +0000 UTC m=+21.768498508" watchObservedRunningTime="2026-04-20 07:50:37.854992162 +0000 UTC m=+21.768622204" Apr 20 07:50:38.105228 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:38.105204 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 07:50:38.614197 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:38.614026 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T07:50:38.10522355Z","UUID":"2cc61122-07d0-41a8-8144-6924372168de","Handler":null,"Name":"","Endpoint":""} Apr 20 07:50:38.616266 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:38.616244 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 07:50:38.616379 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:38.616274 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 07:50:38.641285 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:38.641242 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:38.641448 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:38.641359 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qr7pr" podUID="3e84a5ff-c8e6-4c91-95b6-66697b65f3e6" Apr 20 07:50:38.827212 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:38.827172 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" event={"ID":"60a64748-7449-441c-8327-211d296e5ef6","Type":"ContainerStarted","Data":"429db160e005a562073c9f9e144076f57e9655ba9ce4a8b4dbb994439099778e"} Apr 20 07:50:39.641911 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:39.641676 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:39.642069 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:39.642029 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-brq5h" podUID="07219834-44d6-42ab-9058-aed46274d1a8" Apr 20 07:50:39.832722 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:39.832697 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 07:50:39.833156 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:39.833033 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" event={"ID":"f989907c-6b39-4f73-8cb2-9fb3915c446d","Type":"ContainerStarted","Data":"755795081ee875260a888f9d7f5897a6b69fe0fbc2f041250ade9af5a47a4744"} Apr 20 07:50:39.834906 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:39.834880 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" event={"ID":"60a64748-7449-441c-8327-211d296e5ef6","Type":"ContainerStarted","Data":"6c5be461e483ac8ac7b5b1c5abe722dd26783b924373c0d89e610890599f3cdc"} Apr 20 07:50:39.852391 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:39.852345 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-rl8rl" podStartSLOduration=3.734622496 podStartE2EDuration="23.852330439s" podCreationTimestamp="2026-04-20 07:50:16 +0000 UTC" firstStartedPulling="2026-04-20 07:50:19.239872622 +0000 UTC m=+3.153502642" lastFinishedPulling="2026-04-20 07:50:39.357580561 +0000 UTC m=+23.271210585" observedRunningTime="2026-04-20 07:50:39.850791155 +0000 UTC m=+23.764421202" watchObservedRunningTime="2026-04-20 07:50:39.852330439 +0000 UTC m=+23.765960482" Apr 20 07:50:40.641567 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:40.641538 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:40.641770 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:40.641649 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qr7pr" podUID="3e84a5ff-c8e6-4c91-95b6-66697b65f3e6" Apr 20 07:50:41.641002 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:41.640974 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:41.641562 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:41.641093 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-brq5h" podUID="07219834-44d6-42ab-9058-aed46274d1a8" Apr 20 07:50:41.688953 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:41.688912 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-l8wjp" Apr 20 07:50:41.689610 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:41.689589 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-l8wjp" Apr 20 07:50:41.839108 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:41.839040 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-l8wjp" Apr 20 07:50:41.839624 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:41.839607 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-l8wjp" Apr 20 07:50:42.642201 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:42.641991 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:42.642605 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:42.642261 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qr7pr" podUID="3e84a5ff-c8e6-4c91-95b6-66697b65f3e6" Apr 20 07:50:42.843792 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:42.843765 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 07:50:42.844077 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:42.844054 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" event={"ID":"f989907c-6b39-4f73-8cb2-9fb3915c446d","Type":"ContainerStarted","Data":"e900c00b74bb80b5eb7151f748e7a2667eb0457586c9d64da7fd84e3501ee9d3"} Apr 20 07:50:42.844404 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:42.844386 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:42.844547 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:42.844532 2572 scope.go:117] "RemoveContainer" containerID="63f33c0d92d30a20771be62ddabc5e6228667a5a55df1d3fd6db264e674eb1df" Apr 20 07:50:42.845686 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:42.845666 2572 generic.go:358] "Generic (PLEG): container finished" podID="893e108a-cd48-4c06-80c6-167a8ad53ac2" containerID="5972a436195fb673c797843db8bd35cb12d76277ee3ee5f2ba1e51744cb9f5f0" exitCode=0 Apr 20 07:50:42.845782 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:42.845753 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lfk6z" event={"ID":"893e108a-cd48-4c06-80c6-167a8ad53ac2","Type":"ContainerDied","Data":"5972a436195fb673c797843db8bd35cb12d76277ee3ee5f2ba1e51744cb9f5f0"} Apr 20 07:50:42.859273 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:42.859252 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:43.641680 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:43.641650 2572 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:43.641851 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:43.641787 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-brq5h" podUID="07219834-44d6-42ab-9058-aed46274d1a8" Apr 20 07:50:43.851945 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:43.851880 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 07:50:43.852406 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:43.852383 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" event={"ID":"f989907c-6b39-4f73-8cb2-9fb3915c446d","Type":"ContainerStarted","Data":"b1f9dd85136ef54a63a22178587e70c5cadc45b1db534315d4865b04b190201b"} Apr 20 07:50:43.852914 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:43.852890 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:43.853017 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:43.852920 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:43.869365 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:43.869343 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:50:43.888286 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:43.888242 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" podStartSLOduration=10.782413671 podStartE2EDuration="27.888228272s" podCreationTimestamp="2026-04-20 07:50:16 +0000 UTC" firstStartedPulling="2026-04-20 07:50:19.242602827 +0000 UTC m=+3.156232847" lastFinishedPulling="2026-04-20 07:50:36.348417411 +0000 UTC m=+20.262047448" observedRunningTime="2026-04-20 07:50:43.887641058 +0000 UTC m=+27.801271104" watchObservedRunningTime="2026-04-20 07:50:43.888228272 +0000 UTC m=+27.801858315" Apr 20 07:50:44.115261 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:44.115227 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qr7pr"] Apr 20 07:50:44.115419 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:44.115361 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:44.115492 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:44.115462 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qr7pr" podUID="3e84a5ff-c8e6-4c91-95b6-66697b65f3e6" Apr 20 07:50:44.118265 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:44.118243 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-brq5h"] Apr 20 07:50:44.118342 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:44.118321 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:44.118419 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:44.118400 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-brq5h" podUID="07219834-44d6-42ab-9058-aed46274d1a8" Apr 20 07:50:44.856497 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:44.856465 2572 generic.go:358] "Generic (PLEG): container finished" podID="893e108a-cd48-4c06-80c6-167a8ad53ac2" containerID="274d3ff0df90ae7832125ef57d16dff935258f6a4d23c7e56ac0afa34686c487" exitCode=0 Apr 20 07:50:44.857022 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:44.856679 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lfk6z" event={"ID":"893e108a-cd48-4c06-80c6-167a8ad53ac2","Type":"ContainerDied","Data":"274d3ff0df90ae7832125ef57d16dff935258f6a4d23c7e56ac0afa34686c487"} Apr 20 07:50:45.641386 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:45.641201 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:45.641543 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:45.641257 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:45.641543 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:45.641472 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-brq5h" podUID="07219834-44d6-42ab-9058-aed46274d1a8" Apr 20 07:50:45.641623 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:45.641542 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qr7pr" podUID="3e84a5ff-c8e6-4c91-95b6-66697b65f3e6" Apr 20 07:50:46.862284 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:46.862251 2572 generic.go:358] "Generic (PLEG): container finished" podID="893e108a-cd48-4c06-80c6-167a8ad53ac2" containerID="baef7c7018c77f30547318959ed33d1d04288fb986632a479e00cd7b38e109f2" exitCode=0 Apr 20 07:50:46.862775 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:46.862296 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lfk6z" event={"ID":"893e108a-cd48-4c06-80c6-167a8ad53ac2","Type":"ContainerDied","Data":"baef7c7018c77f30547318959ed33d1d04288fb986632a479e00cd7b38e109f2"} Apr 20 07:50:47.641576 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:47.641539 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:47.641763 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:47.641540 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:47.641763 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:47.641657 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-brq5h" podUID="07219834-44d6-42ab-9058-aed46274d1a8" Apr 20 07:50:47.641763 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:47.641723 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qr7pr" podUID="3e84a5ff-c8e6-4c91-95b6-66697b65f3e6" Apr 20 07:50:49.458392 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.458309 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-161.ec2.internal" event="NodeReady" Apr 20 07:50:49.458758 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.458452 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 07:50:49.499409 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.499377 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7fps4"] Apr 20 07:50:49.520507 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.520476 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9bxrl"] Apr 20 07:50:49.520681 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.520662 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7fps4" Apr 20 07:50:49.523393 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.523202 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 07:50:49.523393 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.523382 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 07:50:49.523594 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.523501 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dvf6t\"" Apr 20 07:50:49.536041 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.536019 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7fps4"] Apr 20 07:50:49.536041 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.536043 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9bxrl"] Apr 20 07:50:49.536221 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.536135 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9bxrl" Apr 20 07:50:49.538823 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.538802 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 07:50:49.538962 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.538843 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 07:50:49.539327 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.539309 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mxq77\"" Apr 20 07:50:49.539443 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.539378 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 07:50:49.641292 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.641262 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:49.641487 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.641275 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:49.644377 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.644173 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-x6tc5\"" Apr 20 07:50:49.644377 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.644238 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 07:50:49.644377 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.644291 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 07:50:49.644377 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.644327 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 07:50:49.644678 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.644584 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2g69h\"" Apr 20 07:50:49.708196 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.708159 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert\") pod \"ingress-canary-9bxrl\" (UID: \"b659a68e-b039-4864-b691-ff12b7393ed7\") " pod="openshift-ingress-canary/ingress-canary-9bxrl" Apr 20 07:50:49.708353 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.708220 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl5rs\" (UniqueName: \"kubernetes.io/projected/b659a68e-b039-4864-b691-ff12b7393ed7-kube-api-access-tl5rs\") pod \"ingress-canary-9bxrl\" (UID: \"b659a68e-b039-4864-b691-ff12b7393ed7\") " pod="openshift-ingress-canary/ingress-canary-9bxrl" Apr 20 07:50:49.708353 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.708273 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q45c\" (UniqueName: \"kubernetes.io/projected/0993c493-f978-431e-9000-290ab9fb0bbe-kube-api-access-7q45c\") pod \"dns-default-7fps4\" (UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:50:49.708353 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.708304 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0993c493-f978-431e-9000-290ab9fb0bbe-tmp-dir\") pod \"dns-default-7fps4\" (UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:50:49.708353 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.708332 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls\") pod \"dns-default-7fps4\" (UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:50:49.708589 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.708379 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0993c493-f978-431e-9000-290ab9fb0bbe-config-volume\") pod \"dns-default-7fps4\" 
(UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:50:49.809546 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.809507 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert\") pod \"ingress-canary-9bxrl\" (UID: \"b659a68e-b039-4864-b691-ff12b7393ed7\") " pod="openshift-ingress-canary/ingress-canary-9bxrl" Apr 20 07:50:49.810511 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.809992 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tl5rs\" (UniqueName: \"kubernetes.io/projected/b659a68e-b039-4864-b691-ff12b7393ed7-kube-api-access-tl5rs\") pod \"ingress-canary-9bxrl\" (UID: \"b659a68e-b039-4864-b691-ff12b7393ed7\") " pod="openshift-ingress-canary/ingress-canary-9bxrl" Apr 20 07:50:49.810511 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:49.810011 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:50:49.810511 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:49.810101 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert podName:b659a68e-b039-4864-b691-ff12b7393ed7 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:50.310079126 +0000 UTC m=+34.223709162 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert") pod "ingress-canary-9bxrl" (UID: "b659a68e-b039-4864-b691-ff12b7393ed7") : secret "canary-serving-cert" not found Apr 20 07:50:49.810511 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.810174 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7q45c\" (UniqueName: \"kubernetes.io/projected/0993c493-f978-431e-9000-290ab9fb0bbe-kube-api-access-7q45c\") pod \"dns-default-7fps4\" (UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:50:49.810511 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.810219 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0993c493-f978-431e-9000-290ab9fb0bbe-tmp-dir\") pod \"dns-default-7fps4\" (UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:50:49.810511 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.810294 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls\") pod \"dns-default-7fps4\" (UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:50:49.810511 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.810346 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0993c493-f978-431e-9000-290ab9fb0bbe-config-volume\") pod \"dns-default-7fps4\" (UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:50:49.810905 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:49.810824 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:50:49.810955 ip-10-0-133-161 kubenswrapper[2572]: I0420 
07:50:49.810916 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0993c493-f978-431e-9000-290ab9fb0bbe-config-volume\") pod \"dns-default-7fps4\" (UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:50:49.811290 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:49.811275 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls podName:0993c493-f978-431e-9000-290ab9fb0bbe nodeName:}" failed. No retries permitted until 2026-04-20 07:50:50.311254018 +0000 UTC m=+34.224884039 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls") pod "dns-default-7fps4" (UID: "0993c493-f978-431e-9000-290ab9fb0bbe") : secret "dns-default-metrics-tls" not found Apr 20 07:50:49.817001 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.816975 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0993c493-f978-431e-9000-290ab9fb0bbe-tmp-dir\") pod \"dns-default-7fps4\" (UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:50:49.823312 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.823288 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q45c\" (UniqueName: \"kubernetes.io/projected/0993c493-f978-431e-9000-290ab9fb0bbe-kube-api-access-7q45c\") pod \"dns-default-7fps4\" (UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:50:49.823400 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:49.823385 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl5rs\" (UniqueName: \"kubernetes.io/projected/b659a68e-b039-4864-b691-ff12b7393ed7-kube-api-access-tl5rs\") pod \"ingress-canary-9bxrl\" (UID: \"b659a68e-b039-4864-b691-ff12b7393ed7\") " pod="openshift-ingress-canary/ingress-canary-9bxrl" Apr 20 07:50:50.314489 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:50.314454 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert\") pod \"ingress-canary-9bxrl\" (UID: \"b659a68e-b039-4864-b691-ff12b7393ed7\") " pod="openshift-ingress-canary/ingress-canary-9bxrl" Apr 20 07:50:50.314695 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:50.314509 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs\") pod \"network-metrics-daemon-brq5h\" (UID: \"07219834-44d6-42ab-9058-aed46274d1a8\") " pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:50:50.314695 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:50.314546 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls\") pod \"dns-default-7fps4\" (UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:50:50.314695 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:50.314632 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:50:50.314695 
ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:50.314672 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:50:50.314917 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:50.314714 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert podName:b659a68e-b039-4864-b691-ff12b7393ed7 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:51.31469835 +0000 UTC m=+35.228328370 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert") pod "ingress-canary-9bxrl" (UID: "b659a68e-b039-4864-b691-ff12b7393ed7") : secret "canary-serving-cert" not found Apr 20 07:50:50.314917 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:50.314730 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls podName:0993c493-f978-431e-9000-290ab9fb0bbe nodeName:}" failed. No retries permitted until 2026-04-20 07:50:51.314724391 +0000 UTC m=+35.228354411 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls") pod "dns-default-7fps4" (UID: "0993c493-f978-431e-9000-290ab9fb0bbe") : secret "dns-default-metrics-tls" not found Apr 20 07:50:50.314917 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:50.314759 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 07:50:50.314917 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:50.314803 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs podName:07219834-44d6-42ab-9058-aed46274d1a8 nodeName:}" failed. No retries permitted until 2026-04-20 07:51:22.314784586 +0000 UTC m=+66.228414809 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs") pod "network-metrics-daemon-brq5h" (UID: "07219834-44d6-42ab-9058-aed46274d1a8") : secret "metrics-daemon-secret" not found Apr 20 07:50:50.516773 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:50.516725 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mx9xg\" (UniqueName: \"kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg\") pod \"network-check-target-qr7pr\" (UID: \"3e84a5ff-c8e6-4c91-95b6-66697b65f3e6\") " pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:50.519193 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:50.519170 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx9xg\" (UniqueName: \"kubernetes.io/projected/3e84a5ff-c8e6-4c91-95b6-66697b65f3e6-kube-api-access-mx9xg\") pod \"network-check-target-qr7pr\" (UID: \"3e84a5ff-c8e6-4c91-95b6-66697b65f3e6\") " pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:50.552991 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:50.552952 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:51.321873 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:51.321835 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert\") pod \"ingress-canary-9bxrl\" (UID: \"b659a68e-b039-4864-b691-ff12b7393ed7\") " pod="openshift-ingress-canary/ingress-canary-9bxrl" Apr 20 07:50:51.322183 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:51.321996 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls\") pod \"dns-default-7fps4\" (UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:50:51.322183 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:51.322007 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:50:51.322183 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:51.322099 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert podName:b659a68e-b039-4864-b691-ff12b7393ed7 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:53.322077875 +0000 UTC m=+37.235707899 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert") pod "ingress-canary-9bxrl" (UID: "b659a68e-b039-4864-b691-ff12b7393ed7") : secret "canary-serving-cert" not found Apr 20 07:50:51.322183 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:51.322119 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:50:51.322183 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:51.322178 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls podName:0993c493-f978-431e-9000-290ab9fb0bbe nodeName:}" failed. No retries permitted until 2026-04-20 07:50:53.322162673 +0000 UTC m=+37.235792701 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls") pod "dns-default-7fps4" (UID: "0993c493-f978-431e-9000-290ab9fb0bbe") : secret "dns-default-metrics-tls" not found Apr 20 07:50:52.733110 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:52.732870 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qr7pr"] Apr 20 07:50:52.784391 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:50:52.784353 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e84a5ff_c8e6_4c91_95b6_66697b65f3e6.slice/crio-ddf4dfe98c6e95167f6e0bc2f5f4c5bc2f881822409387f2318104949b5c1c31 WatchSource:0}: Error finding container ddf4dfe98c6e95167f6e0bc2f5f4c5bc2f881822409387f2318104949b5c1c31: Status 404 returned error can't find the container with id ddf4dfe98c6e95167f6e0bc2f5f4c5bc2f881822409387f2318104949b5c1c31 Apr 20 07:50:52.875702 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:52.875596 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qr7pr" event={"ID":"3e84a5ff-c8e6-4c91-95b6-66697b65f3e6","Type":"ContainerStarted","Data":"ddf4dfe98c6e95167f6e0bc2f5f4c5bc2f881822409387f2318104949b5c1c31"} Apr 20 07:50:53.335383 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:53.335350 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert\") pod \"ingress-canary-9bxrl\" (UID: \"b659a68e-b039-4864-b691-ff12b7393ed7\") " pod="openshift-ingress-canary/ingress-canary-9bxrl" Apr 20 07:50:53.335562 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:53.335407 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls\") pod \"dns-default-7fps4\" (UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:50:53.335562 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:53.335513 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:50:53.335663 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:53.335577 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert podName:b659a68e-b039-4864-b691-ff12b7393ed7 nodeName:}" failed. No retries permitted until 2026-04-20 07:50:57.335562682 +0000 UTC m=+41.249192707 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert") pod "ingress-canary-9bxrl" (UID: "b659a68e-b039-4864-b691-ff12b7393ed7") : secret "canary-serving-cert" not found Apr 20 07:50:53.335663 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:53.335513 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:50:53.335663 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:53.335639 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls podName:0993c493-f978-431e-9000-290ab9fb0bbe nodeName:}" failed. 
No retries permitted until 2026-04-20 07:50:57.335627683 +0000 UTC m=+41.249257709 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls") pod "dns-default-7fps4" (UID: "0993c493-f978-431e-9000-290ab9fb0bbe") : secret "dns-default-metrics-tls" not found Apr 20 07:50:53.880623 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:53.880588 2572 generic.go:358] "Generic (PLEG): container finished" podID="893e108a-cd48-4c06-80c6-167a8ad53ac2" containerID="2fdf4849292bd2e048a225ba0080a66b1117582c8e49b97427ea054feabd3bc4" exitCode=0 Apr 20 07:50:53.881070 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:53.880654 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lfk6z" event={"ID":"893e108a-cd48-4c06-80c6-167a8ad53ac2","Type":"ContainerDied","Data":"2fdf4849292bd2e048a225ba0080a66b1117582c8e49b97427ea054feabd3bc4"} Apr 20 07:50:54.886487 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:54.886447 2572 generic.go:358] "Generic (PLEG): container finished" podID="893e108a-cd48-4c06-80c6-167a8ad53ac2" containerID="25670353a954da82e4497643a516ead14280ad81f2d7f8631ed178be1fa27567" exitCode=0 Apr 20 07:50:54.886855 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:54.886511 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lfk6z" event={"ID":"893e108a-cd48-4c06-80c6-167a8ad53ac2","Type":"ContainerDied","Data":"25670353a954da82e4497643a516ead14280ad81f2d7f8631ed178be1fa27567"} Apr 20 07:50:55.891153 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:55.891106 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lfk6z" event={"ID":"893e108a-cd48-4c06-80c6-167a8ad53ac2","Type":"ContainerStarted","Data":"0dfac724f80b84c3c11623d7573172e6e2922ee4883e0864989c58b3c7c4c653"} Apr 20 07:50:55.892302 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:55.892276 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qr7pr" event={"ID":"3e84a5ff-c8e6-4c91-95b6-66697b65f3e6","Type":"ContainerStarted","Data":"5f9d0773ba3759a2073ff3c2e8781aa505168ae5bdd1b9c3b582d8fc09600c21"} Apr 20 07:50:55.892422 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:55.892403 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:50:55.912926 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:55.912877 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-lfk6z" podStartSLOduration=6.334730551 podStartE2EDuration="39.912865783s" podCreationTimestamp="2026-04-20 07:50:16 +0000 UTC" firstStartedPulling="2026-04-20 07:50:19.240088309 +0000 UTC m=+3.153718330" lastFinishedPulling="2026-04-20 07:50:52.818223541 +0000 UTC m=+36.731853562" observedRunningTime="2026-04-20 07:50:55.911600399 +0000 UTC m=+39.825230442" watchObservedRunningTime="2026-04-20 07:50:55.912865783 +0000 UTC m=+39.826495826" Apr 20 07:50:55.927326 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:55.927288 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-qr7pr" podStartSLOduration=37.234758913 podStartE2EDuration="39.927276617s" podCreationTimestamp="2026-04-20 07:50:16 +0000 UTC" firstStartedPulling="2026-04-20 
07:50:52.79592306 +0000 UTC m=+36.709553080" lastFinishedPulling="2026-04-20 07:50:55.488440761 +0000 UTC m=+39.402070784" observedRunningTime="2026-04-20 07:50:55.927210863 +0000 UTC m=+39.840840903" watchObservedRunningTime="2026-04-20 07:50:55.927276617 +0000 UTC m=+39.840906682" Apr 20 07:50:57.362964 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:57.362927 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls\") pod \"dns-default-7fps4\" (UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:50:57.363380 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:50:57.362978 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert\") pod \"ingress-canary-9bxrl\" (UID: \"b659a68e-b039-4864-b691-ff12b7393ed7\") " pod="openshift-ingress-canary/ingress-canary-9bxrl" Apr 20 07:50:57.363380 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:57.363097 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:50:57.363380 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:57.363112 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:50:57.363380 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:57.363175 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert podName:b659a68e-b039-4864-b691-ff12b7393ed7 nodeName:}" failed. No retries permitted until 2026-04-20 07:51:05.363154944 +0000 UTC m=+49.276784979 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert") pod "ingress-canary-9bxrl" (UID: "b659a68e-b039-4864-b691-ff12b7393ed7") : secret "canary-serving-cert" not found Apr 20 07:50:57.363380 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:50:57.363189 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls podName:0993c493-f978-431e-9000-290ab9fb0bbe nodeName:}" failed. No retries permitted until 2026-04-20 07:51:05.363183083 +0000 UTC m=+49.276813104 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls") pod "dns-default-7fps4" (UID: "0993c493-f978-431e-9000-290ab9fb0bbe") : secret "dns-default-metrics-tls" not found Apr 20 07:51:05.414762 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:51:05.414719 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert\") pod \"ingress-canary-9bxrl\" (UID: \"b659a68e-b039-4864-b691-ff12b7393ed7\") " pod="openshift-ingress-canary/ingress-canary-9bxrl" Apr 20 07:51:05.415262 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:51:05.414793 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls\") pod \"dns-default-7fps4\" (UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:51:05.415262 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:51:05.414886 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:51:05.415262 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:51:05.414908 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:51:05.415262 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:51:05.414957 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert podName:b659a68e-b039-4864-b691-ff12b7393ed7 nodeName:}" failed. No retries permitted until 2026-04-20 07:51:21.414942383 +0000 UTC m=+65.328572409 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert") pod "ingress-canary-9bxrl" (UID: "b659a68e-b039-4864-b691-ff12b7393ed7") : secret "canary-serving-cert" not found Apr 20 07:51:05.415262 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:51:05.414972 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls podName:0993c493-f978-431e-9000-290ab9fb0bbe nodeName:}" failed. No retries permitted until 2026-04-20 07:51:21.414966715 +0000 UTC m=+65.328596736 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls") pod "dns-default-7fps4" (UID: "0993c493-f978-431e-9000-290ab9fb0bbe") : secret "dns-default-metrics-tls" not found Apr 20 07:51:15.875375 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:51:15.875347 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-55zsn" Apr 20 07:51:21.425572 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:51:21.425517 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls\") pod \"dns-default-7fps4\" (UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:51:21.425572 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:51:21.425587 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert\") pod \"ingress-canary-9bxrl\" (UID: \"b659a68e-b039-4864-b691-ff12b7393ed7\") " pod="openshift-ingress-canary/ingress-canary-9bxrl" Apr 20 07:51:21.425997 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:51:21.425668 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:51:21.425997 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:51:21.425754 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls podName:0993c493-f978-431e-9000-290ab9fb0bbe nodeName:}" failed. No retries permitted until 2026-04-20 07:51:53.425736822 +0000 UTC m=+97.339366844 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls") pod "dns-default-7fps4" (UID: "0993c493-f978-431e-9000-290ab9fb0bbe") : secret "dns-default-metrics-tls" not found Apr 20 07:51:21.425997 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:51:21.425673 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:51:21.425997 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:51:21.425809 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert podName:b659a68e-b039-4864-b691-ff12b7393ed7 nodeName:}" failed. No retries permitted until 2026-04-20 07:51:53.425797894 +0000 UTC m=+97.339427915 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert") pod "ingress-canary-9bxrl" (UID: "b659a68e-b039-4864-b691-ff12b7393ed7") : secret "canary-serving-cert" not found Apr 20 07:51:22.332510 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:51:22.332454 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs\") pod \"network-metrics-daemon-brq5h\" (UID: \"07219834-44d6-42ab-9058-aed46274d1a8\") " pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:51:22.332695 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:51:22.332602 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 07:51:22.332695 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:51:22.332663 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs podName:07219834-44d6-42ab-9058-aed46274d1a8 nodeName:}" failed. No retries permitted until 2026-04-20 07:52:26.332648547 +0000 UTC m=+130.246278574 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs") pod "network-metrics-daemon-brq5h" (UID: "07219834-44d6-42ab-9058-aed46274d1a8") : secret "metrics-daemon-secret" not found Apr 20 07:51:26.896810 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:51:26.896781 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qr7pr" Apr 20 07:51:53.445374 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:51:53.445241 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert\") pod \"ingress-canary-9bxrl\" (UID: \"b659a68e-b039-4864-b691-ff12b7393ed7\") " pod="openshift-ingress-canary/ingress-canary-9bxrl" Apr 20 07:51:53.445374 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:51:53.445309 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls\") pod \"dns-default-7fps4\" (UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:51:53.445374 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:51:53.445384 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:51:53.445897 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:51:53.445434 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls podName:0993c493-f978-431e-9000-290ab9fb0bbe nodeName:}" failed. No retries permitted until 2026-04-20 07:52:57.445419525 +0000 UTC m=+161.359049546 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls") pod "dns-default-7fps4" (UID: "0993c493-f978-431e-9000-290ab9fb0bbe") : secret "dns-default-metrics-tls" not found Apr 20 07:51:53.445897 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:51:53.445385 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:51:53.445897 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:51:53.445515 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert podName:b659a68e-b039-4864-b691-ff12b7393ed7 nodeName:}" failed. No retries permitted until 2026-04-20 07:52:57.445498576 +0000 UTC m=+161.359128599 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert") pod "ingress-canary-9bxrl" (UID: "b659a68e-b039-4864-b691-ff12b7393ed7") : secret "canary-serving-cert" not found Apr 20 07:52:09.976293 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:09.976265 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs"] Apr 20 07:52:09.978986 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:09.978970 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" Apr 20 07:52:09.983918 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:09.983895 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-jmgbc\"" Apr 20 07:52:09.983918 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:09.983896 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 07:52:09.984268 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:09.984161 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 20 07:52:09.984268 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:09.984166 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 07:52:09.984356 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:09.984348 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 20 07:52:09.987965 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:09.987946 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs"] Apr 20 07:52:10.063110 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.063072 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-kb8qs\" (UID: \"b28cd900-1a9e-4c36-a8d1-7409e30f8de9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" Apr 20 07:52:10.063110 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.063108 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lhks\" (UniqueName: 
\"kubernetes.io/projected/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-kube-api-access-5lhks\") pod \"cluster-monitoring-operator-75587bd455-kb8qs\" (UID: \"b28cd900-1a9e-4c36-a8d1-7409e30f8de9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" Apr 20 07:52:10.063340 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.063166 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kb8qs\" (UID: \"b28cd900-1a9e-4c36-a8d1-7409e30f8de9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" Apr 20 07:52:10.081447 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.080096 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-rjwff"] Apr 20 07:52:10.083441 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.083413 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-656885888d-85974"] Apr 20 07:52:10.083557 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.083540 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 07:52:10.085920 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.085898 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 20 07:52:10.086017 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.085948 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:10.086017 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.085996 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 07:52:10.086127 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.086031 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 07:52:10.086621 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.086604 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 20 07:52:10.086621 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.086617 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-dhk79\"" Apr 20 07:52:10.089421 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.089400 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 07:52:10.089528 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.089445 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 20 07:52:10.089528 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.089459 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 20 07:52:10.089528 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.089511 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 20 07:52:10.089810 ip-10-0-133-161 kubenswrapper[2572]: 
I0420 07:52:10.089789 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 20 07:52:10.089810 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.089803 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 07:52:10.089939 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.089907 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-mcczc\"" Apr 20 07:52:10.092973 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.092955 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-rjwff"] Apr 20 07:52:10.094020 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.094002 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 20 07:52:10.094103 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.094063 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-656885888d-85974"] Apr 20 07:52:10.163868 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.163834 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:10.164034 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.163877 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:10.164034 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.163929 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c112e80-851a-4290-86a7-51a64594d25e-serving-cert\") pod \"insights-operator-585dfdc468-rjwff\" (UID: \"8c112e80-851a-4290-86a7-51a64594d25e\") " pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 07:52:10.164034 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.163963 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c112e80-851a-4290-86a7-51a64594d25e-tmp\") pod \"insights-operator-585dfdc468-rjwff\" (UID: \"8c112e80-851a-4290-86a7-51a64594d25e\") " pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 07:52:10.164034 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.164024 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-kb8qs\" (UID: \"b28cd900-1a9e-4c36-a8d1-7409e30f8de9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" Apr 20 07:52:10.164202 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.164054 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5lhks\" (UniqueName: \"kubernetes.io/projected/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-kube-api-access-5lhks\") pod \"cluster-monitoring-operator-75587bd455-kb8qs\" (UID: \"b28cd900-1a9e-4c36-a8d1-7409e30f8de9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" Apr 20 07:52:10.164202 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.164074 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-stats-auth\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:10.164202 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.164096 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-default-certificate\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:10.164202 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.164112 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-289nd\" (UniqueName: \"kubernetes.io/projected/520d7307-df9b-4a84-8836-d8b02ebe3ddb-kube-api-access-289nd\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:10.164353 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.164208 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/8c112e80-851a-4290-86a7-51a64594d25e-snapshots\") pod \"insights-operator-585dfdc468-rjwff\" (UID: \"8c112e80-851a-4290-86a7-51a64594d25e\") " pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 07:52:10.164353 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.164229 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hvd7\" (UniqueName: \"kubernetes.io/projected/8c112e80-851a-4290-86a7-51a64594d25e-kube-api-access-2hvd7\") pod \"insights-operator-585dfdc468-rjwff\" (UID: \"8c112e80-851a-4290-86a7-51a64594d25e\") " pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 07:52:10.164353 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.164255 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c112e80-851a-4290-86a7-51a64594d25e-service-ca-bundle\") pod \"insights-operator-585dfdc468-rjwff\" (UID: \"8c112e80-851a-4290-86a7-51a64594d25e\") " pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 07:52:10.164353 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.164272 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c112e80-851a-4290-86a7-51a64594d25e-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-rjwff\" (UID: \"8c112e80-851a-4290-86a7-51a64594d25e\") " pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 07:52:10.164353 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.164320 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kb8qs\" (UID: \"b28cd900-1a9e-4c36-a8d1-7409e30f8de9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" Apr 20 07:52:10.164516 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:10.164411 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 07:52:10.164516 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:10.164467 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls podName:b28cd900-1a9e-4c36-a8d1-7409e30f8de9 nodeName:}" failed. No retries permitted until 2026-04-20 07:52:10.664453341 +0000 UTC m=+114.578083362 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kb8qs" (UID: "b28cd900-1a9e-4c36-a8d1-7409e30f8de9") : secret "cluster-monitoring-operator-tls" not found Apr 20 07:52:10.164743 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.164725 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-kb8qs\" (UID: \"b28cd900-1a9e-4c36-a8d1-7409e30f8de9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" Apr 20 07:52:10.174478 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.174459 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lhks\" (UniqueName: \"kubernetes.io/projected/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-kube-api-access-5lhks\") pod \"cluster-monitoring-operator-75587bd455-kb8qs\" (UID: \"b28cd900-1a9e-4c36-a8d1-7409e30f8de9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" Apr 20 07:52:10.265332 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.265305 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:10.265489 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.265344 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:10.265489 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.265362 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c112e80-851a-4290-86a7-51a64594d25e-serving-cert\") pod \"insights-operator-585dfdc468-rjwff\" (UID: \"8c112e80-851a-4290-86a7-51a64594d25e\") " pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 
07:52:10.265489 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.265382 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c112e80-851a-4290-86a7-51a64594d25e-tmp\") pod \"insights-operator-585dfdc468-rjwff\" (UID: \"8c112e80-851a-4290-86a7-51a64594d25e\") " pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 07:52:10.265489 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:10.265455 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 07:52:10.265610 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:10.265486 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle podName:520d7307-df9b-4a84-8836-d8b02ebe3ddb nodeName:}" failed. No retries permitted until 2026-04-20 07:52:10.765469243 +0000 UTC m=+114.679099264 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle") pod "router-default-656885888d-85974" (UID: "520d7307-df9b-4a84-8836-d8b02ebe3ddb") : configmap references non-existent config key: service-ca.crt Apr 20 07:52:10.265610 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:10.265523 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs podName:520d7307-df9b-4a84-8836-d8b02ebe3ddb nodeName:}" failed. No retries permitted until 2026-04-20 07:52:10.76550904 +0000 UTC m=+114.679139062 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs") pod "router-default-656885888d-85974" (UID: "520d7307-df9b-4a84-8836-d8b02ebe3ddb") : secret "router-metrics-certs-default" not found Apr 20 07:52:10.265610 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.265565 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-stats-auth\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:10.265610 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.265599 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-default-certificate\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:10.265764 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.265626 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-289nd\" (UniqueName: \"kubernetes.io/projected/520d7307-df9b-4a84-8836-d8b02ebe3ddb-kube-api-access-289nd\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:10.265764 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.265667 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/8c112e80-851a-4290-86a7-51a64594d25e-snapshots\") pod 
\"insights-operator-585dfdc468-rjwff\" (UID: \"8c112e80-851a-4290-86a7-51a64594d25e\") " pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 07:52:10.265764 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.265694 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hvd7\" (UniqueName: \"kubernetes.io/projected/8c112e80-851a-4290-86a7-51a64594d25e-kube-api-access-2hvd7\") pod \"insights-operator-585dfdc468-rjwff\" (UID: \"8c112e80-851a-4290-86a7-51a64594d25e\") " pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 07:52:10.265907 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.265834 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c112e80-851a-4290-86a7-51a64594d25e-service-ca-bundle\") pod \"insights-operator-585dfdc468-rjwff\" (UID: \"8c112e80-851a-4290-86a7-51a64594d25e\") " pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 07:52:10.265907 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.265842 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8c112e80-851a-4290-86a7-51a64594d25e-tmp\") pod \"insights-operator-585dfdc468-rjwff\" (UID: \"8c112e80-851a-4290-86a7-51a64594d25e\") " pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 07:52:10.266008 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.265905 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c112e80-851a-4290-86a7-51a64594d25e-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-rjwff\" (UID: \"8c112e80-851a-4290-86a7-51a64594d25e\") " pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 07:52:10.266440 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.266414 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c112e80-851a-4290-86a7-51a64594d25e-service-ca-bundle\") pod \"insights-operator-585dfdc468-rjwff\" (UID: \"8c112e80-851a-4290-86a7-51a64594d25e\") " pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 07:52:10.266823 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.266806 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/8c112e80-851a-4290-86a7-51a64594d25e-snapshots\") pod \"insights-operator-585dfdc468-rjwff\" (UID: \"8c112e80-851a-4290-86a7-51a64594d25e\") " pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 07:52:10.266950 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.266927 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c112e80-851a-4290-86a7-51a64594d25e-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-rjwff\" (UID: \"8c112e80-851a-4290-86a7-51a64594d25e\") " pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 07:52:10.267792 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.267765 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c112e80-851a-4290-86a7-51a64594d25e-serving-cert\") pod \"insights-operator-585dfdc468-rjwff\" (UID: \"8c112e80-851a-4290-86a7-51a64594d25e\") " pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 
07:52:10.268264 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.268244 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-stats-auth\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:10.268315 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.268244 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-default-certificate\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:10.281547 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.281522 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hvd7\" (UniqueName: \"kubernetes.io/projected/8c112e80-851a-4290-86a7-51a64594d25e-kube-api-access-2hvd7\") pod \"insights-operator-585dfdc468-rjwff\" (UID: \"8c112e80-851a-4290-86a7-51a64594d25e\") " pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 07:52:10.281625 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.281567 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-289nd\" (UniqueName: \"kubernetes.io/projected/520d7307-df9b-4a84-8836-d8b02ebe3ddb-kube-api-access-289nd\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:10.395053 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.394995 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-rjwff" Apr 20 07:52:10.524626 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.524557 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-rjwff"] Apr 20 07:52:10.528352 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:52:10.528328 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c112e80_851a_4290_86a7_51a64594d25e.slice/crio-beec14758a578bd0e985ec8ff82787bbbaee33b98c64754b647444ef765eb622 WatchSource:0}: Error finding container beec14758a578bd0e985ec8ff82787bbbaee33b98c64754b647444ef765eb622: Status 404 returned error can't find the container with id beec14758a578bd0e985ec8ff82787bbbaee33b98c64754b647444ef765eb622 Apr 20 07:52:10.669972 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.669940 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kb8qs\" (UID: \"b28cd900-1a9e-4c36-a8d1-7409e30f8de9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" Apr 20 07:52:10.670150 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:10.670035 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 07:52:10.670150 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:10.670087 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls podName:b28cd900-1a9e-4c36-a8d1-7409e30f8de9 nodeName:}" failed. No retries permitted until 2026-04-20 07:52:11.670072966 +0000 UTC m=+115.583702986 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kb8qs" (UID: "b28cd900-1a9e-4c36-a8d1-7409e30f8de9") : secret "cluster-monitoring-operator-tls" not found Apr 20 07:52:10.771201 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.771164 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:10.771347 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:10.771216 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:10.771347 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:10.771318 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle podName:520d7307-df9b-4a84-8836-d8b02ebe3ddb nodeName:}" failed. No retries permitted until 2026-04-20 07:52:11.771297336 +0000 UTC m=+115.684927371 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle") pod "router-default-656885888d-85974" (UID: "520d7307-df9b-4a84-8836-d8b02ebe3ddb") : configmap references non-existent config key: service-ca.crt Apr 20 07:52:10.771460 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:10.771355 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 07:52:10.771460 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:10.771398 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs podName:520d7307-df9b-4a84-8836-d8b02ebe3ddb nodeName:}" failed. No retries permitted until 2026-04-20 07:52:11.771387014 +0000 UTC m=+115.685017054 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs") pod "router-default-656885888d-85974" (UID: "520d7307-df9b-4a84-8836-d8b02ebe3ddb") : secret "router-metrics-certs-default" not found Apr 20 07:52:11.031790 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:11.031752 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rjwff" event={"ID":"8c112e80-851a-4290-86a7-51a64594d25e","Type":"ContainerStarted","Data":"beec14758a578bd0e985ec8ff82787bbbaee33b98c64754b647444ef765eb622"} Apr 20 07:52:11.680225 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:11.680182 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kb8qs\" (UID: \"b28cd900-1a9e-4c36-a8d1-7409e30f8de9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" Apr 20 07:52:11.680372 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:11.680340 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 07:52:11.680432 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:11.680409 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls podName:b28cd900-1a9e-4c36-a8d1-7409e30f8de9 nodeName:}" failed. No retries permitted until 2026-04-20 07:52:13.680394707 +0000 UTC m=+117.594024727 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kb8qs" (UID: "b28cd900-1a9e-4c36-a8d1-7409e30f8de9") : secret "cluster-monitoring-operator-tls" not found Apr 20 07:52:11.781207 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:11.781166 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:11.781207 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:11.781211 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:11.781469 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:11.781295 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 07:52:11.781469 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:11.781347 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle podName:520d7307-df9b-4a84-8836-d8b02ebe3ddb nodeName:}" failed. No retries permitted until 2026-04-20 07:52:13.78132567 +0000 UTC m=+117.694955694 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle") pod "router-default-656885888d-85974" (UID: "520d7307-df9b-4a84-8836-d8b02ebe3ddb") : configmap references non-existent config key: service-ca.crt Apr 20 07:52:11.781469 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:11.781396 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs podName:520d7307-df9b-4a84-8836-d8b02ebe3ddb nodeName:}" failed. No retries permitted until 2026-04-20 07:52:13.781385726 +0000 UTC m=+117.695015768 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs") pod "router-default-656885888d-85974" (UID: "520d7307-df9b-4a84-8836-d8b02ebe3ddb") : secret "router-metrics-certs-default" not found Apr 20 07:52:13.037426 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:13.037392 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rjwff" event={"ID":"8c112e80-851a-4290-86a7-51a64594d25e","Type":"ContainerStarted","Data":"a79e289654a8fe51e547d7201f36da6e0d8d14a1f7719bc5b050a233caee6a3c"} Apr 20 07:52:13.053767 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:13.053711 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-rjwff" podStartSLOduration=1.007180008 podStartE2EDuration="3.053695616s" podCreationTimestamp="2026-04-20 07:52:10 +0000 UTC" firstStartedPulling="2026-04-20 07:52:10.529940742 +0000 UTC m=+114.443570763" lastFinishedPulling="2026-04-20 07:52:12.57645635 +0000 UTC m=+116.490086371" observedRunningTime="2026-04-20 07:52:13.052335348 +0000 UTC m=+116.965965391" watchObservedRunningTime="2026-04-20 07:52:13.053695616 +0000 UTC m=+116.967325660" Apr 20 07:52:13.698892 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:13.698852 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kb8qs\" (UID: \"b28cd900-1a9e-4c36-a8d1-7409e30f8de9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" Apr 20 07:52:13.699079 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:13.699017 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 07:52:13.699123 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:13.699099 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls podName:b28cd900-1a9e-4c36-a8d1-7409e30f8de9 nodeName:}" failed. No retries permitted until 2026-04-20 07:52:17.699079948 +0000 UTC m=+121.612709989 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kb8qs" (UID: "b28cd900-1a9e-4c36-a8d1-7409e30f8de9") : secret "cluster-monitoring-operator-tls" not found Apr 20 07:52:13.799392 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:13.799344 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:13.799571 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:13.799472 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:13.799571 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:13.799500 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 07:52:13.799571 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:13.799569 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs podName:520d7307-df9b-4a84-8836-d8b02ebe3ddb nodeName:}" failed. No retries permitted until 2026-04-20 07:52:17.799554119 +0000 UTC m=+121.713184141 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs") pod "router-default-656885888d-85974" (UID: "520d7307-df9b-4a84-8836-d8b02ebe3ddb") : secret "router-metrics-certs-default" not found Apr 20 07:52:13.799677 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:13.799583 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle podName:520d7307-df9b-4a84-8836-d8b02ebe3ddb nodeName:}" failed. No retries permitted until 2026-04-20 07:52:17.799577208 +0000 UTC m=+121.713207229 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle") pod "router-default-656885888d-85974" (UID: "520d7307-df9b-4a84-8836-d8b02ebe3ddb") : configmap references non-existent config key: service-ca.crt Apr 20 07:52:15.919300 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:15.919272 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-45gfj_38e938ac-334a-46a9-bd54-099927b87530/dns-node-resolver/0.log" Apr 20 07:52:16.719288 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:16.719262 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fcfwc_4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f/node-ca/0.log" Apr 20 07:52:17.728911 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:17.728878 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kb8qs\" (UID: \"b28cd900-1a9e-4c36-a8d1-7409e30f8de9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" Apr 20 07:52:17.729325 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:17.729030 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 07:52:17.729325 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:17.729096 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls podName:b28cd900-1a9e-4c36-a8d1-7409e30f8de9 nodeName:}" failed. No retries permitted until 2026-04-20 07:52:25.729080134 +0000 UTC m=+129.642710155 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kb8qs" (UID: "b28cd900-1a9e-4c36-a8d1-7409e30f8de9") : secret "cluster-monitoring-operator-tls" not found Apr 20 07:52:17.829905 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:17.829856 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:17.829905 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:17.829914 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:17.830079 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:17.830034 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle podName:520d7307-df9b-4a84-8836-d8b02ebe3ddb nodeName:}" failed. No retries permitted until 2026-04-20 07:52:25.830013925 +0000 UTC m=+129.743643948 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle") pod "router-default-656885888d-85974" (UID: "520d7307-df9b-4a84-8836-d8b02ebe3ddb") : configmap references non-existent config key: service-ca.crt Apr 20 07:52:17.830079 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:17.830039 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 07:52:17.830079 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:17.830076 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs podName:520d7307-df9b-4a84-8836-d8b02ebe3ddb nodeName:}" failed. No retries permitted until 2026-04-20 07:52:25.830064545 +0000 UTC m=+129.743694593 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs") pod "router-default-656885888d-85974" (UID: "520d7307-df9b-4a84-8836-d8b02ebe3ddb") : secret "router-metrics-certs-default" not found Apr 20 07:52:20.023462 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.023431 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5fnb5"] Apr 20 07:52:20.026529 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.026513 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5fnb5" Apr 20 07:52:20.029306 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.029277 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-n796w\"" Apr 20 07:52:20.029430 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.029279 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:52:20.030276 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.030259 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 20 07:52:20.032823 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.032786 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5fnb5"] Apr 20 07:52:20.125770 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.125738 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f6qxs"] Apr 20 07:52:20.128658 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.128627 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f6qxs" Apr 20 07:52:20.129553 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.129530 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq"] Apr 20 07:52:20.131946 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.131927 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 20 07:52:20.131946 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.131935 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:52:20.132112 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.132095 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 20 07:52:20.132267 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.132250 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-q8jhm\"" Apr 20 07:52:20.132377 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.132360 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 20 07:52:20.132447 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.132432 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2chjv"] Apr 20 07:52:20.132605 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.132591 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq" Apr 20 07:52:20.135393 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.135219 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 20 07:52:20.135393 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.135228 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-rxrg5\"" Apr 20 07:52:20.135393 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.135339 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" Apr 20 07:52:20.135563 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.135511 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:52:20.135763 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.135745 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 20 07:52:20.138398 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.138327 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f6qxs"] Apr 20 07:52:20.141333 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.141313 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 20 07:52:20.141870 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.141847 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:52:20.142306 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.141963 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 20 07:52:20.142831 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.142089 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 20 07:52:20.142928 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.142271 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-wcjnv\"" Apr 20 07:52:20.143166 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.143132 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq"] Apr 20 07:52:20.144694 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.144675 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2chjv"] Apr 20 07:52:20.145877 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.145856 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 20 07:52:20.147712 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.147689 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6kn2\" (UniqueName: \"kubernetes.io/projected/2d9a4659-6387-4f04-8cbd-81120ca54057-kube-api-access-s6kn2\") pod \"volume-data-source-validator-7c6cbb6c87-5fnb5\" (UID: \"2d9a4659-6387-4f04-8cbd-81120ca54057\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5fnb5" Apr 20 07:52:20.248454 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.248422 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6kn2\" (UniqueName: \"kubernetes.io/projected/2d9a4659-6387-4f04-8cbd-81120ca54057-kube-api-access-s6kn2\") pod \"volume-data-source-validator-7c6cbb6c87-5fnb5\" (UID: \"2d9a4659-6387-4f04-8cbd-81120ca54057\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5fnb5" Apr 20 07:52:20.248648 ip-10-0-133-161 
kubenswrapper[2572]: I0420 07:52:20.248482 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbzhh\" (UniqueName: \"kubernetes.io/projected/726097dd-28d4-46d8-84be-1e7ae8262ca7-kube-api-access-cbzhh\") pod \"cluster-samples-operator-6dc5bdb6b4-jldvq\" (UID: \"726097dd-28d4-46d8-84be-1e7ae8262ca7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq" Apr 20 07:52:20.248648 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.248504 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29883fa5-e5e1-425a-85c2-3b3bd3ada0aa-config\") pod \"console-operator-9d4b6777b-2chjv\" (UID: \"29883fa5-e5e1-425a-85c2-3b3bd3ada0aa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" Apr 20 07:52:20.248648 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.248522 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29883fa5-e5e1-425a-85c2-3b3bd3ada0aa-serving-cert\") pod \"console-operator-9d4b6777b-2chjv\" (UID: \"29883fa5-e5e1-425a-85c2-3b3bd3ada0aa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" Apr 20 07:52:20.248648 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.248596 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da90ef7d-a0bf-47dd-ab2b-79b951b2d24c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-f6qxs\" (UID: \"da90ef7d-a0bf-47dd-ab2b-79b951b2d24c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f6qxs" Apr 20 07:52:20.248648 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.248637 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/726097dd-28d4-46d8-84be-1e7ae8262ca7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jldvq\" (UID: \"726097dd-28d4-46d8-84be-1e7ae8262ca7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq" Apr 20 07:52:20.248891 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.248663 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kndf9\" (UniqueName: \"kubernetes.io/projected/da90ef7d-a0bf-47dd-ab2b-79b951b2d24c-kube-api-access-kndf9\") pod \"kube-storage-version-migrator-operator-6769c5d45-f6qxs\" (UID: \"da90ef7d-a0bf-47dd-ab2b-79b951b2d24c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f6qxs" Apr 20 07:52:20.248891 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.248689 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7c2t\" (UniqueName: \"kubernetes.io/projected/29883fa5-e5e1-425a-85c2-3b3bd3ada0aa-kube-api-access-j7c2t\") pod \"console-operator-9d4b6777b-2chjv\" (UID: \"29883fa5-e5e1-425a-85c2-3b3bd3ada0aa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" Apr 20 07:52:20.248891 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.248736 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/29883fa5-e5e1-425a-85c2-3b3bd3ada0aa-trusted-ca\") pod \"console-operator-9d4b6777b-2chjv\" (UID: \"29883fa5-e5e1-425a-85c2-3b3bd3ada0aa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" Apr 20 07:52:20.248891 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.248781 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da90ef7d-a0bf-47dd-ab2b-79b951b2d24c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-f6qxs\" (UID: \"da90ef7d-a0bf-47dd-ab2b-79b951b2d24c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f6qxs" Apr 20 07:52:20.256206 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.256182 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6kn2\" (UniqueName: \"kubernetes.io/projected/2d9a4659-6387-4f04-8cbd-81120ca54057-kube-api-access-s6kn2\") pod \"volume-data-source-validator-7c6cbb6c87-5fnb5\" (UID: \"2d9a4659-6387-4f04-8cbd-81120ca54057\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5fnb5" Apr 20 07:52:20.335444 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.335361 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5fnb5" Apr 20 07:52:20.349317 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.349288 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29883fa5-e5e1-425a-85c2-3b3bd3ada0aa-trusted-ca\") pod \"console-operator-9d4b6777b-2chjv\" (UID: \"29883fa5-e5e1-425a-85c2-3b3bd3ada0aa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" Apr 20 07:52:20.349423 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.349328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da90ef7d-a0bf-47dd-ab2b-79b951b2d24c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-f6qxs\" (UID: \"da90ef7d-a0bf-47dd-ab2b-79b951b2d24c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f6qxs" Apr 20 07:52:20.349423 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.349400 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cbzhh\" (UniqueName: \"kubernetes.io/projected/726097dd-28d4-46d8-84be-1e7ae8262ca7-kube-api-access-cbzhh\") pod \"cluster-samples-operator-6dc5bdb6b4-jldvq\" (UID: \"726097dd-28d4-46d8-84be-1e7ae8262ca7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq" Apr 20 07:52:20.349423 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.349417 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29883fa5-e5e1-425a-85c2-3b3bd3ada0aa-config\") pod \"console-operator-9d4b6777b-2chjv\" (UID: \"29883fa5-e5e1-425a-85c2-3b3bd3ada0aa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" Apr 20 07:52:20.349585 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.349435 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29883fa5-e5e1-425a-85c2-3b3bd3ada0aa-serving-cert\") pod 
\"console-operator-9d4b6777b-2chjv\" (UID: \"29883fa5-e5e1-425a-85c2-3b3bd3ada0aa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" Apr 20 07:52:20.349585 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.349473 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da90ef7d-a0bf-47dd-ab2b-79b951b2d24c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-f6qxs\" (UID: \"da90ef7d-a0bf-47dd-ab2b-79b951b2d24c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f6qxs" Apr 20 07:52:20.349585 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.349500 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/726097dd-28d4-46d8-84be-1e7ae8262ca7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jldvq\" (UID: \"726097dd-28d4-46d8-84be-1e7ae8262ca7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq" Apr 20 07:52:20.349585 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.349526 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kndf9\" (UniqueName: \"kubernetes.io/projected/da90ef7d-a0bf-47dd-ab2b-79b951b2d24c-kube-api-access-kndf9\") pod \"kube-storage-version-migrator-operator-6769c5d45-f6qxs\" (UID: \"da90ef7d-a0bf-47dd-ab2b-79b951b2d24c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f6qxs" Apr 20 07:52:20.349585 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.349552 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7c2t\" (UniqueName: \"kubernetes.io/projected/29883fa5-e5e1-425a-85c2-3b3bd3ada0aa-kube-api-access-j7c2t\") pod \"console-operator-9d4b6777b-2chjv\" (UID: \"29883fa5-e5e1-425a-85c2-3b3bd3ada0aa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" Apr 20 07:52:20.349881 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:20.349633 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 07:52:20.349881 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:20.349712 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/726097dd-28d4-46d8-84be-1e7ae8262ca7-samples-operator-tls podName:726097dd-28d4-46d8-84be-1e7ae8262ca7 nodeName:}" failed. No retries permitted until 2026-04-20 07:52:20.849689355 +0000 UTC m=+124.763319390 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/726097dd-28d4-46d8-84be-1e7ae8262ca7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jldvq" (UID: "726097dd-28d4-46d8-84be-1e7ae8262ca7") : secret "samples-operator-tls" not found Apr 20 07:52:20.350030 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.350007 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da90ef7d-a0bf-47dd-ab2b-79b951b2d24c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-f6qxs\" (UID: \"da90ef7d-a0bf-47dd-ab2b-79b951b2d24c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f6qxs" Apr 20 07:52:20.350555 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.350532 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/29883fa5-e5e1-425a-85c2-3b3bd3ada0aa-trusted-ca\") pod \"console-operator-9d4b6777b-2chjv\" (UID: \"29883fa5-e5e1-425a-85c2-3b3bd3ada0aa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" Apr 20 07:52:20.350651 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.350633 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29883fa5-e5e1-425a-85c2-3b3bd3ada0aa-config\") pod \"console-operator-9d4b6777b-2chjv\" (UID: \"29883fa5-e5e1-425a-85c2-3b3bd3ada0aa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" Apr 20 07:52:20.351738 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.351718 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da90ef7d-a0bf-47dd-ab2b-79b951b2d24c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-f6qxs\" (UID: \"da90ef7d-a0bf-47dd-ab2b-79b951b2d24c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f6qxs" Apr 20 07:52:20.351988 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.351973 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29883fa5-e5e1-425a-85c2-3b3bd3ada0aa-serving-cert\") pod \"console-operator-9d4b6777b-2chjv\" (UID: \"29883fa5-e5e1-425a-85c2-3b3bd3ada0aa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" Apr 20 07:52:20.358605 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.358574 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7c2t\" (UniqueName: \"kubernetes.io/projected/29883fa5-e5e1-425a-85c2-3b3bd3ada0aa-kube-api-access-j7c2t\") pod \"console-operator-9d4b6777b-2chjv\" (UID: \"29883fa5-e5e1-425a-85c2-3b3bd3ada0aa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" Apr 20 07:52:20.358902 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.358875 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kndf9\" (UniqueName: \"kubernetes.io/projected/da90ef7d-a0bf-47dd-ab2b-79b951b2d24c-kube-api-access-kndf9\") pod \"kube-storage-version-migrator-operator-6769c5d45-f6qxs\" (UID: \"da90ef7d-a0bf-47dd-ab2b-79b951b2d24c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f6qxs" Apr 20 07:52:20.359426 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.359408 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbzhh\" (UniqueName: \"kubernetes.io/projected/726097dd-28d4-46d8-84be-1e7ae8262ca7-kube-api-access-cbzhh\") pod \"cluster-samples-operator-6dc5bdb6b4-jldvq\" (UID: \"726097dd-28d4-46d8-84be-1e7ae8262ca7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq" Apr 20 07:52:20.443573 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.443539 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f6qxs" Apr 20 07:52:20.444979 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.444954 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5fnb5"] Apr 20 07:52:20.449075 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:52:20.449052 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d9a4659_6387_4f04_8cbd_81120ca54057.slice/crio-a1a4c11cf8afa1386df12d1593a22858ffeffab76430a031f46aa4b4dcc1531e WatchSource:0}: Error finding container a1a4c11cf8afa1386df12d1593a22858ffeffab76430a031f46aa4b4dcc1531e: Status 404 returned error can't find the container with id a1a4c11cf8afa1386df12d1593a22858ffeffab76430a031f46aa4b4dcc1531e Apr 20 07:52:20.456529 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.456508 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" Apr 20 07:52:20.559352 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.559320 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f6qxs"] Apr 20 07:52:20.562935 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:52:20.562907 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda90ef7d_a0bf_47dd_ab2b_79b951b2d24c.slice/crio-d599d5e2e7b59b04bff6de20c823f839565099efa781f07998018822862824f5 WatchSource:0}: Error finding container d599d5e2e7b59b04bff6de20c823f839565099efa781f07998018822862824f5: Status 404 returned error can't find the container with id d599d5e2e7b59b04bff6de20c823f839565099efa781f07998018822862824f5 Apr 20 07:52:20.577610 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.577578 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2chjv"] Apr 20 07:52:20.579460 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:52:20.579431 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29883fa5_e5e1_425a_85c2_3b3bd3ada0aa.slice/crio-bace4edb6d8ab6db65a3fe5b4d056084f1b6b72079e99761f55d1f31f1f3a177 WatchSource:0}: Error finding container bace4edb6d8ab6db65a3fe5b4d056084f1b6b72079e99761f55d1f31f1f3a177: Status 404 returned error can't find the container with id bace4edb6d8ab6db65a3fe5b4d056084f1b6b72079e99761f55d1f31f1f3a177 Apr 20 07:52:20.853462 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:20.853423 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/726097dd-28d4-46d8-84be-1e7ae8262ca7-samples-operator-tls\") pod 
\"cluster-samples-operator-6dc5bdb6b4-jldvq\" (UID: \"726097dd-28d4-46d8-84be-1e7ae8262ca7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq" Apr 20 07:52:20.853616 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:20.853551 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 07:52:20.853616 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:20.853604 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/726097dd-28d4-46d8-84be-1e7ae8262ca7-samples-operator-tls podName:726097dd-28d4-46d8-84be-1e7ae8262ca7 nodeName:}" failed. No retries permitted until 2026-04-20 07:52:21.853590417 +0000 UTC m=+125.767220438 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/726097dd-28d4-46d8-84be-1e7ae8262ca7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jldvq" (UID: "726097dd-28d4-46d8-84be-1e7ae8262ca7") : secret "samples-operator-tls" not found Apr 20 07:52:21.053744 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:21.053698 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5fnb5" event={"ID":"2d9a4659-6387-4f04-8cbd-81120ca54057","Type":"ContainerStarted","Data":"a1a4c11cf8afa1386df12d1593a22858ffeffab76430a031f46aa4b4dcc1531e"} Apr 20 07:52:21.054794 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:21.054765 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f6qxs" event={"ID":"da90ef7d-a0bf-47dd-ab2b-79b951b2d24c","Type":"ContainerStarted","Data":"d599d5e2e7b59b04bff6de20c823f839565099efa781f07998018822862824f5"} Apr 20 07:52:21.055964 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:21.055933 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" event={"ID":"29883fa5-e5e1-425a-85c2-3b3bd3ada0aa","Type":"ContainerStarted","Data":"bace4edb6d8ab6db65a3fe5b4d056084f1b6b72079e99761f55d1f31f1f3a177"} Apr 20 07:52:21.861722 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:21.861674 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/726097dd-28d4-46d8-84be-1e7ae8262ca7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jldvq\" (UID: \"726097dd-28d4-46d8-84be-1e7ae8262ca7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq" Apr 20 07:52:21.861905 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:21.861822 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 07:52:21.862074 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:21.862056 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/726097dd-28d4-46d8-84be-1e7ae8262ca7-samples-operator-tls podName:726097dd-28d4-46d8-84be-1e7ae8262ca7 nodeName:}" failed. No retries permitted until 2026-04-20 07:52:23.862029284 +0000 UTC m=+127.775659316 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/726097dd-28d4-46d8-84be-1e7ae8262ca7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jldvq" (UID: "726097dd-28d4-46d8-84be-1e7ae8262ca7") : secret "samples-operator-tls" not found Apr 20 07:52:22.059341 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:22.059292 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5fnb5" event={"ID":"2d9a4659-6387-4f04-8cbd-81120ca54057","Type":"ContainerStarted","Data":"a40810e0c033b821bdad39da6afe4febf9457ef9c31bd8e1d5fe34b0ab5093d6"} Apr 20 07:52:22.074053 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:22.073985 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-5fnb5" podStartSLOduration=0.618297473 podStartE2EDuration="2.073964658s" podCreationTimestamp="2026-04-20 07:52:20 +0000 UTC" firstStartedPulling="2026-04-20 07:52:20.450879159 +0000 UTC m=+124.364509182" lastFinishedPulling="2026-04-20 07:52:21.90654634 +0000 UTC m=+125.820176367" observedRunningTime="2026-04-20 07:52:22.072556429 +0000 UTC m=+125.986186473" watchObservedRunningTime="2026-04-20 07:52:22.073964658 +0000 UTC m=+125.987594708" Apr 20 07:52:23.882449 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:23.882412 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/726097dd-28d4-46d8-84be-1e7ae8262ca7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jldvq\" (UID: \"726097dd-28d4-46d8-84be-1e7ae8262ca7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq" Apr 20 07:52:23.882927 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:23.882571 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 07:52:23.882927 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:23.882659 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/726097dd-28d4-46d8-84be-1e7ae8262ca7-samples-operator-tls podName:726097dd-28d4-46d8-84be-1e7ae8262ca7 nodeName:}" failed. No retries permitted until 2026-04-20 07:52:27.882637199 +0000 UTC m=+131.796267235 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/726097dd-28d4-46d8-84be-1e7ae8262ca7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jldvq" (UID: "726097dd-28d4-46d8-84be-1e7ae8262ca7") : secret "samples-operator-tls" not found Apr 20 07:52:24.067189 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:24.067163 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/0.log" Apr 20 07:52:24.067365 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:24.067202 2572 generic.go:358] "Generic (PLEG): container finished" podID="29883fa5-e5e1-425a-85c2-3b3bd3ada0aa" containerID="10e6d095cf771ef5ad7d685844f31df72f1acf2a10a6b2963f4810ca7c1d859d" exitCode=255 Apr 20 07:52:24.067365 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:24.067288 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" event={"ID":"29883fa5-e5e1-425a-85c2-3b3bd3ada0aa","Type":"ContainerDied","Data":"10e6d095cf771ef5ad7d685844f31df72f1acf2a10a6b2963f4810ca7c1d859d"} Apr 20 07:52:24.067565 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:24.067542 2572 scope.go:117] "RemoveContainer" containerID="10e6d095cf771ef5ad7d685844f31df72f1acf2a10a6b2963f4810ca7c1d859d" Apr 20 07:52:24.068615 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:24.068595 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f6qxs" event={"ID":"da90ef7d-a0bf-47dd-ab2b-79b951b2d24c","Type":"ContainerStarted","Data":"f7b3386510b2851347a0b497d722bdd24f72943d807bdcdd13e06472946227ab"} Apr 20 07:52:24.095561 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:24.095516 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f6qxs" podStartSLOduration=1.568493194 podStartE2EDuration="4.095503739s" podCreationTimestamp="2026-04-20 07:52:20 +0000 UTC" firstStartedPulling="2026-04-20 07:52:20.564671629 +0000 UTC m=+124.478301650" lastFinishedPulling="2026-04-20 07:52:23.091682174 +0000 UTC m=+127.005312195" observedRunningTime="2026-04-20 07:52:24.094733938 +0000 UTC m=+128.008363981" watchObservedRunningTime="2026-04-20 07:52:24.095503739 +0000 UTC m=+128.009133781" Apr 20 07:52:25.071778 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:25.071750 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/1.log" Apr 20 07:52:25.072246 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:25.072124 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/0.log" Apr 20 07:52:25.072246 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:25.072187 2572 generic.go:358] "Generic (PLEG): container finished" podID="29883fa5-e5e1-425a-85c2-3b3bd3ada0aa" containerID="b70d17d8bbf93e4fc8379ffa54d797930859e1a5ccabcc02462a3297a64afb9e" exitCode=255 Apr 20 07:52:25.072355 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:25.072272 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" 
event={"ID":"29883fa5-e5e1-425a-85c2-3b3bd3ada0aa","Type":"ContainerDied","Data":"b70d17d8bbf93e4fc8379ffa54d797930859e1a5ccabcc02462a3297a64afb9e"} Apr 20 07:52:25.072355 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:25.072330 2572 scope.go:117] "RemoveContainer" containerID="10e6d095cf771ef5ad7d685844f31df72f1acf2a10a6b2963f4810ca7c1d859d" Apr 20 07:52:25.072519 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:25.072491 2572 scope.go:117] "RemoveContainer" containerID="b70d17d8bbf93e4fc8379ffa54d797930859e1a5ccabcc02462a3297a64afb9e" Apr 20 07:52:25.072771 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:25.072748 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2chjv_openshift-console-operator(29883fa5-e5e1-425a-85c2-3b3bd3ada0aa)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" podUID="29883fa5-e5e1-425a-85c2-3b3bd3ada0aa" Apr 20 07:52:25.798263 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:25.798211 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kb8qs\" (UID: \"b28cd900-1a9e-4c36-a8d1-7409e30f8de9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" Apr 20 07:52:25.798490 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:25.798367 2572 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 07:52:25.798490 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:25.798434 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls podName:b28cd900-1a9e-4c36-a8d1-7409e30f8de9 nodeName:}" failed. No retries permitted until 2026-04-20 07:52:41.798416955 +0000 UTC m=+145.712046981 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kb8qs" (UID: "b28cd900-1a9e-4c36-a8d1-7409e30f8de9") : secret "cluster-monitoring-operator-tls" not found Apr 20 07:52:25.902739 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:25.899956 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:25.902739 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:25.900088 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:25.902739 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:25.900281 2572 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 07:52:25.902739 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:25.900363 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs podName:520d7307-df9b-4a84-8836-d8b02ebe3ddb nodeName:}" failed. No retries permitted until 2026-04-20 07:52:41.900341714 +0000 UTC m=+145.813971749 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs") pod "router-default-656885888d-85974" (UID: "520d7307-df9b-4a84-8836-d8b02ebe3ddb") : secret "router-metrics-certs-default" not found Apr 20 07:52:25.902739 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:25.900896 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle podName:520d7307-df9b-4a84-8836-d8b02ebe3ddb nodeName:}" failed. No retries permitted until 2026-04-20 07:52:41.900878675 +0000 UTC m=+145.814508698 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle") pod "router-default-656885888d-85974" (UID: "520d7307-df9b-4a84-8836-d8b02ebe3ddb") : configmap references non-existent config key: service-ca.crt Apr 20 07:52:26.076707 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:26.076630 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/1.log" Apr 20 07:52:26.077110 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:26.076974 2572 scope.go:117] "RemoveContainer" containerID="b70d17d8bbf93e4fc8379ffa54d797930859e1a5ccabcc02462a3297a64afb9e" Apr 20 07:52:26.077177 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:26.077133 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2chjv_openshift-console-operator(29883fa5-e5e1-425a-85c2-3b3bd3ada0aa)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" podUID="29883fa5-e5e1-425a-85c2-3b3bd3ada0aa" Apr 20 07:52:26.404418 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:26.404313 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs\") pod \"network-metrics-daemon-brq5h\" (UID: \"07219834-44d6-42ab-9058-aed46274d1a8\") " pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:52:26.404568 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:26.404459 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 07:52:26.404568 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:26.404523 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs podName:07219834-44d6-42ab-9058-aed46274d1a8 nodeName:}" failed. No retries permitted until 2026-04-20 07:54:28.404506831 +0000 UTC m=+252.318136863 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs") pod "network-metrics-daemon-brq5h" (UID: "07219834-44d6-42ab-9058-aed46274d1a8") : secret "metrics-daemon-secret" not found Apr 20 07:52:27.915959 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:27.915919 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/726097dd-28d4-46d8-84be-1e7ae8262ca7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jldvq\" (UID: \"726097dd-28d4-46d8-84be-1e7ae8262ca7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq" Apr 20 07:52:27.916373 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:27.916062 2572 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 07:52:27.916373 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:27.916162 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/726097dd-28d4-46d8-84be-1e7ae8262ca7-samples-operator-tls podName:726097dd-28d4-46d8-84be-1e7ae8262ca7 nodeName:}" failed. 
No retries permitted until 2026-04-20 07:52:35.916123465 +0000 UTC m=+139.829753495 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/726097dd-28d4-46d8-84be-1e7ae8262ca7-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-jldvq" (UID: "726097dd-28d4-46d8-84be-1e7ae8262ca7") : secret "samples-operator-tls" not found Apr 20 07:52:30.457253 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:30.457209 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" Apr 20 07:52:30.457253 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:30.457242 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" Apr 20 07:52:30.457953 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:30.457940 2572 scope.go:117] "RemoveContainer" containerID="b70d17d8bbf93e4fc8379ffa54d797930859e1a5ccabcc02462a3297a64afb9e" Apr 20 07:52:30.458117 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:30.458100 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2chjv_openshift-console-operator(29883fa5-e5e1-425a-85c2-3b3bd3ada0aa)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" podUID="29883fa5-e5e1-425a-85c2-3b3bd3ada0aa" Apr 20 07:52:35.974238 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:35.974185 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/726097dd-28d4-46d8-84be-1e7ae8262ca7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jldvq\" (UID: \"726097dd-28d4-46d8-84be-1e7ae8262ca7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq" Apr 20 07:52:35.976565 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:35.976540 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/726097dd-28d4-46d8-84be-1e7ae8262ca7-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-jldvq\" (UID: \"726097dd-28d4-46d8-84be-1e7ae8262ca7\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq" Apr 20 07:52:36.051469 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:36.051429 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq" Apr 20 07:52:36.164756 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:36.164728 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq"] Apr 20 07:52:37.102688 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:37.102644 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq" event={"ID":"726097dd-28d4-46d8-84be-1e7ae8262ca7","Type":"ContainerStarted","Data":"09af1b3eae607d4aae1a7a65b3378a1ce616f4361ffaf260c6b14e043a991739"} Apr 20 07:52:39.108600 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:39.108564 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq" event={"ID":"726097dd-28d4-46d8-84be-1e7ae8262ca7","Type":"ContainerStarted","Data":"326358258e28e8bd1eb6970e7ca5605fa076aed0f77bc9204d0979aceaa097d8"} Apr 20 07:52:39.108600 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:39.108598 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq" event={"ID":"726097dd-28d4-46d8-84be-1e7ae8262ca7","Type":"ContainerStarted","Data":"15a03ed2326bad08e0a22fdf3469cc7332514e9f42d7ec0678a69fad90342055"} Apr 20 07:52:39.124456 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:39.124410 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-jldvq" podStartSLOduration=17.134376238 podStartE2EDuration="19.124397722s" podCreationTimestamp="2026-04-20 07:52:20 +0000 UTC" firstStartedPulling="2026-04-20 07:52:36.207886791 +0000 UTC m=+140.121516812" lastFinishedPulling="2026-04-20 07:52:38.197908272 +0000 UTC m=+142.111538296" observedRunningTime="2026-04-20 07:52:39.123321263 +0000 UTC m=+143.036951306" watchObservedRunningTime="2026-04-20 07:52:39.124397722 +0000 UTC m=+143.038027765" Apr 20 07:52:41.821961 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:41.821901 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kb8qs\" (UID: \"b28cd900-1a9e-4c36-a8d1-7409e30f8de9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" Apr 20 07:52:41.824235 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:41.824212 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/b28cd900-1a9e-4c36-a8d1-7409e30f8de9-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kb8qs\" (UID: \"b28cd900-1a9e-4c36-a8d1-7409e30f8de9\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" Apr 20 07:52:41.922325 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:41.922287 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:41.922492 ip-10-0-133-161 
kubenswrapper[2572]: I0420 07:52:41.922465 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:41.922984 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:41.922958 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/520d7307-df9b-4a84-8836-d8b02ebe3ddb-service-ca-bundle\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:41.924646 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:41.924624 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/520d7307-df9b-4a84-8836-d8b02ebe3ddb-metrics-certs\") pod \"router-default-656885888d-85974\" (UID: \"520d7307-df9b-4a84-8836-d8b02ebe3ddb\") " pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:42.090084 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:42.090000 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-jmgbc\"" Apr 20 07:52:42.097970 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:42.097951 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" Apr 20 07:52:42.203107 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:42.203081 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-mcczc\"" Apr 20 07:52:42.210906 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:42.210877 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:42.211977 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:42.211957 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs"] Apr 20 07:52:42.216367 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:52:42.216323 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb28cd900_1a9e_4c36_a8d1_7409e30f8de9.slice/crio-6b2c2ecf21f26d6c67bda4a047b6f5d5714c920d44cba5e29911b78d0625c309 WatchSource:0}: Error finding container 6b2c2ecf21f26d6c67bda4a047b6f5d5714c920d44cba5e29911b78d0625c309: Status 404 returned error can't find the container with id 6b2c2ecf21f26d6c67bda4a047b6f5d5714c920d44cba5e29911b78d0625c309 Apr 20 07:52:42.325215 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:42.325179 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-656885888d-85974"] Apr 20 07:52:42.328374 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:52:42.328355 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod520d7307_df9b_4a84_8836_d8b02ebe3ddb.slice/crio-cb3f1453e678cc43a94d922e6a05137d67b7a5452c8b7d1778e756390af5633a WatchSource:0}: Error finding container cb3f1453e678cc43a94d922e6a05137d67b7a5452c8b7d1778e756390af5633a: Status 404 returned error can't find the container with id cb3f1453e678cc43a94d922e6a05137d67b7a5452c8b7d1778e756390af5633a Apr 20 07:52:42.641347 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:42.641266 2572 scope.go:117] "RemoveContainer" containerID="b70d17d8bbf93e4fc8379ffa54d797930859e1a5ccabcc02462a3297a64afb9e" Apr 20 07:52:43.121283 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:43.121240 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-656885888d-85974" event={"ID":"520d7307-df9b-4a84-8836-d8b02ebe3ddb","Type":"ContainerStarted","Data":"a2e2d4e7063f55eddfd71a1207adca93f69a8dd092319755e76ff8bd38af0ae5"} Apr 20 07:52:43.121283 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:43.121286 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-656885888d-85974" event={"ID":"520d7307-df9b-4a84-8836-d8b02ebe3ddb","Type":"ContainerStarted","Data":"cb3f1453e678cc43a94d922e6a05137d67b7a5452c8b7d1778e756390af5633a"} Apr 20 07:52:43.122847 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:43.122826 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/1.log" Apr 20 07:52:43.122966 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:43.122909 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" event={"ID":"29883fa5-e5e1-425a-85c2-3b3bd3ada0aa","Type":"ContainerStarted","Data":"ba44179ed69ad155e5b4f87ae175ce6e830179fadaaa68c556e2907d0eba66a7"} Apr 20 07:52:43.123191 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:43.123172 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" Apr 20 07:52:43.123976 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:43.123952 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" 
event={"ID":"b28cd900-1a9e-4c36-a8d1-7409e30f8de9","Type":"ContainerStarted","Data":"6b2c2ecf21f26d6c67bda4a047b6f5d5714c920d44cba5e29911b78d0625c309"} Apr 20 07:52:43.139237 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:43.139196 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-656885888d-85974" podStartSLOduration=33.139184191 podStartE2EDuration="33.139184191s" podCreationTimestamp="2026-04-20 07:52:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:52:43.138133647 +0000 UTC m=+147.051763693" watchObservedRunningTime="2026-04-20 07:52:43.139184191 +0000 UTC m=+147.052814233" Apr 20 07:52:43.153102 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:43.153021 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" podStartSLOduration=20.645902238 podStartE2EDuration="23.153008008s" podCreationTimestamp="2026-04-20 07:52:20 +0000 UTC" firstStartedPulling="2026-04-20 07:52:20.581130647 +0000 UTC m=+124.494760669" lastFinishedPulling="2026-04-20 07:52:23.088236412 +0000 UTC m=+127.001866439" observedRunningTime="2026-04-20 07:52:43.152712274 +0000 UTC m=+147.066342353" watchObservedRunningTime="2026-04-20 07:52:43.153008008 +0000 UTC m=+147.066638051" Apr 20 07:52:43.212078 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:43.212042 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:43.214805 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:43.214780 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:43.244610 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:43.244579 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-2chjv" Apr 20 07:52:44.127707 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:44.127677 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:44.129057 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:44.129030 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-656885888d-85974" Apr 20 07:52:45.131337 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:45.131297 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" event={"ID":"b28cd900-1a9e-4c36-a8d1-7409e30f8de9","Type":"ContainerStarted","Data":"6ef67e93807183a1b26553d8d9c3a17b383047b14dbca793bb5d8e6a951242b3"} Apr 20 07:52:45.146271 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:45.146231 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kb8qs" podStartSLOduration=34.139340628 podStartE2EDuration="36.146213807s" podCreationTimestamp="2026-04-20 07:52:09 +0000 UTC" firstStartedPulling="2026-04-20 07:52:42.218093167 +0000 UTC m=+146.131723188" lastFinishedPulling="2026-04-20 07:52:44.224966346 +0000 UTC m=+148.138596367" observedRunningTime="2026-04-20 07:52:45.145465272 +0000 UTC m=+149.059095314" watchObservedRunningTime="2026-04-20 07:52:45.146213807 +0000 UTC m=+149.059843848" Apr 20 07:52:47.017268 
ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.017230 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-59fbcc4b4b-4ngd7"] Apr 20 07:52:47.020616 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.020594 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.023407 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.023384 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 07:52:47.023521 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.023397 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 07:52:47.024556 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.024537 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 07:52:47.024747 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.024733 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-68mlz\"" Apr 20 07:52:47.028993 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.028970 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 07:52:47.033678 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.033657 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-59fbcc4b4b-4ngd7"] Apr 20 07:52:47.065107 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.065076 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/211e6876-c742-49f7-b034-331aaa7d75b3-registry-certificates\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.065107 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.065112 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/211e6876-c742-49f7-b034-331aaa7d75b3-bound-sa-token\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.065348 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.065134 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/211e6876-c742-49f7-b034-331aaa7d75b3-image-registry-private-configuration\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.065348 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.065172 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/211e6876-c742-49f7-b034-331aaa7d75b3-ca-trust-extracted\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 
07:52:47.065348 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.065189 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxgbr\" (UniqueName: \"kubernetes.io/projected/211e6876-c742-49f7-b034-331aaa7d75b3-kube-api-access-xxgbr\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.065348 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.065297 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/211e6876-c742-49f7-b034-331aaa7d75b3-trusted-ca\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.065348 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.065338 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/211e6876-c742-49f7-b034-331aaa7d75b3-installation-pull-secrets\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.065604 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.065374 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/211e6876-c742-49f7-b034-331aaa7d75b3-registry-tls\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.070623 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.070601 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-59fbcc4b4b-4ngd7"] Apr 20 07:52:47.070794 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:47.070772 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[bound-sa-token ca-trust-extracted image-registry-private-configuration installation-pull-secrets kube-api-access-xxgbr registry-certificates registry-tls trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" podUID="211e6876-c742-49f7-b034-331aaa7d75b3" Apr 20 07:52:47.107760 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.107731 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-64bbf84747-kdrcj"] Apr 20 07:52:47.111054 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.111030 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.124804 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.124778 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-556kx"] Apr 20 07:52:47.128008 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.127983 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-64bbf84747-kdrcj"] Apr 20 07:52:47.128174 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.128108 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-556kx" Apr 20 07:52:47.130598 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.130578 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-2jzqk\"" Apr 20 07:52:47.130747 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.130658 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 07:52:47.131133 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.130903 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 07:52:47.136296 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.136269 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-92gbs"] Apr 20 07:52:47.145334 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.145307 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-556kx"] Apr 20 07:52:47.145538 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.145367 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.145599 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.145583 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-92gbs" Apr 20 07:52:47.147951 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.147931 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 07:52:47.148059 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.147966 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 07:52:47.148059 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.148052 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-lvs8m\"" Apr 20 07:52:47.150305 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.150288 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.150903 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.150859 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-92gbs"] Apr 20 07:52:47.166204 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166173 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/211e6876-c742-49f7-b034-331aaa7d75b3-installation-pull-secrets\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.166312 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166213 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c4f5392a-e25d-4f52-b745-c32d100b1565-image-registry-private-configuration\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.166312 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166246 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/211e6876-c742-49f7-b034-331aaa7d75b3-registry-tls\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.166312 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166278 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrckw\" (UniqueName: \"kubernetes.io/projected/c4f5392a-e25d-4f52-b745-c32d100b1565-kube-api-access-qrckw\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.166312 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166307 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c4f5392a-e25d-4f52-b745-c32d100b1565-ca-trust-extracted\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.166506 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166329 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c4f5392a-e25d-4f52-b745-c32d100b1565-installation-pull-secrets\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.166506 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166359 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c4f5392a-e25d-4f52-b745-c32d100b1565-registry-tls\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.166506 ip-10-0-133-161 kubenswrapper[2572]: I0420 
07:52:47.166380 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c4f5392a-e25d-4f52-b745-c32d100b1565-trusted-ca\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.166506 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166408 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c4f5392a-e25d-4f52-b745-c32d100b1565-registry-certificates\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.166506 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166434 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/211e6876-c742-49f7-b034-331aaa7d75b3-registry-certificates\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.166506 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166451 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/211e6876-c742-49f7-b034-331aaa7d75b3-bound-sa-token\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.166506 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166475 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2-crio-socket\") pod \"insights-runtime-extractor-92gbs\" (UID: \"5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2\") " pod="openshift-insights/insights-runtime-extractor-92gbs" Apr 20 07:52:47.166506 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166500 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtx5s\" (UniqueName: \"kubernetes.io/projected/f76edbc7-6592-484d-8941-01f12cf229e7-kube-api-access-vtx5s\") pod \"downloads-6bcc868b7-556kx\" (UID: \"f76edbc7-6592-484d-8941-01f12cf229e7\") " pod="openshift-console/downloads-6bcc868b7-556kx" Apr 20 07:52:47.166866 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166551 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c4f5392a-e25d-4f52-b745-c32d100b1565-bound-sa-token\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.166866 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166583 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/211e6876-c742-49f7-b034-331aaa7d75b3-image-registry-private-configuration\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.166866 ip-10-0-133-161 
kubenswrapper[2572]: I0420 07:52:47.166624 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/211e6876-c742-49f7-b034-331aaa7d75b3-ca-trust-extracted\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.166866 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166714 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxgbr\" (UniqueName: \"kubernetes.io/projected/211e6876-c742-49f7-b034-331aaa7d75b3-kube-api-access-xxgbr\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.166866 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166748 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2-data-volume\") pod \"insights-runtime-extractor-92gbs\" (UID: \"5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2\") " pod="openshift-insights/insights-runtime-extractor-92gbs" Apr 20 07:52:47.166866 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166782 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-92gbs\" (UID: \"5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2\") " pod="openshift-insights/insights-runtime-extractor-92gbs" Apr 20 07:52:47.166866 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166817 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-92gbs\" (UID: \"5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2\") " pod="openshift-insights/insights-runtime-extractor-92gbs" Apr 20 07:52:47.166866 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166867 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj8tf\" (UniqueName: \"kubernetes.io/projected/5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2-kube-api-access-pj8tf\") pod \"insights-runtime-extractor-92gbs\" (UID: \"5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2\") " pod="openshift-insights/insights-runtime-extractor-92gbs" Apr 20 07:52:47.167281 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166933 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/211e6876-c742-49f7-b034-331aaa7d75b3-trusted-ca\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.167281 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.166985 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/211e6876-c742-49f7-b034-331aaa7d75b3-ca-trust-extracted\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.167371 ip-10-0-133-161 
kubenswrapper[2572]: I0420 07:52:47.167350 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/211e6876-c742-49f7-b034-331aaa7d75b3-registry-certificates\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.167627 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.167611 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/211e6876-c742-49f7-b034-331aaa7d75b3-trusted-ca\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.168886 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.168866 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/211e6876-c742-49f7-b034-331aaa7d75b3-image-registry-private-configuration\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.169134 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.169119 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/211e6876-c742-49f7-b034-331aaa7d75b3-registry-tls\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.169194 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.169177 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/211e6876-c742-49f7-b034-331aaa7d75b3-installation-pull-secrets\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.182676 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.182626 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxgbr\" (UniqueName: \"kubernetes.io/projected/211e6876-c742-49f7-b034-331aaa7d75b3-kube-api-access-xxgbr\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.182834 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.182805 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/211e6876-c742-49f7-b034-331aaa7d75b3-bound-sa-token\") pod \"image-registry-59fbcc4b4b-4ngd7\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:47.267815 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.267741 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/211e6876-c742-49f7-b034-331aaa7d75b3-trusted-ca\") pod \"211e6876-c742-49f7-b034-331aaa7d75b3\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " Apr 20 07:52:47.267815 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.267785 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/211e6876-c742-49f7-b034-331aaa7d75b3-ca-trust-extracted\") pod \"211e6876-c742-49f7-b034-331aaa7d75b3\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " Apr 20 07:52:47.267815 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.267803 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/211e6876-c742-49f7-b034-331aaa7d75b3-registry-tls\") pod \"211e6876-c742-49f7-b034-331aaa7d75b3\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " Apr 20 07:52:47.268047 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.267851 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/211e6876-c742-49f7-b034-331aaa7d75b3-image-registry-private-configuration\") pod \"211e6876-c742-49f7-b034-331aaa7d75b3\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " Apr 20 07:52:47.268047 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.267892 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/211e6876-c742-49f7-b034-331aaa7d75b3-registry-certificates\") pod \"211e6876-c742-49f7-b034-331aaa7d75b3\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " Apr 20 07:52:47.268047 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.267924 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxgbr\" (UniqueName: \"kubernetes.io/projected/211e6876-c742-49f7-b034-331aaa7d75b3-kube-api-access-xxgbr\") pod \"211e6876-c742-49f7-b034-331aaa7d75b3\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " Apr 20 07:52:47.268047 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.267961 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/211e6876-c742-49f7-b034-331aaa7d75b3-bound-sa-token\") pod \"211e6876-c742-49f7-b034-331aaa7d75b3\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " Apr 20 07:52:47.268047 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.267987 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/211e6876-c742-49f7-b034-331aaa7d75b3-installation-pull-secrets\") pod \"211e6876-c742-49f7-b034-331aaa7d75b3\" (UID: \"211e6876-c742-49f7-b034-331aaa7d75b3\") " Apr 20 07:52:47.268309 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268038 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/211e6876-c742-49f7-b034-331aaa7d75b3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "211e6876-c742-49f7-b034-331aaa7d75b3" (UID: "211e6876-c742-49f7-b034-331aaa7d75b3"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:52:47.268309 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268097 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2-data-volume\") pod \"insights-runtime-extractor-92gbs\" (UID: \"5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2\") " pod="openshift-insights/insights-runtime-extractor-92gbs" Apr 20 07:52:47.268309 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268132 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-92gbs\" (UID: \"5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2\") " pod="openshift-insights/insights-runtime-extractor-92gbs" Apr 20 07:52:47.268309 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268187 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-92gbs\" (UID: \"5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2\") " pod="openshift-insights/insights-runtime-extractor-92gbs" Apr 20 07:52:47.268309 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268248 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pj8tf\" (UniqueName: \"kubernetes.io/projected/5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2-kube-api-access-pj8tf\") pod \"insights-runtime-extractor-92gbs\" (UID: \"5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2\") " pod="openshift-insights/insights-runtime-extractor-92gbs" Apr 20 07:52:47.268563 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268217 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/211e6876-c742-49f7-b034-331aaa7d75b3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "211e6876-c742-49f7-b034-331aaa7d75b3" (UID: "211e6876-c742-49f7-b034-331aaa7d75b3"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:52:47.268563 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268335 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c4f5392a-e25d-4f52-b745-c32d100b1565-image-registry-private-configuration\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.268563 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268377 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrckw\" (UniqueName: \"kubernetes.io/projected/c4f5392a-e25d-4f52-b745-c32d100b1565-kube-api-access-qrckw\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.268563 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268418 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c4f5392a-e25d-4f52-b745-c32d100b1565-ca-trust-extracted\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.268563 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268429 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/211e6876-c742-49f7-b034-331aaa7d75b3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "211e6876-c742-49f7-b034-331aaa7d75b3" (UID: "211e6876-c742-49f7-b034-331aaa7d75b3"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:52:47.268563 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268444 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c4f5392a-e25d-4f52-b745-c32d100b1565-installation-pull-secrets\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.268563 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268504 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c4f5392a-e25d-4f52-b745-c32d100b1565-registry-tls\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.268563 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268533 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c4f5392a-e25d-4f52-b745-c32d100b1565-trusted-ca\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.269092 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268567 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c4f5392a-e25d-4f52-b745-c32d100b1565-registry-certificates\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.269092 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268605 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2-crio-socket\") pod \"insights-runtime-extractor-92gbs\" (UID: \"5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2\") " pod="openshift-insights/insights-runtime-extractor-92gbs" Apr 20 07:52:47.269092 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268630 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vtx5s\" (UniqueName: \"kubernetes.io/projected/f76edbc7-6592-484d-8941-01f12cf229e7-kube-api-access-vtx5s\") pod \"downloads-6bcc868b7-556kx\" (UID: \"f76edbc7-6592-484d-8941-01f12cf229e7\") " pod="openshift-console/downloads-6bcc868b7-556kx" Apr 20 07:52:47.269092 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268666 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c4f5392a-e25d-4f52-b745-c32d100b1565-bound-sa-token\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.269092 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268824 2572 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/211e6876-c742-49f7-b034-331aaa7d75b3-registry-certificates\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:52:47.269092 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268845 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/211e6876-c742-49f7-b034-331aaa7d75b3-trusted-ca\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:52:47.269092 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268861 2572 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/211e6876-c742-49f7-b034-331aaa7d75b3-ca-trust-extracted\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:52:47.269092 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.268914 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-92gbs\" (UID: \"5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2\") " pod="openshift-insights/insights-runtime-extractor-92gbs" Apr 20 07:52:47.269501 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.269204 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2-data-volume\") pod \"insights-runtime-extractor-92gbs\" (UID: \"5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2\") " pod="openshift-insights/insights-runtime-extractor-92gbs" Apr 20 07:52:47.269501 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.269246 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2-crio-socket\") pod \"insights-runtime-extractor-92gbs\" (UID: \"5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2\") " pod="openshift-insights/insights-runtime-extractor-92gbs" Apr 20 07:52:47.269619 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.269594 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c4f5392a-e25d-4f52-b745-c32d100b1565-registry-certificates\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.269862 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.269840 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c4f5392a-e25d-4f52-b745-c32d100b1565-ca-trust-extracted\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.270621 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.270593 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c4f5392a-e25d-4f52-b745-c32d100b1565-trusted-ca\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.271021 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.270991 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/211e6876-c742-49f7-b034-331aaa7d75b3-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "211e6876-c742-49f7-b034-331aaa7d75b3" (UID: "211e6876-c742-49f7-b034-331aaa7d75b3"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:52:47.271551 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.271523 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/211e6876-c742-49f7-b034-331aaa7d75b3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "211e6876-c742-49f7-b034-331aaa7d75b3" (UID: "211e6876-c742-49f7-b034-331aaa7d75b3"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:52:47.271643 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.271525 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/211e6876-c742-49f7-b034-331aaa7d75b3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "211e6876-c742-49f7-b034-331aaa7d75b3" (UID: "211e6876-c742-49f7-b034-331aaa7d75b3"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:52:47.271643 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.271602 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c4f5392a-e25d-4f52-b745-c32d100b1565-installation-pull-secrets\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.271745 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.271725 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/211e6876-c742-49f7-b034-331aaa7d75b3-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "211e6876-c742-49f7-b034-331aaa7d75b3" (UID: "211e6876-c742-49f7-b034-331aaa7d75b3"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:52:47.272081 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.272055 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-92gbs\" (UID: \"5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2\") " pod="openshift-insights/insights-runtime-extractor-92gbs" Apr 20 07:52:47.272741 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.272722 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c4f5392a-e25d-4f52-b745-c32d100b1565-image-registry-private-configuration\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.272935 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.272919 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c4f5392a-e25d-4f52-b745-c32d100b1565-registry-tls\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.272973 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.272921 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/211e6876-c742-49f7-b034-331aaa7d75b3-kube-api-access-xxgbr" (OuterVolumeSpecName: "kube-api-access-xxgbr") pod "211e6876-c742-49f7-b034-331aaa7d75b3" (UID: "211e6876-c742-49f7-b034-331aaa7d75b3"). InnerVolumeSpecName "kube-api-access-xxgbr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:52:47.276693 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.276676 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c4f5392a-e25d-4f52-b745-c32d100b1565-bound-sa-token\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.277296 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.277274 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtx5s\" (UniqueName: \"kubernetes.io/projected/f76edbc7-6592-484d-8941-01f12cf229e7-kube-api-access-vtx5s\") pod \"downloads-6bcc868b7-556kx\" (UID: \"f76edbc7-6592-484d-8941-01f12cf229e7\") " pod="openshift-console/downloads-6bcc868b7-556kx" Apr 20 07:52:47.277451 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.277431 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj8tf\" (UniqueName: \"kubernetes.io/projected/5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2-kube-api-access-pj8tf\") pod \"insights-runtime-extractor-92gbs\" (UID: \"5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2\") " pod="openshift-insights/insights-runtime-extractor-92gbs" Apr 20 07:52:47.277557 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.277529 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrckw\" (UniqueName: \"kubernetes.io/projected/c4f5392a-e25d-4f52-b745-c32d100b1565-kube-api-access-qrckw\") pod \"image-registry-64bbf84747-kdrcj\" (UID: \"c4f5392a-e25d-4f52-b745-c32d100b1565\") " pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.370228 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.370187 2572 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/211e6876-c742-49f7-b034-331aaa7d75b3-image-registry-private-configuration\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:52:47.370228 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.370220 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xxgbr\" (UniqueName: \"kubernetes.io/projected/211e6876-c742-49f7-b034-331aaa7d75b3-kube-api-access-xxgbr\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:52:47.370228 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.370234 2572 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/211e6876-c742-49f7-b034-331aaa7d75b3-bound-sa-token\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:52:47.370456 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.370250 2572 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/211e6876-c742-49f7-b034-331aaa7d75b3-installation-pull-secrets\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:52:47.370456 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.370262 2572 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/211e6876-c742-49f7-b034-331aaa7d75b3-registry-tls\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:52:47.419796 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.419769 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:47.444684 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.444652 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-556kx" Apr 20 07:52:47.459152 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.459115 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-92gbs" Apr 20 07:52:47.558056 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.557977 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-64bbf84747-kdrcj"] Apr 20 07:52:47.560948 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:52:47.560895 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4f5392a_e25d_4f52_b745_c32d100b1565.slice/crio-ebe89212105b0d37110b92dd82ca66ea9dd579bbb7269d1246f0a7b99f10e9f2 WatchSource:0}: Error finding container ebe89212105b0d37110b92dd82ca66ea9dd579bbb7269d1246f0a7b99f10e9f2: Status 404 returned error can't find the container with id ebe89212105b0d37110b92dd82ca66ea9dd579bbb7269d1246f0a7b99f10e9f2 Apr 20 07:52:47.601620 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.601584 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-556kx"] Apr 20 07:52:47.604339 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:52:47.604305 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf76edbc7_6592_484d_8941_01f12cf229e7.slice/crio-63f446785135dc5d0851b1bff2027685a7e17ee6b5064a239f86a2c16fa1542a WatchSource:0}: Error finding container 63f446785135dc5d0851b1bff2027685a7e17ee6b5064a239f86a2c16fa1542a: Status 404 returned error can't find the container with id 63f446785135dc5d0851b1bff2027685a7e17ee6b5064a239f86a2c16fa1542a Apr 20 07:52:47.632213 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:47.632188 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-92gbs"] Apr 20 07:52:47.634762 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:52:47.634736 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c7f4336_12d9_4746_8e7e_0aa8c48a9eb2.slice/crio-7e15791c7bcffe218353638f6adb9b557fc55792dbf5df24aec0bf2c41c070f8 WatchSource:0}: Error finding container 7e15791c7bcffe218353638f6adb9b557fc55792dbf5df24aec0bf2c41c070f8: Status 404 returned error can't find the container with id 7e15791c7bcffe218353638f6adb9b557fc55792dbf5df24aec0bf2c41c070f8 Apr 20 07:52:48.147375 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:48.147334 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" event={"ID":"c4f5392a-e25d-4f52-b745-c32d100b1565","Type":"ContainerStarted","Data":"525922f25a73dd439d95bd078cd5e7f89d428d51ad4cd8d685e84d8390a1a948"} Apr 20 07:52:48.147375 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:48.147374 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" event={"ID":"c4f5392a-e25d-4f52-b745-c32d100b1565","Type":"ContainerStarted","Data":"ebe89212105b0d37110b92dd82ca66ea9dd579bbb7269d1246f0a7b99f10e9f2"} Apr 20 07:52:48.147884 ip-10-0-133-161 kubenswrapper[2572]: I0420 
07:52:48.147410 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:52:48.148732 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:48.148709 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-92gbs" event={"ID":"5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2","Type":"ContainerStarted","Data":"9e9db931808c1b485173ac09410e468fd36fedb17e9038c4ecde89b01d76cb6f"} Apr 20 07:52:48.148732 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:48.148734 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-92gbs" event={"ID":"5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2","Type":"ContainerStarted","Data":"7e15791c7bcffe218353638f6adb9b557fc55792dbf5df24aec0bf2c41c070f8"} Apr 20 07:52:48.149625 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:48.149608 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-556kx" event={"ID":"f76edbc7-6592-484d-8941-01f12cf229e7","Type":"ContainerStarted","Data":"63f446785135dc5d0851b1bff2027685a7e17ee6b5064a239f86a2c16fa1542a"} Apr 20 07:52:48.149696 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:48.149641 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-59fbcc4b4b-4ngd7" Apr 20 07:52:48.167033 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:48.166993 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" podStartSLOduration=1.1669826269999999 podStartE2EDuration="1.166982627s" podCreationTimestamp="2026-04-20 07:52:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:52:48.165732057 +0000 UTC m=+152.079362101" watchObservedRunningTime="2026-04-20 07:52:48.166982627 +0000 UTC m=+152.080612669" Apr 20 07:52:48.191674 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:48.191642 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-59fbcc4b4b-4ngd7"] Apr 20 07:52:48.195720 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:48.195693 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-59fbcc4b4b-4ngd7"] Apr 20 07:52:48.646150 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:48.646105 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="211e6876-c742-49f7-b034-331aaa7d75b3" path="/var/lib/kubelet/pods/211e6876-c742-49f7-b034-331aaa7d75b3/volumes" Apr 20 07:52:49.154897 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:49.154854 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-92gbs" event={"ID":"5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2","Type":"ContainerStarted","Data":"bc716b3d44c6c4ff84d5741435812913130ea588a2b407aaf0b489a3e388c63f"} Apr 20 07:52:50.160158 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.160105 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-92gbs" event={"ID":"5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2","Type":"ContainerStarted","Data":"6097375ab73e6d3dab2ef5cfab15f2ef6b1accfcf7e944d4ad5554af98353a70"} Apr 20 07:52:50.178191 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.178118 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-insights/insights-runtime-extractor-92gbs" podStartSLOduration=0.895467388 podStartE2EDuration="3.178100097s" podCreationTimestamp="2026-04-20 07:52:47 +0000 UTC" firstStartedPulling="2026-04-20 07:52:47.707732998 +0000 UTC m=+151.621363027" lastFinishedPulling="2026-04-20 07:52:49.99036571 +0000 UTC m=+153.903995736" observedRunningTime="2026-04-20 07:52:50.176294725 +0000 UTC m=+154.089924800" watchObservedRunningTime="2026-04-20 07:52:50.178100097 +0000 UTC m=+154.091730137" Apr 20 07:52:50.755643 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.755609 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-pft7m"] Apr 20 07:52:50.759087 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.759054 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-pft7m" Apr 20 07:52:50.761871 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.761692 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 20 07:52:50.761871 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.761717 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 20 07:52:50.761871 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.761746 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 07:52:50.762793 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.762774 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-8jnz4\"" Apr 20 07:52:50.765400 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.765380 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-pft7m"] Apr 20 07:52:50.803366 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.803333 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/59fb366d-19d3-4935-bf25-956a86cebf06-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-pft7m\" (UID: \"59fb366d-19d3-4935-bf25-956a86cebf06\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pft7m" Apr 20 07:52:50.803537 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.803372 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l67qd\" (UniqueName: \"kubernetes.io/projected/59fb366d-19d3-4935-bf25-956a86cebf06-kube-api-access-l67qd\") pod \"prometheus-operator-5676c8c784-pft7m\" (UID: \"59fb366d-19d3-4935-bf25-956a86cebf06\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pft7m" Apr 20 07:52:50.803537 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.803465 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/59fb366d-19d3-4935-bf25-956a86cebf06-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-pft7m\" (UID: \"59fb366d-19d3-4935-bf25-956a86cebf06\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pft7m" Apr 20 07:52:50.803537 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.803532 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59fb366d-19d3-4935-bf25-956a86cebf06-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-pft7m\" (UID: \"59fb366d-19d3-4935-bf25-956a86cebf06\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pft7m" Apr 20 07:52:50.904829 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.904793 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/59fb366d-19d3-4935-bf25-956a86cebf06-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-pft7m\" (UID: \"59fb366d-19d3-4935-bf25-956a86cebf06\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pft7m" Apr 20 07:52:50.905003 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.904862 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59fb366d-19d3-4935-bf25-956a86cebf06-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-pft7m\" (UID: \"59fb366d-19d3-4935-bf25-956a86cebf06\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pft7m" Apr 20 07:52:50.905003 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.904892 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/59fb366d-19d3-4935-bf25-956a86cebf06-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-pft7m\" (UID: \"59fb366d-19d3-4935-bf25-956a86cebf06\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pft7m" Apr 20 07:52:50.905003 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.904917 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l67qd\" (UniqueName: \"kubernetes.io/projected/59fb366d-19d3-4935-bf25-956a86cebf06-kube-api-access-l67qd\") pod \"prometheus-operator-5676c8c784-pft7m\" (UID: \"59fb366d-19d3-4935-bf25-956a86cebf06\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pft7m" Apr 20 07:52:50.905736 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.905708 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/59fb366d-19d3-4935-bf25-956a86cebf06-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-pft7m\" (UID: \"59fb366d-19d3-4935-bf25-956a86cebf06\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pft7m" Apr 20 07:52:50.907572 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.907548 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/59fb366d-19d3-4935-bf25-956a86cebf06-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-pft7m\" (UID: \"59fb366d-19d3-4935-bf25-956a86cebf06\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pft7m" Apr 20 07:52:50.907688 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.907548 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/59fb366d-19d3-4935-bf25-956a86cebf06-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-pft7m\" (UID: \"59fb366d-19d3-4935-bf25-956a86cebf06\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-pft7m" Apr 20 07:52:50.912980 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:50.912911 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l67qd\" (UniqueName: \"kubernetes.io/projected/59fb366d-19d3-4935-bf25-956a86cebf06-kube-api-access-l67qd\") pod \"prometheus-operator-5676c8c784-pft7m\" (UID: \"59fb366d-19d3-4935-bf25-956a86cebf06\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-pft7m" Apr 20 07:52:51.071333 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:51.071250 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-pft7m" Apr 20 07:52:51.200253 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:51.200223 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-pft7m"] Apr 20 07:52:51.203000 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:52:51.202968 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59fb366d_19d3_4935_bf25_956a86cebf06.slice/crio-9dc5849b4dc63eb80baaf8a0a00bd0577a74c10db184750175539c162b8ebc8e WatchSource:0}: Error finding container 9dc5849b4dc63eb80baaf8a0a00bd0577a74c10db184750175539c162b8ebc8e: Status 404 returned error can't find the container with id 9dc5849b4dc63eb80baaf8a0a00bd0577a74c10db184750175539c162b8ebc8e Apr 20 07:52:52.170527 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:52.170466 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-pft7m" event={"ID":"59fb366d-19d3-4935-bf25-956a86cebf06","Type":"ContainerStarted","Data":"9dc5849b4dc63eb80baaf8a0a00bd0577a74c10db184750175539c162b8ebc8e"} Apr 20 07:52:52.532236 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:52.532196 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-7fps4" podUID="0993c493-f978-431e-9000-290ab9fb0bbe" Apr 20 07:52:52.546070 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:52.546014 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-9bxrl" podUID="b659a68e-b039-4864-b691-ff12b7393ed7" Apr 20 07:52:52.658914 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:52.658859 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-brq5h" podUID="07219834-44d6-42ab-9058-aed46274d1a8" Apr 20 07:52:53.175631 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:53.175588 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-pft7m" event={"ID":"59fb366d-19d3-4935-bf25-956a86cebf06","Type":"ContainerStarted","Data":"66b894e21eec78dcf7c5a1c3e57ccb138843ec31850097122678daeb71872904"} Apr 20 07:52:53.175631 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:53.175636 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-pft7m" 
event={"ID":"59fb366d-19d3-4935-bf25-956a86cebf06","Type":"ContainerStarted","Data":"2c729bf125607ec120b2a6c3e61cffd9c321b8b2bd919ab3db4b033b6613e67a"} Apr 20 07:52:53.175961 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:53.175724 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7fps4" Apr 20 07:52:53.192584 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:53.192537 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-pft7m" podStartSLOduration=1.913007453 podStartE2EDuration="3.192521682s" podCreationTimestamp="2026-04-20 07:52:50 +0000 UTC" firstStartedPulling="2026-04-20 07:52:51.205688967 +0000 UTC m=+155.119318992" lastFinishedPulling="2026-04-20 07:52:52.485203186 +0000 UTC m=+156.398833221" observedRunningTime="2026-04-20 07:52:53.191385259 +0000 UTC m=+157.105015302" watchObservedRunningTime="2026-04-20 07:52:53.192521682 +0000 UTC m=+157.106151724" Apr 20 07:52:54.337351 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.337314 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-67c5797466-qh8fs"] Apr 20 07:52:54.342026 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.342006 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.344674 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.344651 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 20 07:52:54.344674 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.344657 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 20 07:52:54.345858 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.345840 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 20 07:52:54.345933 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.345850 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 20 07:52:54.345933 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.345899 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-pwpw4\"" Apr 20 07:52:54.345933 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.345912 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 20 07:52:54.350488 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.350461 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67c5797466-qh8fs"] Apr 20 07:52:54.437710 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.437671 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxjbg\" (UniqueName: \"kubernetes.io/projected/fa1a9513-a1a2-4607-acc6-9c79e136d6af-kube-api-access-gxjbg\") pod \"console-67c5797466-qh8fs\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.437884 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.437718 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/fa1a9513-a1a2-4607-acc6-9c79e136d6af-console-serving-cert\") pod \"console-67c5797466-qh8fs\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.437884 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.437824 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa1a9513-a1a2-4607-acc6-9c79e136d6af-oauth-serving-cert\") pod \"console-67c5797466-qh8fs\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.437884 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.437864 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa1a9513-a1a2-4607-acc6-9c79e136d6af-console-config\") pod \"console-67c5797466-qh8fs\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.438005 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.437900 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa1a9513-a1a2-4607-acc6-9c79e136d6af-console-oauth-config\") pod \"console-67c5797466-qh8fs\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.438005 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.437927 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa1a9513-a1a2-4607-acc6-9c79e136d6af-service-ca\") pod \"console-67c5797466-qh8fs\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.539336 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.539300 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa1a9513-a1a2-4607-acc6-9c79e136d6af-console-oauth-config\") pod \"console-67c5797466-qh8fs\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.539518 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.539429 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa1a9513-a1a2-4607-acc6-9c79e136d6af-service-ca\") pod \"console-67c5797466-qh8fs\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.539576 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.539532 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gxjbg\" (UniqueName: \"kubernetes.io/projected/fa1a9513-a1a2-4607-acc6-9c79e136d6af-kube-api-access-gxjbg\") pod \"console-67c5797466-qh8fs\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.539630 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.539572 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa1a9513-a1a2-4607-acc6-9c79e136d6af-console-serving-cert\") pod \"console-67c5797466-qh8fs\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " 
pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.539681 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.539649 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa1a9513-a1a2-4607-acc6-9c79e136d6af-oauth-serving-cert\") pod \"console-67c5797466-qh8fs\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.539731 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.539681 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa1a9513-a1a2-4607-acc6-9c79e136d6af-console-config\") pod \"console-67c5797466-qh8fs\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.540236 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.540202 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa1a9513-a1a2-4607-acc6-9c79e136d6af-service-ca\") pod \"console-67c5797466-qh8fs\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.540443 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.540415 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa1a9513-a1a2-4607-acc6-9c79e136d6af-oauth-serving-cert\") pod \"console-67c5797466-qh8fs\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.540929 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.540902 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa1a9513-a1a2-4607-acc6-9c79e136d6af-console-config\") pod \"console-67c5797466-qh8fs\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.542244 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.542218 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa1a9513-a1a2-4607-acc6-9c79e136d6af-console-oauth-config\") pod \"console-67c5797466-qh8fs\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.542349 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.542280 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa1a9513-a1a2-4607-acc6-9c79e136d6af-console-serving-cert\") pod \"console-67c5797466-qh8fs\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.547394 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.547369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxjbg\" (UniqueName: \"kubernetes.io/projected/fa1a9513-a1a2-4607-acc6-9c79e136d6af-kube-api-access-gxjbg\") pod \"console-67c5797466-qh8fs\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.652087 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.652012 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:52:54.784077 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:54.782899 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67c5797466-qh8fs"] Apr 20 07:52:54.785314 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:52:54.785275 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa1a9513_a1a2_4607_acc6_9c79e136d6af.slice/crio-98a6e9c66a9526e61257f63536f1dfa37c2287a35d4c2b3d2d87925b1505f298 WatchSource:0}: Error finding container 98a6e9c66a9526e61257f63536f1dfa37c2287a35d4c2b3d2d87925b1505f298: Status 404 returned error can't find the container with id 98a6e9c66a9526e61257f63536f1dfa37c2287a35d4c2b3d2d87925b1505f298 Apr 20 07:52:55.123820 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.123783 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-p5czn"] Apr 20 07:52:55.128954 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.128931 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.131762 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.131735 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 07:52:55.131762 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.131756 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-59mhk\"" Apr 20 07:52:55.131953 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.131821 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 07:52:55.131953 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.131827 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 07:52:55.182127 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.182092 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67c5797466-qh8fs" event={"ID":"fa1a9513-a1a2-4607-acc6-9c79e136d6af","Type":"ContainerStarted","Data":"98a6e9c66a9526e61257f63536f1dfa37c2287a35d4c2b3d2d87925b1505f298"} Apr 20 07:52:55.246933 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.246893 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-node-exporter-wtmp\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.247134 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.246971 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-node-exporter-textfile\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.247134 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.247025 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-node-exporter-tls\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.247134 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.247061 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-sys\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.247134 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.247115 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9jzh\" (UniqueName: \"kubernetes.io/projected/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-kube-api-access-c9jzh\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.247332 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.247208 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.247332 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.247228 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-node-exporter-accelerators-collector-config\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.247332 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.247254 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-root\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.247332 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.247293 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-metrics-client-ca\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.348389 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.348350 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-node-exporter-tls\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.348773 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.348408 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-sys\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " 
pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.348773 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.348439 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9jzh\" (UniqueName: \"kubernetes.io/projected/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-kube-api-access-c9jzh\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.348773 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.348490 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.348773 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.348521 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-node-exporter-accelerators-collector-config\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.348773 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.348535 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-sys\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.348773 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.348554 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-root\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.348773 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.348581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-metrics-client-ca\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.348773 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.348613 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-node-exporter-wtmp\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.348773 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.348658 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-node-exporter-textfile\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.349355 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.348847 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-node-exporter-wtmp\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.349355 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.349085 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-node-exporter-textfile\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.349355 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.349191 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-root\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.349500 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.349417 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-node-exporter-accelerators-collector-config\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.349689 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.349670 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-metrics-client-ca\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.351249 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.351202 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.351349 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.351286 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-node-exporter-tls\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.356667 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.356647 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9jzh\" (UniqueName: \"kubernetes.io/projected/c687b47d-f84d-4ae3-a6ce-0f5596edfa15-kube-api-access-c9jzh\") pod \"node-exporter-p5czn\" (UID: \"c687b47d-f84d-4ae3-a6ce-0f5596edfa15\") " pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.441347 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:55.441215 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-p5czn" Apr 20 07:52:55.452129 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:52:55.452102 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc687b47d_f84d_4ae3_a6ce_0f5596edfa15.slice/crio-2f34e643a86d6fbb5c1ec189d735830fd815b4cc13087519dbf8f908d8f24002 WatchSource:0}: Error finding container 2f34e643a86d6fbb5c1ec189d735830fd815b4cc13087519dbf8f908d8f24002: Status 404 returned error can't find the container with id 2f34e643a86d6fbb5c1ec189d735830fd815b4cc13087519dbf8f908d8f24002 Apr 20 07:52:56.165753 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.164893 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 07:52:56.170753 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.170725 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.180547 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.173467 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 07:52:56.180547 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.173903 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 07:52:56.180547 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.174062 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 07:52:56.180547 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.174268 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 07:52:56.180547 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.174494 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 07:52:56.180547 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.174530 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 07:52:56.180547 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.174692 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 07:52:56.180547 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.174741 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 07:52:56.180547 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.174887 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 07:52:56.180547 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.175759 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-szpk9\"" Apr 20 07:52:56.180547 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.179814 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 07:52:56.188464 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.188413 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-p5czn" event={"ID":"c687b47d-f84d-4ae3-a6ce-0f5596edfa15","Type":"ContainerStarted","Data":"2f34e643a86d6fbb5c1ec189d735830fd815b4cc13087519dbf8f908d8f24002"} Apr 20 07:52:56.262867 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.262766 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e5b3271-5e8e-407c-8aff-da8f698938dd-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.262867 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.262822 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-config-volume\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.263099 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.262872 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-web-config\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.263099 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.262912 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.263099 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.262939 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.263099 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.262964 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e5b3271-5e8e-407c-8aff-da8f698938dd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.263099 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.263010 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.263099 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.263058 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/8e5b3271-5e8e-407c-8aff-da8f698938dd-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.263420 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.263156 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e5b3271-5e8e-407c-8aff-da8f698938dd-config-out\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.263420 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.263188 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.263420 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.263256 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e5b3271-5e8e-407c-8aff-da8f698938dd-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.263420 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.263282 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.263420 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.263317 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbdhj\" (UniqueName: \"kubernetes.io/projected/8e5b3271-5e8e-407c-8aff-da8f698938dd-kube-api-access-jbdhj\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.366376 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.364918 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e5b3271-5e8e-407c-8aff-da8f698938dd-config-out\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.366376 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.364965 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.366376 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.365021 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e5b3271-5e8e-407c-8aff-da8f698938dd-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.366376 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.365045 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.366376 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.365077 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbdhj\" (UniqueName: \"kubernetes.io/projected/8e5b3271-5e8e-407c-8aff-da8f698938dd-kube-api-access-jbdhj\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.366376 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.365106 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e5b3271-5e8e-407c-8aff-da8f698938dd-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.366376 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.365153 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-config-volume\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.366376 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.365179 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-web-config\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.366376 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.365205 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.366376 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.365231 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.366376 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.365256 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e5b3271-5e8e-407c-8aff-da8f698938dd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.366376 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.365316 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.366376 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.365361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8e5b3271-5e8e-407c-8aff-da8f698938dd-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.366376 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.365746 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8e5b3271-5e8e-407c-8aff-da8f698938dd-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.369868 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:52:56.367895 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8e5b3271-5e8e-407c-8aff-da8f698938dd-alertmanager-trusted-ca-bundle podName:8e5b3271-5e8e-407c-8aff-da8f698938dd nodeName:}" failed. No retries permitted until 2026-04-20 07:52:56.867873613 +0000 UTC m=+160.781503639 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/8e5b3271-5e8e-407c-8aff-da8f698938dd-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "8e5b3271-5e8e-407c-8aff-da8f698938dd") : configmap references non-existent config key: ca-bundle.crt Apr 20 07:52:56.370957 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.370932 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e5b3271-5e8e-407c-8aff-da8f698938dd-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.372054 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.372026 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.373173 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.373112 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.374054 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.374005 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.375166 ip-10-0-133-161 kubenswrapper[2572]: 
I0420 07:52:56.375130 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e5b3271-5e8e-407c-8aff-da8f698938dd-config-out\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.375765 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.375561 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-config-volume\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.376326 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.376283 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.376988 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.376962 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.377444 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.377416 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-web-config\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.377787 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.377737 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e5b3271-5e8e-407c-8aff-da8f698938dd-tls-assets\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.379114 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.379045 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbdhj\" (UniqueName: \"kubernetes.io/projected/8e5b3271-5e8e-407c-8aff-da8f698938dd-kube-api-access-jbdhj\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.871272 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.871232 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e5b3271-5e8e-407c-8aff-da8f698938dd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:56.872242 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:56.872218 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e5b3271-5e8e-407c-8aff-da8f698938dd-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:57.094108 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:57.094067 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:52:57.193280 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:57.193097 2572 generic.go:358] "Generic (PLEG): container finished" podID="c687b47d-f84d-4ae3-a6ce-0f5596edfa15" containerID="7b571dc490bcd5d5435f3c48593eed4844b08a65c5753e6b3b23c284c603bb86" exitCode=0 Apr 20 07:52:57.193280 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:57.193198 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p5czn" event={"ID":"c687b47d-f84d-4ae3-a6ce-0f5596edfa15","Type":"ContainerDied","Data":"7b571dc490bcd5d5435f3c48593eed4844b08a65c5753e6b3b23c284c603bb86"} Apr 20 07:52:57.477700 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:57.477486 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls\") pod \"dns-default-7fps4\" (UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:52:57.477700 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:57.477677 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert\") pod \"ingress-canary-9bxrl\" (UID: \"b659a68e-b039-4864-b691-ff12b7393ed7\") " pod="openshift-ingress-canary/ingress-canary-9bxrl" Apr 20 07:52:57.480985 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:57.480821 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0993c493-f978-431e-9000-290ab9fb0bbe-metrics-tls\") pod \"dns-default-7fps4\" (UID: \"0993c493-f978-431e-9000-290ab9fb0bbe\") " pod="openshift-dns/dns-default-7fps4" Apr 20 07:52:57.481123 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:57.481085 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b659a68e-b039-4864-b691-ff12b7393ed7-cert\") pod \"ingress-canary-9bxrl\" (UID: \"b659a68e-b039-4864-b691-ff12b7393ed7\") " pod="openshift-ingress-canary/ingress-canary-9bxrl" Apr 20 07:52:57.679061 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:57.679029 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dvf6t\"" Apr 20 07:52:57.687373 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:57.687291 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7fps4" Apr 20 07:52:59.883240 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:59.883188 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4s2g8"] Apr 20 07:52:59.886677 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:59.886655 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s2g8" Apr 20 07:52:59.889579 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:59.889177 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-sldxb\"" Apr 20 07:52:59.889579 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:59.889419 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 20 07:52:59.892722 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:52:59.892700 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4s2g8"] Apr 20 07:53:00.005113 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:00.005077 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1b008e26-4827-4055-94d5-c9036809bd51-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4s2g8\" (UID: \"1b008e26-4827-4055-94d5-c9036809bd51\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s2g8" Apr 20 07:53:00.106065 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:00.105957 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1b008e26-4827-4055-94d5-c9036809bd51-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4s2g8\" (UID: \"1b008e26-4827-4055-94d5-c9036809bd51\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s2g8" Apr 20 07:53:00.108970 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:00.108939 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/1b008e26-4827-4055-94d5-c9036809bd51-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4s2g8\" (UID: \"1b008e26-4827-4055-94d5-c9036809bd51\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s2g8" Apr 20 07:53:00.199945 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:00.199852 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s2g8" Apr 20 07:53:04.529547 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.529509 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7967d78859-69vhd"] Apr 20 07:53:04.533248 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.533224 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.541990 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.541824 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 20 07:53:04.543513 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.543490 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7967d78859-69vhd"] Apr 20 07:53:04.651303 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.651269 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-trusted-ca-bundle\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.651482 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.651319 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b40bc6c0-48ba-464f-a54f-e8678037e82e-console-serving-cert\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.651482 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.651393 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbkxf\" (UniqueName: \"kubernetes.io/projected/b40bc6c0-48ba-464f-a54f-e8678037e82e-kube-api-access-lbkxf\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.651482 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.651418 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b40bc6c0-48ba-464f-a54f-e8678037e82e-console-oauth-config\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.651482 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.651465 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-service-ca\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.651655 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.651495 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-oauth-serving-cert\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.651655 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.651540 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-console-config\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.752956 ip-10-0-133-161 
kubenswrapper[2572]: I0420 07:53:04.752917 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbkxf\" (UniqueName: \"kubernetes.io/projected/b40bc6c0-48ba-464f-a54f-e8678037e82e-kube-api-access-lbkxf\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.752956 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.752960 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b40bc6c0-48ba-464f-a54f-e8678037e82e-console-oauth-config\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.753279 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.753043 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-service-ca\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.753279 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.753098 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-oauth-serving-cert\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.753279 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.753155 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-console-config\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.753279 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.753240 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-trusted-ca-bundle\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.753279 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.753272 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b40bc6c0-48ba-464f-a54f-e8678037e82e-console-serving-cert\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.753933 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.753906 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-oauth-serving-cert\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.754074 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.754044 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-service-ca\") pod 
\"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.754310 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.754278 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-console-config\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.754381 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.754343 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-trusted-ca-bundle\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.755823 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.755794 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b40bc6c0-48ba-464f-a54f-e8678037e82e-console-serving-cert\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.755939 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.755850 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b40bc6c0-48ba-464f-a54f-e8678037e82e-console-oauth-config\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.761388 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.761362 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbkxf\" (UniqueName: \"kubernetes.io/projected/b40bc6c0-48ba-464f-a54f-e8678037e82e-kube-api-access-lbkxf\") pod \"console-7967d78859-69vhd\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:04.846033 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:04.845944 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:05.231520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:05.231486 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67c5797466-qh8fs" event={"ID":"fa1a9513-a1a2-4607-acc6-9c79e136d6af","Type":"ContainerStarted","Data":"3cf791d04957d1bdc5fd52e46c59abc7b3ee012fc8bf9b4fc3213a7d95a6f973"} Apr 20 07:53:05.236255 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:05.236211 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p5czn" event={"ID":"c687b47d-f84d-4ae3-a6ce-0f5596edfa15","Type":"ContainerStarted","Data":"6489f72d1f66bc7bf83e7eabbc6554b6008bb12317c3ed2a702de542c6598160"} Apr 20 07:53:05.257820 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:05.257764 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67c5797466-qh8fs" podStartSLOduration=0.952190385 podStartE2EDuration="11.25774495s" podCreationTimestamp="2026-04-20 07:52:54 +0000 UTC" firstStartedPulling="2026-04-20 07:52:54.787458746 +0000 UTC m=+158.701088774" lastFinishedPulling="2026-04-20 07:53:05.093013304 +0000 UTC m=+169.006643339" observedRunningTime="2026-04-20 07:53:05.256101868 +0000 UTC m=+169.169731923" watchObservedRunningTime="2026-04-20 07:53:05.25774495 +0000 UTC m=+169.171374996" Apr 20 07:53:05.291576 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:05.291532 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7fps4"] Apr 20 07:53:05.309556 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:05.309522 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7967d78859-69vhd"] Apr 20 07:53:05.316700 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:53:05.316664 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb40bc6c0_48ba_464f_a54f_e8678037e82e.slice/crio-380a416aa49977e0917eafe625fa648d92ca2f224d03a3e4f2bfbf8577a82eae WatchSource:0}: Error finding container 380a416aa49977e0917eafe625fa648d92ca2f224d03a3e4f2bfbf8577a82eae: Status 404 returned error can't find the container with id 380a416aa49977e0917eafe625fa648d92ca2f224d03a3e4f2bfbf8577a82eae Apr 20 07:53:05.533796 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:05.533746 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4s2g8"] Apr 20 07:53:05.537174 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:53:05.537130 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b008e26_4827_4055_94d5_c9036809bd51.slice/crio-bf59ffa577ae32afbe510623015e29ee5f4b9ca063b5651e5513d9a11e316527 WatchSource:0}: Error finding container bf59ffa577ae32afbe510623015e29ee5f4b9ca063b5651e5513d9a11e316527: Status 404 returned error can't find the container with id bf59ffa577ae32afbe510623015e29ee5f4b9ca063b5651e5513d9a11e316527 Apr 20 07:53:05.540391 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:05.540252 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 07:53:05.543177 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:53:05.543129 2572 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e5b3271_5e8e_407c_8aff_da8f698938dd.slice/crio-bab2b3881c3cdd27300b954b688fdbee953ca00d5c795861cfaeed630abe25f0 WatchSource:0}: Error finding container bab2b3881c3cdd27300b954b688fdbee953ca00d5c795861cfaeed630abe25f0: Status 404 returned error can't find the container with id bab2b3881c3cdd27300b954b688fdbee953ca00d5c795861cfaeed630abe25f0 Apr 20 07:53:06.242244 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:06.242191 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e5b3271-5e8e-407c-8aff-da8f698938dd","Type":"ContainerStarted","Data":"bab2b3881c3cdd27300b954b688fdbee953ca00d5c795861cfaeed630abe25f0"} Apr 20 07:53:06.247216 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:06.247184 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7967d78859-69vhd" event={"ID":"b40bc6c0-48ba-464f-a54f-e8678037e82e","Type":"ContainerStarted","Data":"8dda1745adef95605672154dfbe582a0d0183eecd04ffd2fa20d8f370895e9ce"} Apr 20 07:53:06.247379 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:06.247227 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7967d78859-69vhd" event={"ID":"b40bc6c0-48ba-464f-a54f-e8678037e82e","Type":"ContainerStarted","Data":"380a416aa49977e0917eafe625fa648d92ca2f224d03a3e4f2bfbf8577a82eae"} Apr 20 07:53:06.251739 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:06.251356 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p5czn" event={"ID":"c687b47d-f84d-4ae3-a6ce-0f5596edfa15","Type":"ContainerStarted","Data":"bd44f6d5972e57e80ae43dc634116f94f96ee5c3356df983aa117997713c45af"} Apr 20 07:53:06.253753 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:06.253700 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7fps4" event={"ID":"0993c493-f978-431e-9000-290ab9fb0bbe","Type":"ContainerStarted","Data":"ae7df058fa912fcee4528cadba14d66b3b4913faa92fb47975349d828743ecfd"} Apr 20 07:53:06.255428 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:06.255401 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s2g8" event={"ID":"1b008e26-4827-4055-94d5-c9036809bd51","Type":"ContainerStarted","Data":"bf59ffa577ae32afbe510623015e29ee5f4b9ca063b5651e5513d9a11e316527"} Apr 20 07:53:06.258342 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:06.258313 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-556kx" event={"ID":"f76edbc7-6592-484d-8941-01f12cf229e7","Type":"ContainerStarted","Data":"d9b1d567e0aa41a490cd29670e80ac5b1b22bb74b4c2fff4e0788b5d306d3eaf"} Apr 20 07:53:06.268882 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:06.268834 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7967d78859-69vhd" podStartSLOduration=2.268819538 podStartE2EDuration="2.268819538s" podCreationTimestamp="2026-04-20 07:53:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:53:06.266126969 +0000 UTC m=+170.179757033" watchObservedRunningTime="2026-04-20 07:53:06.268819538 +0000 UTC m=+170.182449587" Apr 20 07:53:06.292163 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:06.291837 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/downloads-6bcc868b7-556kx" podStartSLOduration=1.728840014 podStartE2EDuration="19.291819819s" podCreationTimestamp="2026-04-20 07:52:47 +0000 UTC" firstStartedPulling="2026-04-20 07:52:47.606188892 +0000 UTC m=+151.519818917" lastFinishedPulling="2026-04-20 07:53:05.169168689 +0000 UTC m=+169.082798722" observedRunningTime="2026-04-20 07:53:06.290254471 +0000 UTC m=+170.203884517" watchObservedRunningTime="2026-04-20 07:53:06.291819819 +0000 UTC m=+170.205449864" Apr 20 07:53:06.665087 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:06.664987 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-p5czn" podStartSLOduration=10.721663788 podStartE2EDuration="11.664966345s" podCreationTimestamp="2026-04-20 07:52:55 +0000 UTC" firstStartedPulling="2026-04-20 07:52:55.45419962 +0000 UTC m=+159.367829643" lastFinishedPulling="2026-04-20 07:52:56.397502166 +0000 UTC m=+160.311132200" observedRunningTime="2026-04-20 07:53:06.311444696 +0000 UTC m=+170.225074752" watchObservedRunningTime="2026-04-20 07:53:06.664966345 +0000 UTC m=+170.578596389" Apr 20 07:53:07.262672 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:07.262637 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-556kx" Apr 20 07:53:07.279708 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:07.279678 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-556kx" Apr 20 07:53:07.641498 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:07.641388 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:53:07.641741 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:07.641720 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9bxrl" Apr 20 07:53:07.644767 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:07.644736 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-mxq77\"" Apr 20 07:53:07.652935 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:07.652910 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9bxrl" Apr 20 07:53:08.518521 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:08.517603 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9bxrl"] Apr 20 07:53:08.531558 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:53:08.531520 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb659a68e_b039_4864_b691_ff12b7393ed7.slice/crio-11618133a881c50cbc89a1dfccaf91e1d15a1b6836c2eb8cbd0ce75ebadadfe6 WatchSource:0}: Error finding container 11618133a881c50cbc89a1dfccaf91e1d15a1b6836c2eb8cbd0ce75ebadadfe6: Status 404 returned error can't find the container with id 11618133a881c50cbc89a1dfccaf91e1d15a1b6836c2eb8cbd0ce75ebadadfe6 Apr 20 07:53:09.160872 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:09.160841 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-64bbf84747-kdrcj" Apr 20 07:53:09.274200 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:09.274157 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9bxrl" event={"ID":"b659a68e-b039-4864-b691-ff12b7393ed7","Type":"ContainerStarted","Data":"11618133a881c50cbc89a1dfccaf91e1d15a1b6836c2eb8cbd0ce75ebadadfe6"} Apr 20 07:53:09.276706 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:09.276664 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7fps4" event={"ID":"0993c493-f978-431e-9000-290ab9fb0bbe","Type":"ContainerStarted","Data":"f23b6fbcbec5bd669e69c20388ce731c9619825d9443341b37b3bb622fe71567"} Apr 20 07:53:09.276853 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:09.276712 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7fps4" event={"ID":"0993c493-f978-431e-9000-290ab9fb0bbe","Type":"ContainerStarted","Data":"30971a8371efd352ba1f6bf0e4ae1d388b1be600a47389e21e73799ebbfec055"} Apr 20 07:53:09.276853 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:09.276772 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-7fps4" Apr 20 07:53:09.279210 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:09.279182 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s2g8" event={"ID":"1b008e26-4827-4055-94d5-c9036809bd51","Type":"ContainerStarted","Data":"0ca228d1f2bfb41f73a6ceab77a4cb6950499edc0f160ab13972ac089f758f3a"} Apr 20 07:53:09.279508 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:09.279480 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s2g8" Apr 20 07:53:09.282840 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:09.282814 2572 generic.go:358] "Generic (PLEG): container finished" podID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerID="70e538e9f9f4b8f477ace9a6fdb89a6e88fe31faf633729c426a1232a13ae2a2" exitCode=0 Apr 20 07:53:09.283395 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:09.283346 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e5b3271-5e8e-407c-8aff-da8f698938dd","Type":"ContainerDied","Data":"70e538e9f9f4b8f477ace9a6fdb89a6e88fe31faf633729c426a1232a13ae2a2"} Apr 20 07:53:09.285856 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:09.285823 2572 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s2g8" Apr 20 07:53:09.296771 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:09.296658 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7fps4" podStartSLOduration=137.247821556 podStartE2EDuration="2m20.296642374s" podCreationTimestamp="2026-04-20 07:50:49 +0000 UTC" firstStartedPulling="2026-04-20 07:53:05.294706428 +0000 UTC m=+169.208336462" lastFinishedPulling="2026-04-20 07:53:08.343527254 +0000 UTC m=+172.257157280" observedRunningTime="2026-04-20 07:53:09.294834778 +0000 UTC m=+173.208464822" watchObservedRunningTime="2026-04-20 07:53:09.296642374 +0000 UTC m=+173.210272418" Apr 20 07:53:09.336569 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:09.335998 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4s2g8" podStartSLOduration=7.52817477 podStartE2EDuration="10.335976817s" podCreationTimestamp="2026-04-20 07:52:59 +0000 UTC" firstStartedPulling="2026-04-20 07:53:05.539950828 +0000 UTC m=+169.453580863" lastFinishedPulling="2026-04-20 07:53:08.347752889 +0000 UTC m=+172.261382910" observedRunningTime="2026-04-20 07:53:09.333713323 +0000 UTC m=+173.247343368" watchObservedRunningTime="2026-04-20 07:53:09.335976817 +0000 UTC m=+173.249606860" Apr 20 07:53:12.297742 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:12.297698 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e5b3271-5e8e-407c-8aff-da8f698938dd","Type":"ContainerStarted","Data":"854f62be27883756e23021ea714be4fd5de6a4409fa8498d56a0566837017557"} Apr 20 07:53:12.298232 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:12.297749 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e5b3271-5e8e-407c-8aff-da8f698938dd","Type":"ContainerStarted","Data":"897bbb98baaf0b8e51fcc038fda0172d281ec3d0b39da90e7c081ac902fc4bc1"} Apr 20 07:53:12.298232 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:12.297766 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e5b3271-5e8e-407c-8aff-da8f698938dd","Type":"ContainerStarted","Data":"1fce28013923bba5368922770eee73c952a54054eaa34b30187c0b929e3a85d9"} Apr 20 07:53:12.298232 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:12.297780 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e5b3271-5e8e-407c-8aff-da8f698938dd","Type":"ContainerStarted","Data":"a10cc27848af63adea29bfd0f5011191843b0137953e13a6057468e859c5afc1"} Apr 20 07:53:12.298232 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:12.297796 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e5b3271-5e8e-407c-8aff-da8f698938dd","Type":"ContainerStarted","Data":"304cbdd2e6745f6d925d3463ac47001bb44caf4b5e4325b8cc6ab25df580caeb"} Apr 20 07:53:12.299248 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:12.299215 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9bxrl" event={"ID":"b659a68e-b039-4864-b691-ff12b7393ed7","Type":"ContainerStarted","Data":"d9fe25e1c604434556913b6ce269e0554d9ac7f480476573686c05ea9d65edba"} Apr 20 07:53:12.316177 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:12.316105 2572 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-ingress-canary/ingress-canary-9bxrl" podStartSLOduration=140.380259849 podStartE2EDuration="2m23.316091579s" podCreationTimestamp="2026-04-20 07:50:49 +0000 UTC" firstStartedPulling="2026-04-20 07:53:08.534723794 +0000 UTC m=+172.448353815" lastFinishedPulling="2026-04-20 07:53:11.470555514 +0000 UTC m=+175.384185545" observedRunningTime="2026-04-20 07:53:12.31381944 +0000 UTC m=+176.227449482" watchObservedRunningTime="2026-04-20 07:53:12.316091579 +0000 UTC m=+176.229721621" Apr 20 07:53:14.312852 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:14.312813 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e5b3271-5e8e-407c-8aff-da8f698938dd","Type":"ContainerStarted","Data":"7a6effb10feb6710056b920717a6bc23bd3ba3ddf5cdde64bd3cc1128213508e"} Apr 20 07:53:14.339821 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:14.339764 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=10.343310662 podStartE2EDuration="18.339745114s" podCreationTimestamp="2026-04-20 07:52:56 +0000 UTC" firstStartedPulling="2026-04-20 07:53:05.545155605 +0000 UTC m=+169.458785627" lastFinishedPulling="2026-04-20 07:53:13.541590058 +0000 UTC m=+177.455220079" observedRunningTime="2026-04-20 07:53:14.337888287 +0000 UTC m=+178.251518359" watchObservedRunningTime="2026-04-20 07:53:14.339745114 +0000 UTC m=+178.253375158" Apr 20 07:53:14.652296 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:14.652208 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:53:14.652450 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:14.652292 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:53:14.657730 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:14.657707 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:53:14.846793 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:14.846743 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:14.846997 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:14.846850 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:14.852690 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:14.852666 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:15.321190 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:15.321134 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:53:15.321665 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:15.321301 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:53:15.378982 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:15.378953 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67c5797466-qh8fs"] Apr 20 07:53:19.290715 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:19.290684 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7fps4" Apr 20 07:53:42.341882 
ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.341814 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-67c5797466-qh8fs" podUID="fa1a9513-a1a2-4607-acc6-9c79e136d6af" containerName="console" containerID="cri-o://3cf791d04957d1bdc5fd52e46c59abc7b3ee012fc8bf9b4fc3213a7d95a6f973" gracePeriod=15 Apr 20 07:53:42.616218 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.616195 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67c5797466-qh8fs_fa1a9513-a1a2-4607-acc6-9c79e136d6af/console/0.log" Apr 20 07:53:42.616335 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.616256 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:53:42.692995 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.692962 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa1a9513-a1a2-4607-acc6-9c79e136d6af-service-ca\") pod \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " Apr 20 07:53:42.693190 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.693015 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa1a9513-a1a2-4607-acc6-9c79e136d6af-console-serving-cert\") pod \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " Apr 20 07:53:42.693190 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.693041 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa1a9513-a1a2-4607-acc6-9c79e136d6af-console-config\") pod \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " Apr 20 07:53:42.693190 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.693088 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxjbg\" (UniqueName: \"kubernetes.io/projected/fa1a9513-a1a2-4607-acc6-9c79e136d6af-kube-api-access-gxjbg\") pod \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " Apr 20 07:53:42.693190 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.693159 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa1a9513-a1a2-4607-acc6-9c79e136d6af-console-oauth-config\") pod \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " Apr 20 07:53:42.693190 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.693180 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa1a9513-a1a2-4607-acc6-9c79e136d6af-oauth-serving-cert\") pod \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\" (UID: \"fa1a9513-a1a2-4607-acc6-9c79e136d6af\") " Apr 20 07:53:42.693455 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.693404 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa1a9513-a1a2-4607-acc6-9c79e136d6af-service-ca" (OuterVolumeSpecName: "service-ca") pod "fa1a9513-a1a2-4607-acc6-9c79e136d6af" (UID: "fa1a9513-a1a2-4607-acc6-9c79e136d6af"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:53:42.693584 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.693558 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa1a9513-a1a2-4607-acc6-9c79e136d6af-console-config" (OuterVolumeSpecName: "console-config") pod "fa1a9513-a1a2-4607-acc6-9c79e136d6af" (UID: "fa1a9513-a1a2-4607-acc6-9c79e136d6af"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:53:42.693676 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.693609 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa1a9513-a1a2-4607-acc6-9c79e136d6af-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "fa1a9513-a1a2-4607-acc6-9c79e136d6af" (UID: "fa1a9513-a1a2-4607-acc6-9c79e136d6af"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:53:42.695485 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.695459 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa1a9513-a1a2-4607-acc6-9c79e136d6af-kube-api-access-gxjbg" (OuterVolumeSpecName: "kube-api-access-gxjbg") pod "fa1a9513-a1a2-4607-acc6-9c79e136d6af" (UID: "fa1a9513-a1a2-4607-acc6-9c79e136d6af"). InnerVolumeSpecName "kube-api-access-gxjbg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:53:42.695587 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.695491 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa1a9513-a1a2-4607-acc6-9c79e136d6af-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "fa1a9513-a1a2-4607-acc6-9c79e136d6af" (UID: "fa1a9513-a1a2-4607-acc6-9c79e136d6af"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:53:42.695587 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.695515 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa1a9513-a1a2-4607-acc6-9c79e136d6af-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "fa1a9513-a1a2-4607-acc6-9c79e136d6af" (UID: "fa1a9513-a1a2-4607-acc6-9c79e136d6af"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:53:42.794446 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.794408 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gxjbg\" (UniqueName: \"kubernetes.io/projected/fa1a9513-a1a2-4607-acc6-9c79e136d6af-kube-api-access-gxjbg\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:53:42.794446 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.794441 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/fa1a9513-a1a2-4607-acc6-9c79e136d6af-console-oauth-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:53:42.794446 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.794454 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/fa1a9513-a1a2-4607-acc6-9c79e136d6af-oauth-serving-cert\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:53:42.794683 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.794468 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/fa1a9513-a1a2-4607-acc6-9c79e136d6af-service-ca\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:53:42.794683 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.794482 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/fa1a9513-a1a2-4607-acc6-9c79e136d6af-console-serving-cert\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:53:42.794683 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:42.794495 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/fa1a9513-a1a2-4607-acc6-9c79e136d6af-console-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:53:43.396097 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:43.396075 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67c5797466-qh8fs_fa1a9513-a1a2-4607-acc6-9c79e136d6af/console/0.log" Apr 20 07:53:43.396469 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:43.396114 2572 generic.go:358] "Generic (PLEG): container finished" podID="fa1a9513-a1a2-4607-acc6-9c79e136d6af" containerID="3cf791d04957d1bdc5fd52e46c59abc7b3ee012fc8bf9b4fc3213a7d95a6f973" exitCode=2 Apr 20 07:53:43.396469 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:43.396164 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67c5797466-qh8fs" event={"ID":"fa1a9513-a1a2-4607-acc6-9c79e136d6af","Type":"ContainerDied","Data":"3cf791d04957d1bdc5fd52e46c59abc7b3ee012fc8bf9b4fc3213a7d95a6f973"} Apr 20 07:53:43.396469 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:43.396200 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67c5797466-qh8fs" event={"ID":"fa1a9513-a1a2-4607-acc6-9c79e136d6af","Type":"ContainerDied","Data":"98a6e9c66a9526e61257f63536f1dfa37c2287a35d4c2b3d2d87925b1505f298"} Apr 20 07:53:43.396469 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:43.396208 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67c5797466-qh8fs" Apr 20 07:53:43.396469 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:43.396217 2572 scope.go:117] "RemoveContainer" containerID="3cf791d04957d1bdc5fd52e46c59abc7b3ee012fc8bf9b4fc3213a7d95a6f973" Apr 20 07:53:43.404370 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:43.404306 2572 scope.go:117] "RemoveContainer" containerID="3cf791d04957d1bdc5fd52e46c59abc7b3ee012fc8bf9b4fc3213a7d95a6f973" Apr 20 07:53:43.404671 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:53:43.404632 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cf791d04957d1bdc5fd52e46c59abc7b3ee012fc8bf9b4fc3213a7d95a6f973\": container with ID starting with 3cf791d04957d1bdc5fd52e46c59abc7b3ee012fc8bf9b4fc3213a7d95a6f973 not found: ID does not exist" containerID="3cf791d04957d1bdc5fd52e46c59abc7b3ee012fc8bf9b4fc3213a7d95a6f973" Apr 20 07:53:43.404766 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:43.404668 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cf791d04957d1bdc5fd52e46c59abc7b3ee012fc8bf9b4fc3213a7d95a6f973"} err="failed to get container status \"3cf791d04957d1bdc5fd52e46c59abc7b3ee012fc8bf9b4fc3213a7d95a6f973\": rpc error: code = NotFound desc = could not find container \"3cf791d04957d1bdc5fd52e46c59abc7b3ee012fc8bf9b4fc3213a7d95a6f973\": container with ID starting with 3cf791d04957d1bdc5fd52e46c59abc7b3ee012fc8bf9b4fc3213a7d95a6f973 not found: ID does not exist" Apr 20 07:53:43.417213 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:43.417185 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67c5797466-qh8fs"] Apr 20 07:53:43.422540 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:43.422517 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-67c5797466-qh8fs"] Apr 20 07:53:44.400223 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:44.400184 2572 generic.go:358] "Generic (PLEG): container finished" podID="8c112e80-851a-4290-86a7-51a64594d25e" containerID="a79e289654a8fe51e547d7201f36da6e0d8d14a1f7719bc5b050a233caee6a3c" exitCode=0 Apr 20 07:53:44.400635 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:44.400260 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rjwff" event={"ID":"8c112e80-851a-4290-86a7-51a64594d25e","Type":"ContainerDied","Data":"a79e289654a8fe51e547d7201f36da6e0d8d14a1f7719bc5b050a233caee6a3c"} Apr 20 07:53:44.400635 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:44.400629 2572 scope.go:117] "RemoveContainer" containerID="a79e289654a8fe51e547d7201f36da6e0d8d14a1f7719bc5b050a233caee6a3c" Apr 20 07:53:44.646237 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:44.646202 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa1a9513-a1a2-4607-acc6-9c79e136d6af" path="/var/lib/kubelet/pods/fa1a9513-a1a2-4607-acc6-9c79e136d6af/volumes" Apr 20 07:53:45.405747 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:45.405707 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-rjwff" event={"ID":"8c112e80-851a-4290-86a7-51a64594d25e","Type":"ContainerStarted","Data":"d242ade9e636e9cb1bb6707cd485901beca2ca84046c31a857888bdf2c302570"} Apr 20 07:53:49.417480 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:49.417443 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="da90ef7d-a0bf-47dd-ab2b-79b951b2d24c" containerID="f7b3386510b2851347a0b497d722bdd24f72943d807bdcdd13e06472946227ab" exitCode=0 Apr 20 07:53:49.417898 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:49.417515 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f6qxs" event={"ID":"da90ef7d-a0bf-47dd-ab2b-79b951b2d24c","Type":"ContainerDied","Data":"f7b3386510b2851347a0b497d722bdd24f72943d807bdcdd13e06472946227ab"} Apr 20 07:53:49.417898 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:49.417844 2572 scope.go:117] "RemoveContainer" containerID="f7b3386510b2851347a0b497d722bdd24f72943d807bdcdd13e06472946227ab" Apr 20 07:53:50.421511 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:53:50.421476 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f6qxs" event={"ID":"da90ef7d-a0bf-47dd-ab2b-79b951b2d24c","Type":"ContainerStarted","Data":"69b8e5bff5cfa827caf8d80292266a7bae10d202b89095f2629907a8fcfd0d4d"} Apr 20 07:54:15.347542 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:15.347506 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 07:54:15.347959 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:15.347920 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="alertmanager" containerID="cri-o://304cbdd2e6745f6d925d3463ac47001bb44caf4b5e4325b8cc6ab25df580caeb" gracePeriod=120 Apr 20 07:54:15.348038 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:15.347999 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="kube-rbac-proxy-web" containerID="cri-o://1fce28013923bba5368922770eee73c952a54054eaa34b30187c0b929e3a85d9" gracePeriod=120 Apr 20 07:54:15.348216 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:15.347997 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="kube-rbac-proxy-metric" containerID="cri-o://854f62be27883756e23021ea714be4fd5de6a4409fa8498d56a0566837017557" gracePeriod=120 Apr 20 07:54:15.348216 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:15.348041 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="config-reloader" containerID="cri-o://a10cc27848af63adea29bfd0f5011191843b0137953e13a6057468e859c5afc1" gracePeriod=120 Apr 20 07:54:15.348343 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:15.348026 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="kube-rbac-proxy" containerID="cri-o://897bbb98baaf0b8e51fcc038fda0172d281ec3d0b39da90e7c081ac902fc4bc1" gracePeriod=120 Apr 20 07:54:15.348343 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:15.348051 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="prom-label-proxy" 
containerID="cri-o://7a6effb10feb6710056b920717a6bc23bd3ba3ddf5cdde64bd3cc1128213508e" gracePeriod=120 Apr 20 07:54:15.508810 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:15.508782 2572 generic.go:358] "Generic (PLEG): container finished" podID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerID="7a6effb10feb6710056b920717a6bc23bd3ba3ddf5cdde64bd3cc1128213508e" exitCode=0 Apr 20 07:54:15.508810 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:15.508806 2572 generic.go:358] "Generic (PLEG): container finished" podID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerID="854f62be27883756e23021ea714be4fd5de6a4409fa8498d56a0566837017557" exitCode=0 Apr 20 07:54:15.508810 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:15.508813 2572 generic.go:358] "Generic (PLEG): container finished" podID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerID="897bbb98baaf0b8e51fcc038fda0172d281ec3d0b39da90e7c081ac902fc4bc1" exitCode=0 Apr 20 07:54:15.509002 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:15.508818 2572 generic.go:358] "Generic (PLEG): container finished" podID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerID="a10cc27848af63adea29bfd0f5011191843b0137953e13a6057468e859c5afc1" exitCode=0 Apr 20 07:54:15.509002 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:15.508824 2572 generic.go:358] "Generic (PLEG): container finished" podID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerID="304cbdd2e6745f6d925d3463ac47001bb44caf4b5e4325b8cc6ab25df580caeb" exitCode=0 Apr 20 07:54:15.509002 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:15.508861 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e5b3271-5e8e-407c-8aff-da8f698938dd","Type":"ContainerDied","Data":"7a6effb10feb6710056b920717a6bc23bd3ba3ddf5cdde64bd3cc1128213508e"} Apr 20 07:54:15.509002 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:15.508894 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e5b3271-5e8e-407c-8aff-da8f698938dd","Type":"ContainerDied","Data":"854f62be27883756e23021ea714be4fd5de6a4409fa8498d56a0566837017557"} Apr 20 07:54:15.509002 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:15.508905 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e5b3271-5e8e-407c-8aff-da8f698938dd","Type":"ContainerDied","Data":"897bbb98baaf0b8e51fcc038fda0172d281ec3d0b39da90e7c081ac902fc4bc1"} Apr 20 07:54:15.509002 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:15.508914 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e5b3271-5e8e-407c-8aff-da8f698938dd","Type":"ContainerDied","Data":"a10cc27848af63adea29bfd0f5011191843b0137953e13a6057468e859c5afc1"} Apr 20 07:54:15.509002 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:15.508922 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e5b3271-5e8e-407c-8aff-da8f698938dd","Type":"ContainerDied","Data":"304cbdd2e6745f6d925d3463ac47001bb44caf4b5e4325b8cc6ab25df580caeb"} Apr 20 07:54:16.515912 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.515881 2572 generic.go:358] "Generic (PLEG): container finished" podID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerID="1fce28013923bba5368922770eee73c952a54054eaa34b30187c0b929e3a85d9" exitCode=0 Apr 20 07:54:16.516279 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.515983 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e5b3271-5e8e-407c-8aff-da8f698938dd","Type":"ContainerDied","Data":"1fce28013923bba5368922770eee73c952a54054eaa34b30187c0b929e3a85d9"} Apr 20 07:54:16.597323 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.597295 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:16.695982 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.695903 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbdhj\" (UniqueName: \"kubernetes.io/projected/8e5b3271-5e8e-407c-8aff-da8f698938dd-kube-api-access-jbdhj\") pod \"8e5b3271-5e8e-407c-8aff-da8f698938dd\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " Apr 20 07:54:16.695982 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.695948 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-config-volume\") pod \"8e5b3271-5e8e-407c-8aff-da8f698938dd\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " Apr 20 07:54:16.695982 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.695979 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-kube-rbac-proxy-metric\") pod \"8e5b3271-5e8e-407c-8aff-da8f698938dd\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " Apr 20 07:54:16.696283 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.696009 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-kube-rbac-proxy\") pod \"8e5b3271-5e8e-407c-8aff-da8f698938dd\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " Apr 20 07:54:16.696283 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.696037 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e5b3271-5e8e-407c-8aff-da8f698938dd-metrics-client-ca\") pod \"8e5b3271-5e8e-407c-8aff-da8f698938dd\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " Apr 20 07:54:16.696283 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.696065 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8e5b3271-5e8e-407c-8aff-da8f698938dd-alertmanager-main-db\") pod \"8e5b3271-5e8e-407c-8aff-da8f698938dd\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " Apr 20 07:54:16.696283 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.696097 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-web-config\") pod \"8e5b3271-5e8e-407c-8aff-da8f698938dd\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " Apr 20 07:54:16.696283 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.696129 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e5b3271-5e8e-407c-8aff-da8f698938dd-alertmanager-trusted-ca-bundle\") pod \"8e5b3271-5e8e-407c-8aff-da8f698938dd\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " Apr 20 07:54:16.696283 
ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.696175 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e5b3271-5e8e-407c-8aff-da8f698938dd-config-out\") pod \"8e5b3271-5e8e-407c-8aff-da8f698938dd\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " Apr 20 07:54:16.696283 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.696204 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-kube-rbac-proxy-web\") pod \"8e5b3271-5e8e-407c-8aff-da8f698938dd\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " Apr 20 07:54:16.696283 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.696236 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-cluster-tls-config\") pod \"8e5b3271-5e8e-407c-8aff-da8f698938dd\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " Apr 20 07:54:16.696647 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.696317 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e5b3271-5e8e-407c-8aff-da8f698938dd-tls-assets\") pod \"8e5b3271-5e8e-407c-8aff-da8f698938dd\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " Apr 20 07:54:16.696647 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.696348 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-main-tls\") pod \"8e5b3271-5e8e-407c-8aff-da8f698938dd\" (UID: \"8e5b3271-5e8e-407c-8aff-da8f698938dd\") " Apr 20 07:54:16.696647 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.696467 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5b3271-5e8e-407c-8aff-da8f698938dd-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "8e5b3271-5e8e-407c-8aff-da8f698938dd" (UID: "8e5b3271-5e8e-407c-8aff-da8f698938dd"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:54:16.696647 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.696634 2572 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8e5b3271-5e8e-407c-8aff-da8f698938dd-metrics-client-ca\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:16.696857 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.696672 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e5b3271-5e8e-407c-8aff-da8f698938dd-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "8e5b3271-5e8e-407c-8aff-da8f698938dd" (UID: "8e5b3271-5e8e-407c-8aff-da8f698938dd"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:54:16.699552 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.698730 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "8e5b3271-5e8e-407c-8aff-da8f698938dd" (UID: "8e5b3271-5e8e-407c-8aff-da8f698938dd"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:54:16.699552 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.698851 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e5b3271-5e8e-407c-8aff-da8f698938dd-kube-api-access-jbdhj" (OuterVolumeSpecName: "kube-api-access-jbdhj") pod "8e5b3271-5e8e-407c-8aff-da8f698938dd" (UID: "8e5b3271-5e8e-407c-8aff-da8f698938dd"). InnerVolumeSpecName "kube-api-access-jbdhj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:54:16.699552 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.699088 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "8e5b3271-5e8e-407c-8aff-da8f698938dd" (UID: "8e5b3271-5e8e-407c-8aff-da8f698938dd"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:54:16.699552 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.699125 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "8e5b3271-5e8e-407c-8aff-da8f698938dd" (UID: "8e5b3271-5e8e-407c-8aff-da8f698938dd"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:54:16.699552 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.699373 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-config-volume" (OuterVolumeSpecName: "config-volume") pod "8e5b3271-5e8e-407c-8aff-da8f698938dd" (UID: "8e5b3271-5e8e-407c-8aff-da8f698938dd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:54:16.699552 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.699471 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "8e5b3271-5e8e-407c-8aff-da8f698938dd" (UID: "8e5b3271-5e8e-407c-8aff-da8f698938dd"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:54:16.700004 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.699566 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e5b3271-5e8e-407c-8aff-da8f698938dd-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "8e5b3271-5e8e-407c-8aff-da8f698938dd" (UID: "8e5b3271-5e8e-407c-8aff-da8f698938dd"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:54:16.700066 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.700027 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e5b3271-5e8e-407c-8aff-da8f698938dd-config-out" (OuterVolumeSpecName: "config-out") pod "8e5b3271-5e8e-407c-8aff-da8f698938dd" (UID: "8e5b3271-5e8e-407c-8aff-da8f698938dd"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 07:54:16.700872 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.700843 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e5b3271-5e8e-407c-8aff-da8f698938dd-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8e5b3271-5e8e-407c-8aff-da8f698938dd" (UID: "8e5b3271-5e8e-407c-8aff-da8f698938dd"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:54:16.703347 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.703244 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "8e5b3271-5e8e-407c-8aff-da8f698938dd" (UID: "8e5b3271-5e8e-407c-8aff-da8f698938dd"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:54:16.710479 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.710451 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-web-config" (OuterVolumeSpecName: "web-config") pod "8e5b3271-5e8e-407c-8aff-da8f698938dd" (UID: "8e5b3271-5e8e-407c-8aff-da8f698938dd"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:54:16.797047 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.797011 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8e5b3271-5e8e-407c-8aff-da8f698938dd-config-out\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:16.797047 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.797042 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:16.797047 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.797053 2572 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-cluster-tls-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:16.797295 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.797063 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8e5b3271-5e8e-407c-8aff-da8f698938dd-tls-assets\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:16.797295 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.797072 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-main-tls\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:16.797295 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.797081 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jbdhj\" (UniqueName: \"kubernetes.io/projected/8e5b3271-5e8e-407c-8aff-da8f698938dd-kube-api-access-jbdhj\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:16.797295 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.797090 2572 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-config-volume\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:16.797295 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.797098 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:16.797295 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.797108 2572 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:16.797295 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.797117 2572 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/8e5b3271-5e8e-407c-8aff-da8f698938dd-alertmanager-main-db\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:16.797295 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.797125 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/8e5b3271-5e8e-407c-8aff-da8f698938dd-web-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:16.797295 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:16.797134 2572 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8e5b3271-5e8e-407c-8aff-da8f698938dd-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:17.522135 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.522101 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"8e5b3271-5e8e-407c-8aff-da8f698938dd","Type":"ContainerDied","Data":"bab2b3881c3cdd27300b954b688fdbee953ca00d5c795861cfaeed630abe25f0"} Apr 20 07:54:17.522539 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.522163 2572 scope.go:117] "RemoveContainer" containerID="7a6effb10feb6710056b920717a6bc23bd3ba3ddf5cdde64bd3cc1128213508e" Apr 20 07:54:17.522539 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.522167 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.529759 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.529738 2572 scope.go:117] "RemoveContainer" containerID="854f62be27883756e23021ea714be4fd5de6a4409fa8498d56a0566837017557" Apr 20 07:54:17.536669 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.536652 2572 scope.go:117] "RemoveContainer" containerID="897bbb98baaf0b8e51fcc038fda0172d281ec3d0b39da90e7c081ac902fc4bc1" Apr 20 07:54:17.542973 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.542957 2572 scope.go:117] "RemoveContainer" containerID="1fce28013923bba5368922770eee73c952a54054eaa34b30187c0b929e3a85d9" Apr 20 07:54:17.545271 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.545252 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 07:54:17.549362 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.549343 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 07:54:17.549526 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.549512 2572 scope.go:117] "RemoveContainer" containerID="a10cc27848af63adea29bfd0f5011191843b0137953e13a6057468e859c5afc1" Apr 20 07:54:17.555661 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.555643 2572 scope.go:117] "RemoveContainer" containerID="304cbdd2e6745f6d925d3463ac47001bb44caf4b5e4325b8cc6ab25df580caeb" Apr 20 07:54:17.561666 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.561648 2572 scope.go:117] "RemoveContainer" containerID="70e538e9f9f4b8f477ace9a6fdb89a6e88fe31faf633729c426a1232a13ae2a2" Apr 20 07:54:17.575861 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.575840 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 07:54:17.576169 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576133 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="init-config-reloader" Apr 20 07:54:17.576215 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576172 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="init-config-reloader" Apr 20 07:54:17.576215 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576183 2572 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="kube-rbac-proxy" Apr 20 07:54:17.576215 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576188 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="kube-rbac-proxy" Apr 20 07:54:17.576215 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576196 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="kube-rbac-proxy-web" Apr 20 07:54:17.576215 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576202 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="kube-rbac-proxy-web" Apr 20 07:54:17.576215 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576209 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fa1a9513-a1a2-4607-acc6-9c79e136d6af" containerName="console" Apr 20 07:54:17.576215 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576214 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa1a9513-a1a2-4607-acc6-9c79e136d6af" containerName="console" Apr 20 07:54:17.576430 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576224 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="config-reloader" Apr 20 07:54:17.576430 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576229 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="config-reloader" Apr 20 07:54:17.576430 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576236 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="kube-rbac-proxy-metric" Apr 20 07:54:17.576430 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576241 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="kube-rbac-proxy-metric" Apr 20 07:54:17.576430 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576247 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="alertmanager" Apr 20 07:54:17.576430 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576252 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="alertmanager" Apr 20 07:54:17.576430 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576260 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="prom-label-proxy" Apr 20 07:54:17.576430 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576265 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="prom-label-proxy" Apr 20 07:54:17.576430 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576307 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="alertmanager" Apr 20 07:54:17.576430 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576314 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="kube-rbac-proxy-web" Apr 20 07:54:17.576430 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576320 2572 
memory_manager.go:356] "RemoveStaleState removing state" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="kube-rbac-proxy-metric" Apr 20 07:54:17.576430 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576328 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="config-reloader" Apr 20 07:54:17.576430 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576334 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fa1a9513-a1a2-4607-acc6-9c79e136d6af" containerName="console" Apr 20 07:54:17.576430 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576341 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="kube-rbac-proxy" Apr 20 07:54:17.576430 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.576349 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" containerName="prom-label-proxy" Apr 20 07:54:17.581238 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.581220 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.583778 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.583725 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 07:54:17.583960 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.583945 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 07:54:17.584070 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.583962 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 07:54:17.584224 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.584209 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 07:54:17.584403 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.584387 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 07:54:17.584482 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.584411 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 07:54:17.584666 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.584649 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 07:54:17.584754 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.584738 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-szpk9\"" Apr 20 07:54:17.584912 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.584899 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 07:54:17.589580 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.589557 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 07:54:17.591563 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.591540 2572 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 07:54:17.705668 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.705577 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66717b20-de30-4421-80d5-ccd76ced1dc5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.705668 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.705628 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.705842 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.705708 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.705842 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.705744 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/66717b20-de30-4421-80d5-ccd76ced1dc5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.705842 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.705768 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/66717b20-de30-4421-80d5-ccd76ced1dc5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.705842 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.705798 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwg8w\" (UniqueName: \"kubernetes.io/projected/66717b20-de30-4421-80d5-ccd76ced1dc5-kube-api-access-fwg8w\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.705842 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.705819 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.705842 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.705839 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-web-config\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") 
" pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.706045 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.705878 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/66717b20-de30-4421-80d5-ccd76ced1dc5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.706045 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.705937 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.706045 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.705984 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-config-volume\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.706045 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.706006 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/66717b20-de30-4421-80d5-ccd76ced1dc5-config-out\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.706045 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.706028 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.806751 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.806711 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-config-volume\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.806751 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.806747 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/66717b20-de30-4421-80d5-ccd76ced1dc5-config-out\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.807009 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.806768 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.807009 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.806788 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66717b20-de30-4421-80d5-ccd76ced1dc5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.807009 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.806825 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.807009 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.806870 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.807009 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.806898 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/66717b20-de30-4421-80d5-ccd76ced1dc5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.807009 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.806931 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/66717b20-de30-4421-80d5-ccd76ced1dc5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.807009 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.806962 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fwg8w\" (UniqueName: \"kubernetes.io/projected/66717b20-de30-4421-80d5-ccd76ced1dc5-kube-api-access-fwg8w\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.807009 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.806988 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.807009 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.807013 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-web-config\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.807577 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.807036 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/66717b20-de30-4421-80d5-ccd76ced1dc5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: 
\"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.807577 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.807065 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.808634 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.807951 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/66717b20-de30-4421-80d5-ccd76ced1dc5-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.808634 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.808013 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66717b20-de30-4421-80d5-ccd76ced1dc5-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.808634 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.808594 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/66717b20-de30-4421-80d5-ccd76ced1dc5-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.810284 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.810259 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/66717b20-de30-4421-80d5-ccd76ced1dc5-tls-assets\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.810388 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.810276 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-web-config\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.810388 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.810371 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.810696 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.810650 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/66717b20-de30-4421-80d5-ccd76ced1dc5-config-out\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.810794 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.810749 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-config-volume\") pod 
\"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.810848 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.810796 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.810848 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.810834 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.811015 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.810999 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.811631 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.811615 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/66717b20-de30-4421-80d5-ccd76ced1dc5-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.816580 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.816561 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwg8w\" (UniqueName: \"kubernetes.io/projected/66717b20-de30-4421-80d5-ccd76ced1dc5-kube-api-access-fwg8w\") pod \"alertmanager-main-0\" (UID: \"66717b20-de30-4421-80d5-ccd76ced1dc5\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:17.891637 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:17.891598 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 07:54:18.014549 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:18.014517 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 07:54:18.017598 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:54:18.017575 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66717b20_de30_4421_80d5_ccd76ced1dc5.slice/crio-f80a037414aae7d2afa1d3a88b763ef7299b338b08f5460d31fcba3beab72d6f WatchSource:0}: Error finding container f80a037414aae7d2afa1d3a88b763ef7299b338b08f5460d31fcba3beab72d6f: Status 404 returned error can't find the container with id f80a037414aae7d2afa1d3a88b763ef7299b338b08f5460d31fcba3beab72d6f Apr 20 07:54:18.526929 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:18.526891 2572 generic.go:358] "Generic (PLEG): container finished" podID="66717b20-de30-4421-80d5-ccd76ced1dc5" containerID="e26278e5f561ad6e0d0dea34eb7cc683fbe25d193b675c12e594ec82e8e89ffe" exitCode=0 Apr 20 07:54:18.527333 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:18.526936 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"66717b20-de30-4421-80d5-ccd76ced1dc5","Type":"ContainerDied","Data":"e26278e5f561ad6e0d0dea34eb7cc683fbe25d193b675c12e594ec82e8e89ffe"} Apr 20 07:54:18.527333 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:18.526961 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"66717b20-de30-4421-80d5-ccd76ced1dc5","Type":"ContainerStarted","Data":"f80a037414aae7d2afa1d3a88b763ef7299b338b08f5460d31fcba3beab72d6f"} Apr 20 07:54:18.645754 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:18.645722 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e5b3271-5e8e-407c-8aff-da8f698938dd" path="/var/lib/kubelet/pods/8e5b3271-5e8e-407c-8aff-da8f698938dd/volumes" Apr 20 07:54:19.379088 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.379051 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-566cbb6c9c-bplzf"] Apr 20 07:54:19.382652 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.382621 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.385483 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.385450 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 20 07:54:19.385677 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.385449 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-mx5hq\"" Apr 20 07:54:19.385677 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.385451 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 20 07:54:19.385926 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.385903 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 20 07:54:19.386027 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.385989 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 20 07:54:19.386027 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.386009 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 20 07:54:19.390466 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.390428 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 20 07:54:19.395392 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.395370 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-566cbb6c9c-bplzf"] Apr 20 07:54:19.524324 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.524275 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7sdb\" (UniqueName: \"kubernetes.io/projected/11ff8e8f-b0d7-41a4-8371-85195f48d57f-kube-api-access-w7sdb\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.524512 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.524411 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11ff8e8f-b0d7-41a4-8371-85195f48d57f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.524512 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.524450 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/11ff8e8f-b0d7-41a4-8371-85195f48d57f-telemeter-client-tls\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.524512 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.524475 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/11ff8e8f-b0d7-41a4-8371-85195f48d57f-secret-telemeter-client\") pod 
\"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.524512 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.524495 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/11ff8e8f-b0d7-41a4-8371-85195f48d57f-federate-client-tls\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.524708 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.524581 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11ff8e8f-b0d7-41a4-8371-85195f48d57f-serving-certs-ca-bundle\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.524708 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.524674 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11ff8e8f-b0d7-41a4-8371-85195f48d57f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.524800 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.524704 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11ff8e8f-b0d7-41a4-8371-85195f48d57f-metrics-client-ca\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.533548 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.533514 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"66717b20-de30-4421-80d5-ccd76ced1dc5","Type":"ContainerStarted","Data":"24266898074a74733f69bfbbc721d04a0de3a71b717d28554c4cf88326e72ab5"} Apr 20 07:54:19.533996 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.533555 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"66717b20-de30-4421-80d5-ccd76ced1dc5","Type":"ContainerStarted","Data":"4d3c8448f9bb8434b45db3196e66fe6892a71787c1206fcde0d587e97f6e6541"} Apr 20 07:54:19.533996 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.533571 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"66717b20-de30-4421-80d5-ccd76ced1dc5","Type":"ContainerStarted","Data":"2807ab1ace2542c0cae60c3801194bc72340e834b112a15eda2fd55c0e37e723"} Apr 20 07:54:19.533996 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.533583 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"66717b20-de30-4421-80d5-ccd76ced1dc5","Type":"ContainerStarted","Data":"c653990eae6225ed591db26ff9fbdb3c7510b5350b399c82905acb41c2002dbc"} Apr 20 07:54:19.533996 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.533595 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"66717b20-de30-4421-80d5-ccd76ced1dc5","Type":"ContainerStarted","Data":"fbe64c3cbe2ba359c37ba39bad7097bfdd8b10cef20031d887d95293d6bc85cf"} Apr 20 07:54:19.533996 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.533608 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"66717b20-de30-4421-80d5-ccd76ced1dc5","Type":"ContainerStarted","Data":"6aa3925d41563a5e2eaaa6fd59c3cef428420e3e89887451f98c3617b81ee808"} Apr 20 07:54:19.559041 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.558994 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.55898072 podStartE2EDuration="2.55898072s" podCreationTimestamp="2026-04-20 07:54:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:54:19.55847397 +0000 UTC m=+243.472104014" watchObservedRunningTime="2026-04-20 07:54:19.55898072 +0000 UTC m=+243.472610760" Apr 20 07:54:19.625605 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.625569 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11ff8e8f-b0d7-41a4-8371-85195f48d57f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.625804 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.625618 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/11ff8e8f-b0d7-41a4-8371-85195f48d57f-telemeter-client-tls\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.625804 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.625666 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/11ff8e8f-b0d7-41a4-8371-85195f48d57f-secret-telemeter-client\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.625930 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.625831 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/11ff8e8f-b0d7-41a4-8371-85195f48d57f-federate-client-tls\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.625930 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.625879 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11ff8e8f-b0d7-41a4-8371-85195f48d57f-serving-certs-ca-bundle\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.626075 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.626055 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" 
(UniqueName: \"kubernetes.io/secret/11ff8e8f-b0d7-41a4-8371-85195f48d57f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.626458 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.626429 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11ff8e8f-b0d7-41a4-8371-85195f48d57f-metrics-client-ca\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.626591 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.626520 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7sdb\" (UniqueName: \"kubernetes.io/projected/11ff8e8f-b0d7-41a4-8371-85195f48d57f-kube-api-access-w7sdb\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.626793 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.626768 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11ff8e8f-b0d7-41a4-8371-85195f48d57f-serving-certs-ca-bundle\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.627374 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.627347 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11ff8e8f-b0d7-41a4-8371-85195f48d57f-metrics-client-ca\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.627645 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.627627 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11ff8e8f-b0d7-41a4-8371-85195f48d57f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.628447 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.628422 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/11ff8e8f-b0d7-41a4-8371-85195f48d57f-secret-telemeter-client\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.628568 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.628547 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/11ff8e8f-b0d7-41a4-8371-85195f48d57f-federate-client-tls\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.628628 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.628600 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: 
\"kubernetes.io/secret/11ff8e8f-b0d7-41a4-8371-85195f48d57f-telemeter-client-tls\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.628664 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.628622 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11ff8e8f-b0d7-41a4-8371-85195f48d57f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.633831 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.633779 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7sdb\" (UniqueName: \"kubernetes.io/projected/11ff8e8f-b0d7-41a4-8371-85195f48d57f-kube-api-access-w7sdb\") pod \"telemeter-client-566cbb6c9c-bplzf\" (UID: \"11ff8e8f-b0d7-41a4-8371-85195f48d57f\") " pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.695719 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.695480 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" Apr 20 07:54:19.840130 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:19.840099 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-566cbb6c9c-bplzf"] Apr 20 07:54:19.843323 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:54:19.843298 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11ff8e8f_b0d7_41a4_8371_85195f48d57f.slice/crio-ef6844c6d21651877af20fab76c3507d7d793ff226910c034581afda3db092b8 WatchSource:0}: Error finding container ef6844c6d21651877af20fab76c3507d7d793ff226910c034581afda3db092b8: Status 404 returned error can't find the container with id ef6844c6d21651877af20fab76c3507d7d793ff226910c034581afda3db092b8 Apr 20 07:54:20.538808 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:20.538766 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" event={"ID":"11ff8e8f-b0d7-41a4-8371-85195f48d57f","Type":"ContainerStarted","Data":"ef6844c6d21651877af20fab76c3507d7d793ff226910c034581afda3db092b8"} Apr 20 07:54:21.544228 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:21.544190 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" event={"ID":"11ff8e8f-b0d7-41a4-8371-85195f48d57f","Type":"ContainerStarted","Data":"e368f00a9752184915235a8014ce875d001bdce8c4b630a363040cbe0a64e95b"} Apr 20 07:54:22.548507 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:22.548469 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" event={"ID":"11ff8e8f-b0d7-41a4-8371-85195f48d57f","Type":"ContainerStarted","Data":"7f091d15e23a358d93ff23ba0cbc368407144621194fa5d55a2900469e9fc87b"} Apr 20 07:54:22.548507 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:22.548505 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" event={"ID":"11ff8e8f-b0d7-41a4-8371-85195f48d57f","Type":"ContainerStarted","Data":"9d2ad32c0ba2f8673d24dd8eee39f775009f2a97c6be707571f0d57fbd37679b"} Apr 20 07:54:22.571601 
ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:22.571544 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-566cbb6c9c-bplzf" podStartSLOduration=1.957248189 podStartE2EDuration="3.571526263s" podCreationTimestamp="2026-04-20 07:54:19 +0000 UTC" firstStartedPulling="2026-04-20 07:54:19.845223889 +0000 UTC m=+243.758853914" lastFinishedPulling="2026-04-20 07:54:21.459501965 +0000 UTC m=+245.373131988" observedRunningTime="2026-04-20 07:54:22.570643618 +0000 UTC m=+246.484273685" watchObservedRunningTime="2026-04-20 07:54:22.571526263 +0000 UTC m=+246.485156358" Apr 20 07:54:23.036103 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.035972 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-67d984d94d-rnqjh"] Apr 20 07:54:23.039506 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.039484 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.049567 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.049539 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67d984d94d-rnqjh"] Apr 20 07:54:23.159747 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.159711 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-trusted-ca-bundle\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.159747 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.159754 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8jv4\" (UniqueName: \"kubernetes.io/projected/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-kube-api-access-b8jv4\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.159983 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.159773 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-console-serving-cert\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.159983 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.159885 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-console-config\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.159983 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.159929 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-console-oauth-config\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.159983 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.159966 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-service-ca\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.160127 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.159996 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-oauth-serving-cert\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.261311 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.261272 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-trusted-ca-bundle\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.261311 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.261310 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8jv4\" (UniqueName: \"kubernetes.io/projected/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-kube-api-access-b8jv4\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.261569 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.261331 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-console-serving-cert\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.261569 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.261370 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-console-config\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.261569 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.261395 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-console-oauth-config\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.261569 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.261413 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-service-ca\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.261569 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.261447 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-oauth-serving-cert\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " 
pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.262252 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.262223 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-console-config\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.262448 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.262271 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-oauth-serving-cert\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.262448 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.262288 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-trusted-ca-bundle\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.263784 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.263747 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-service-ca\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.263917 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.263898 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-console-serving-cert\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.263982 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.263968 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-console-oauth-config\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.268817 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.268799 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8jv4\" (UniqueName: \"kubernetes.io/projected/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-kube-api-access-b8jv4\") pod \"console-67d984d94d-rnqjh\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.350279 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.350193 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:23.484874 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.484853 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-67d984d94d-rnqjh"] Apr 20 07:54:23.487629 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:54:23.487593 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59c24e78_d1f2_4b50_86f3_2eb9da0e755c.slice/crio-51928fe902ba4812645701c846d8fe7dd270f4a90a4ff2fee9a5cbdde4ead95e WatchSource:0}: Error finding container 51928fe902ba4812645701c846d8fe7dd270f4a90a4ff2fee9a5cbdde4ead95e: Status 404 returned error can't find the container with id 51928fe902ba4812645701c846d8fe7dd270f4a90a4ff2fee9a5cbdde4ead95e Apr 20 07:54:23.553204 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:23.553174 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67d984d94d-rnqjh" event={"ID":"59c24e78-d1f2-4b50-86f3-2eb9da0e755c","Type":"ContainerStarted","Data":"51928fe902ba4812645701c846d8fe7dd270f4a90a4ff2fee9a5cbdde4ead95e"} Apr 20 07:54:24.557500 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:24.557468 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67d984d94d-rnqjh" event={"ID":"59c24e78-d1f2-4b50-86f3-2eb9da0e755c","Type":"ContainerStarted","Data":"5927853e3efb297b2b7f8b24139a16b2e6dbbd34a0480ea77c58beca40b3d240"} Apr 20 07:54:24.574638 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:24.574592 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-67d984d94d-rnqjh" podStartSLOduration=1.574577051 podStartE2EDuration="1.574577051s" podCreationTimestamp="2026-04-20 07:54:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:54:24.574094186 +0000 UTC m=+248.487724253" watchObservedRunningTime="2026-04-20 07:54:24.574577051 +0000 UTC m=+248.488207093" Apr 20 07:54:28.411057 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:28.411016 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs\") pod \"network-metrics-daemon-brq5h\" (UID: \"07219834-44d6-42ab-9058-aed46274d1a8\") " pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:54:28.413317 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:28.413297 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/07219834-44d6-42ab-9058-aed46274d1a8-metrics-certs\") pod \"network-metrics-daemon-brq5h\" (UID: \"07219834-44d6-42ab-9058-aed46274d1a8\") " pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:54:28.645511 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:28.645478 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-2g69h\"" Apr 20 07:54:28.652948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:28.652927 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-brq5h" Apr 20 07:54:28.770807 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:28.770781 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-brq5h"] Apr 20 07:54:28.772998 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:54:28.772969 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07219834_44d6_42ab_9058_aed46274d1a8.slice/crio-cf3dd00bddf4e7a47426219aa2e1ebdd49a5c1ad55e468d03af59d573f39715c WatchSource:0}: Error finding container cf3dd00bddf4e7a47426219aa2e1ebdd49a5c1ad55e468d03af59d573f39715c: Status 404 returned error can't find the container with id cf3dd00bddf4e7a47426219aa2e1ebdd49a5c1ad55e468d03af59d573f39715c Apr 20 07:54:29.576701 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:29.576666 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-brq5h" event={"ID":"07219834-44d6-42ab-9058-aed46274d1a8","Type":"ContainerStarted","Data":"cf3dd00bddf4e7a47426219aa2e1ebdd49a5c1ad55e468d03af59d573f39715c"} Apr 20 07:54:30.581043 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:30.581004 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-brq5h" event={"ID":"07219834-44d6-42ab-9058-aed46274d1a8","Type":"ContainerStarted","Data":"85f0f5c3aedf264a9ed0491da12337f31940695bb570ac10979676d4d6ae00fb"} Apr 20 07:54:30.581478 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:30.581051 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-brq5h" event={"ID":"07219834-44d6-42ab-9058-aed46274d1a8","Type":"ContainerStarted","Data":"e5e10de5cae8e9afa9380b10634431076bc61c110f34676171235b1a78c8b558"} Apr 20 07:54:30.597213 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:30.597158 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-brq5h" podStartSLOduration=253.34907428 podStartE2EDuration="4m14.59712538s" podCreationTimestamp="2026-04-20 07:50:16 +0000 UTC" firstStartedPulling="2026-04-20 07:54:28.775237809 +0000 UTC m=+252.688867833" lastFinishedPulling="2026-04-20 07:54:30.023288894 +0000 UTC m=+253.936918933" observedRunningTime="2026-04-20 07:54:30.595289553 +0000 UTC m=+254.508919598" watchObservedRunningTime="2026-04-20 07:54:30.59712538 +0000 UTC m=+254.510755423" Apr 20 07:54:33.350532 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:33.350484 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:33.350532 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:33.350534 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:33.355237 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:33.355217 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:33.593258 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:33.593229 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:54:33.639891 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:33.639820 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7967d78859-69vhd"] Apr 20 07:54:58.662219 ip-10-0-133-161 
kubenswrapper[2572]: I0420 07:54:58.662160 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7967d78859-69vhd" podUID="b40bc6c0-48ba-464f-a54f-e8678037e82e" containerName="console" containerID="cri-o://8dda1745adef95605672154dfbe582a0d0183eecd04ffd2fa20d8f370895e9ce" gracePeriod=15 Apr 20 07:54:58.899022 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:58.898998 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7967d78859-69vhd_b40bc6c0-48ba-464f-a54f-e8678037e82e/console/0.log" Apr 20 07:54:58.899135 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:58.899057 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:54:58.977813 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:58.977726 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-service-ca\") pod \"b40bc6c0-48ba-464f-a54f-e8678037e82e\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " Apr 20 07:54:58.977813 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:58.977803 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbkxf\" (UniqueName: \"kubernetes.io/projected/b40bc6c0-48ba-464f-a54f-e8678037e82e-kube-api-access-lbkxf\") pod \"b40bc6c0-48ba-464f-a54f-e8678037e82e\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " Apr 20 07:54:58.978022 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:58.977828 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-oauth-serving-cert\") pod \"b40bc6c0-48ba-464f-a54f-e8678037e82e\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " Apr 20 07:54:58.978022 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:58.977874 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-trusted-ca-bundle\") pod \"b40bc6c0-48ba-464f-a54f-e8678037e82e\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " Apr 20 07:54:58.978022 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:58.977931 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-console-config\") pod \"b40bc6c0-48ba-464f-a54f-e8678037e82e\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " Apr 20 07:54:58.978022 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:58.977971 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b40bc6c0-48ba-464f-a54f-e8678037e82e-console-serving-cert\") pod \"b40bc6c0-48ba-464f-a54f-e8678037e82e\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " Apr 20 07:54:58.978022 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:58.978004 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b40bc6c0-48ba-464f-a54f-e8678037e82e-console-oauth-config\") pod \"b40bc6c0-48ba-464f-a54f-e8678037e82e\" (UID: \"b40bc6c0-48ba-464f-a54f-e8678037e82e\") " Apr 20 07:54:58.978300 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:58.978201 2572 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-service-ca" (OuterVolumeSpecName: "service-ca") pod "b40bc6c0-48ba-464f-a54f-e8678037e82e" (UID: "b40bc6c0-48ba-464f-a54f-e8678037e82e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:54:58.978386 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:58.978348 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b40bc6c0-48ba-464f-a54f-e8678037e82e" (UID: "b40bc6c0-48ba-464f-a54f-e8678037e82e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:54:58.978441 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:58.978381 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-console-config" (OuterVolumeSpecName: "console-config") pod "b40bc6c0-48ba-464f-a54f-e8678037e82e" (UID: "b40bc6c0-48ba-464f-a54f-e8678037e82e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:54:58.978441 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:58.978425 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-service-ca\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:58.978535 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:58.978446 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b40bc6c0-48ba-464f-a54f-e8678037e82e" (UID: "b40bc6c0-48ba-464f-a54f-e8678037e82e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:54:58.980035 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:58.980007 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b40bc6c0-48ba-464f-a54f-e8678037e82e-kube-api-access-lbkxf" (OuterVolumeSpecName: "kube-api-access-lbkxf") pod "b40bc6c0-48ba-464f-a54f-e8678037e82e" (UID: "b40bc6c0-48ba-464f-a54f-e8678037e82e"). InnerVolumeSpecName "kube-api-access-lbkxf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:54:58.980190 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:58.980096 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b40bc6c0-48ba-464f-a54f-e8678037e82e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b40bc6c0-48ba-464f-a54f-e8678037e82e" (UID: "b40bc6c0-48ba-464f-a54f-e8678037e82e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:54:58.980190 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:58.980123 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b40bc6c0-48ba-464f-a54f-e8678037e82e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b40bc6c0-48ba-464f-a54f-e8678037e82e" (UID: "b40bc6c0-48ba-464f-a54f-e8678037e82e"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:54:59.079294 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:59.079255 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lbkxf\" (UniqueName: \"kubernetes.io/projected/b40bc6c0-48ba-464f-a54f-e8678037e82e-kube-api-access-lbkxf\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:59.079294 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:59.079285 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-oauth-serving-cert\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:59.079294 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:59.079295 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-trusted-ca-bundle\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:59.079294 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:59.079304 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b40bc6c0-48ba-464f-a54f-e8678037e82e-console-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:59.079545 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:59.079314 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b40bc6c0-48ba-464f-a54f-e8678037e82e-console-serving-cert\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:59.079545 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:59.079322 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b40bc6c0-48ba-464f-a54f-e8678037e82e-console-oauth-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:54:59.664644 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:59.664617 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7967d78859-69vhd_b40bc6c0-48ba-464f-a54f-e8678037e82e/console/0.log" Apr 20 07:54:59.665197 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:59.664658 2572 generic.go:358] "Generic (PLEG): container finished" podID="b40bc6c0-48ba-464f-a54f-e8678037e82e" containerID="8dda1745adef95605672154dfbe582a0d0183eecd04ffd2fa20d8f370895e9ce" exitCode=2 Apr 20 07:54:59.665197 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:59.664714 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7967d78859-69vhd" event={"ID":"b40bc6c0-48ba-464f-a54f-e8678037e82e","Type":"ContainerDied","Data":"8dda1745adef95605672154dfbe582a0d0183eecd04ffd2fa20d8f370895e9ce"} Apr 20 07:54:59.665197 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:59.664736 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7967d78859-69vhd" event={"ID":"b40bc6c0-48ba-464f-a54f-e8678037e82e","Type":"ContainerDied","Data":"380a416aa49977e0917eafe625fa648d92ca2f224d03a3e4f2bfbf8577a82eae"} Apr 20 07:54:59.665197 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:59.664739 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7967d78859-69vhd" Apr 20 07:54:59.665197 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:59.664751 2572 scope.go:117] "RemoveContainer" containerID="8dda1745adef95605672154dfbe582a0d0183eecd04ffd2fa20d8f370895e9ce" Apr 20 07:54:59.673017 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:59.673000 2572 scope.go:117] "RemoveContainer" containerID="8dda1745adef95605672154dfbe582a0d0183eecd04ffd2fa20d8f370895e9ce" Apr 20 07:54:59.673317 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:54:59.673291 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dda1745adef95605672154dfbe582a0d0183eecd04ffd2fa20d8f370895e9ce\": container with ID starting with 8dda1745adef95605672154dfbe582a0d0183eecd04ffd2fa20d8f370895e9ce not found: ID does not exist" containerID="8dda1745adef95605672154dfbe582a0d0183eecd04ffd2fa20d8f370895e9ce" Apr 20 07:54:59.673419 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:59.673325 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dda1745adef95605672154dfbe582a0d0183eecd04ffd2fa20d8f370895e9ce"} err="failed to get container status \"8dda1745adef95605672154dfbe582a0d0183eecd04ffd2fa20d8f370895e9ce\": rpc error: code = NotFound desc = could not find container \"8dda1745adef95605672154dfbe582a0d0183eecd04ffd2fa20d8f370895e9ce\": container with ID starting with 8dda1745adef95605672154dfbe582a0d0183eecd04ffd2fa20d8f370895e9ce not found: ID does not exist" Apr 20 07:54:59.685371 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:59.685347 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7967d78859-69vhd"] Apr 20 07:54:59.689193 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:54:59.689173 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7967d78859-69vhd"] Apr 20 07:55:00.645264 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:00.645230 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b40bc6c0-48ba-464f-a54f-e8678037e82e" path="/var/lib/kubelet/pods/b40bc6c0-48ba-464f-a54f-e8678037e82e/volumes" Apr 20 07:55:16.551924 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:16.551896 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/1.log" Apr 20 07:55:16.552436 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:16.551979 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/1.log" Apr 20 07:55:16.557217 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:16.557197 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 07:55:16.557316 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:16.557234 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 07:55:16.562743 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:16.562722 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 07:55:43.626679 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.626622 2572 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-55b5d5c58d-k48ss"] Apr 20 07:55:43.629162 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.626956 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b40bc6c0-48ba-464f-a54f-e8678037e82e" containerName="console" Apr 20 07:55:43.629162 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.626966 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b40bc6c0-48ba-464f-a54f-e8678037e82e" containerName="console" Apr 20 07:55:43.629162 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.627017 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b40bc6c0-48ba-464f-a54f-e8678037e82e" containerName="console" Apr 20 07:55:43.630022 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.630001 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.637222 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.637195 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55b5d5c58d-k48ss"] Apr 20 07:55:43.746453 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.746417 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36381530-e8a0-4b66-87f4-c815e4685fbf-console-oauth-config\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.746651 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.746467 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdplr\" (UniqueName: \"kubernetes.io/projected/36381530-e8a0-4b66-87f4-c815e4685fbf-kube-api-access-sdplr\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.746651 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.746541 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-oauth-serving-cert\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.746651 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.746591 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36381530-e8a0-4b66-87f4-c815e4685fbf-console-serving-cert\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.746651 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.746629 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-service-ca\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.746816 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.746651 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-trusted-ca-bundle\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.746816 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.746705 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-console-config\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.847570 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.847540 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdplr\" (UniqueName: \"kubernetes.io/projected/36381530-e8a0-4b66-87f4-c815e4685fbf-kube-api-access-sdplr\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.847758 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.847586 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-oauth-serving-cert\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.847758 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.847617 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36381530-e8a0-4b66-87f4-c815e4685fbf-console-serving-cert\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.847758 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.847642 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-service-ca\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.847758 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.847659 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-trusted-ca-bundle\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.847758 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.847699 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-console-config\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.848100 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.847744 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36381530-e8a0-4b66-87f4-c815e4685fbf-console-oauth-config\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 
07:55:43.848504 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.848476 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-service-ca\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.848611 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.848476 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-console-config\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.848776 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.848752 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-trusted-ca-bundle\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.848849 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.848828 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-oauth-serving-cert\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.850277 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.850256 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36381530-e8a0-4b66-87f4-c815e4685fbf-console-serving-cert\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.850359 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.850282 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36381530-e8a0-4b66-87f4-c815e4685fbf-console-oauth-config\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.856550 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.856515 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdplr\" (UniqueName: \"kubernetes.io/projected/36381530-e8a0-4b66-87f4-c815e4685fbf-kube-api-access-sdplr\") pod \"console-55b5d5c58d-k48ss\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:43.943038 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:43.942938 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:44.061298 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:44.061269 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-55b5d5c58d-k48ss"] Apr 20 07:55:44.063734 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:55:44.063706 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36381530_e8a0_4b66_87f4_c815e4685fbf.slice/crio-0dce5dea4a8d61dd4bdbd1182b7a43e14b50e9df9968594488de48b9f75dfc64 WatchSource:0}: Error finding container 0dce5dea4a8d61dd4bdbd1182b7a43e14b50e9df9968594488de48b9f75dfc64: Status 404 returned error can't find the container with id 0dce5dea4a8d61dd4bdbd1182b7a43e14b50e9df9968594488de48b9f75dfc64 Apr 20 07:55:44.065510 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:44.065491 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 07:55:44.798588 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:44.798549 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55b5d5c58d-k48ss" event={"ID":"36381530-e8a0-4b66-87f4-c815e4685fbf","Type":"ContainerStarted","Data":"94678a55169bc555a0068f9fc6ec1c2a7744096a71e8e76b89e5a74c1a746b54"} Apr 20 07:55:44.798588 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:44.798584 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55b5d5c58d-k48ss" event={"ID":"36381530-e8a0-4b66-87f4-c815e4685fbf","Type":"ContainerStarted","Data":"0dce5dea4a8d61dd4bdbd1182b7a43e14b50e9df9968594488de48b9f75dfc64"} Apr 20 07:55:44.817025 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:44.816976 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-55b5d5c58d-k48ss" podStartSLOduration=1.816962204 podStartE2EDuration="1.816962204s" podCreationTimestamp="2026-04-20 07:55:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:55:44.815859528 +0000 UTC m=+328.729489592" watchObservedRunningTime="2026-04-20 07:55:44.816962204 +0000 UTC m=+328.730592246" Apr 20 07:55:53.943272 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:53.943177 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:53.943272 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:53.943236 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:53.947945 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:53.947921 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:54.832378 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:54.832351 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 07:55:54.877415 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:55:54.877378 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67d984d94d-rnqjh"] Apr 20 07:56:19.903888 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:19.903781 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-67d984d94d-rnqjh" podUID="59c24e78-d1f2-4b50-86f3-2eb9da0e755c" 
containerName="console" containerID="cri-o://5927853e3efb297b2b7f8b24139a16b2e6dbbd34a0480ea77c58beca40b3d240" gracePeriod=15 Apr 20 07:56:20.133324 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.133303 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67d984d94d-rnqjh_59c24e78-d1f2-4b50-86f3-2eb9da0e755c/console/0.log" Apr 20 07:56:20.133427 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.133361 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:56:20.160004 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.159919 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-trusted-ca-bundle\") pod \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " Apr 20 07:56:20.160004 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.159969 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8jv4\" (UniqueName: \"kubernetes.io/projected/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-kube-api-access-b8jv4\") pod \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " Apr 20 07:56:20.160249 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.160020 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-console-oauth-config\") pod \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " Apr 20 07:56:20.160249 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.160052 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-oauth-serving-cert\") pod \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " Apr 20 07:56:20.160249 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.160082 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-service-ca\") pod \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " Apr 20 07:56:20.160249 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.160107 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-console-serving-cert\") pod \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " Apr 20 07:56:20.160249 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.160170 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-console-config\") pod \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\" (UID: \"59c24e78-d1f2-4b50-86f3-2eb9da0e755c\") " Apr 20 07:56:20.160511 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.160351 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "59c24e78-d1f2-4b50-86f3-2eb9da0e755c" 
(UID: "59c24e78-d1f2-4b50-86f3-2eb9da0e755c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:56:20.160511 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.160421 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-trusted-ca-bundle\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:56:20.160618 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.160592 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "59c24e78-d1f2-4b50-86f3-2eb9da0e755c" (UID: "59c24e78-d1f2-4b50-86f3-2eb9da0e755c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:56:20.160677 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.160621 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-service-ca" (OuterVolumeSpecName: "service-ca") pod "59c24e78-d1f2-4b50-86f3-2eb9da0e755c" (UID: "59c24e78-d1f2-4b50-86f3-2eb9da0e755c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:56:20.160811 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.160785 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-console-config" (OuterVolumeSpecName: "console-config") pod "59c24e78-d1f2-4b50-86f3-2eb9da0e755c" (UID: "59c24e78-d1f2-4b50-86f3-2eb9da0e755c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:56:20.162300 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.162272 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "59c24e78-d1f2-4b50-86f3-2eb9da0e755c" (UID: "59c24e78-d1f2-4b50-86f3-2eb9da0e755c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:56:20.162434 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.162409 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-kube-api-access-b8jv4" (OuterVolumeSpecName: "kube-api-access-b8jv4") pod "59c24e78-d1f2-4b50-86f3-2eb9da0e755c" (UID: "59c24e78-d1f2-4b50-86f3-2eb9da0e755c"). InnerVolumeSpecName "kube-api-access-b8jv4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:56:20.162930 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.162913 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "59c24e78-d1f2-4b50-86f3-2eb9da0e755c" (UID: "59c24e78-d1f2-4b50-86f3-2eb9da0e755c"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:56:20.261271 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.261210 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-console-serving-cert\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:56:20.261271 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.261247 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-console-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:56:20.261271 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.261258 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b8jv4\" (UniqueName: \"kubernetes.io/projected/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-kube-api-access-b8jv4\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:56:20.261612 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.261289 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-console-oauth-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:56:20.261612 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.261299 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-oauth-serving-cert\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:56:20.261612 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.261309 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/59c24e78-d1f2-4b50-86f3-2eb9da0e755c-service-ca\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:56:20.907649 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.907575 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-67d984d94d-rnqjh_59c24e78-d1f2-4b50-86f3-2eb9da0e755c/console/0.log" Apr 20 07:56:20.907649 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.907614 2572 generic.go:358] "Generic (PLEG): container finished" podID="59c24e78-d1f2-4b50-86f3-2eb9da0e755c" containerID="5927853e3efb297b2b7f8b24139a16b2e6dbbd34a0480ea77c58beca40b3d240" exitCode=2 Apr 20 07:56:20.908061 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.907685 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67d984d94d-rnqjh" event={"ID":"59c24e78-d1f2-4b50-86f3-2eb9da0e755c","Type":"ContainerDied","Data":"5927853e3efb297b2b7f8b24139a16b2e6dbbd34a0480ea77c58beca40b3d240"} Apr 20 07:56:20.908061 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.907710 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-67d984d94d-rnqjh" event={"ID":"59c24e78-d1f2-4b50-86f3-2eb9da0e755c","Type":"ContainerDied","Data":"51928fe902ba4812645701c846d8fe7dd270f4a90a4ff2fee9a5cbdde4ead95e"} Apr 20 07:56:20.908061 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.907719 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-67d984d94d-rnqjh" Apr 20 07:56:20.908061 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.907725 2572 scope.go:117] "RemoveContainer" containerID="5927853e3efb297b2b7f8b24139a16b2e6dbbd34a0480ea77c58beca40b3d240" Apr 20 07:56:20.915626 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.915598 2572 scope.go:117] "RemoveContainer" containerID="5927853e3efb297b2b7f8b24139a16b2e6dbbd34a0480ea77c58beca40b3d240" Apr 20 07:56:20.915894 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:56:20.915871 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5927853e3efb297b2b7f8b24139a16b2e6dbbd34a0480ea77c58beca40b3d240\": container with ID starting with 5927853e3efb297b2b7f8b24139a16b2e6dbbd34a0480ea77c58beca40b3d240 not found: ID does not exist" containerID="5927853e3efb297b2b7f8b24139a16b2e6dbbd34a0480ea77c58beca40b3d240" Apr 20 07:56:20.915984 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.915901 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5927853e3efb297b2b7f8b24139a16b2e6dbbd34a0480ea77c58beca40b3d240"} err="failed to get container status \"5927853e3efb297b2b7f8b24139a16b2e6dbbd34a0480ea77c58beca40b3d240\": rpc error: code = NotFound desc = could not find container \"5927853e3efb297b2b7f8b24139a16b2e6dbbd34a0480ea77c58beca40b3d240\": container with ID starting with 5927853e3efb297b2b7f8b24139a16b2e6dbbd34a0480ea77c58beca40b3d240 not found: ID does not exist" Apr 20 07:56:20.925470 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.925442 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-67d984d94d-rnqjh"] Apr 20 07:56:20.927349 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:20.927327 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-67d984d94d-rnqjh"] Apr 20 07:56:22.646021 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:22.645991 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59c24e78-d1f2-4b50-86f3-2eb9da0e755c" path="/var/lib/kubelet/pods/59c24e78-d1f2-4b50-86f3-2eb9da0e755c/volumes" Apr 20 07:56:27.257349 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:27.257311 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-fxwgx"] Apr 20 07:56:27.257828 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:27.257639 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59c24e78-d1f2-4b50-86f3-2eb9da0e755c" containerName="console" Apr 20 07:56:27.257828 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:27.257650 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c24e78-d1f2-4b50-86f3-2eb9da0e755c" containerName="console" Apr 20 07:56:27.257828 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:27.257705 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="59c24e78-d1f2-4b50-86f3-2eb9da0e755c" containerName="console" Apr 20 07:56:27.261878 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:27.261861 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fxwgx" Apr 20 07:56:27.264626 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:27.264603 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 07:56:27.266911 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:27.266891 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fxwgx"] Apr 20 07:56:27.319717 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:27.319679 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/723bdbde-5646-4818-b4d4-06690f364a5a-dbus\") pod \"global-pull-secret-syncer-fxwgx\" (UID: \"723bdbde-5646-4818-b4d4-06690f364a5a\") " pod="kube-system/global-pull-secret-syncer-fxwgx" Apr 20 07:56:27.319889 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:27.319745 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/723bdbde-5646-4818-b4d4-06690f364a5a-original-pull-secret\") pod \"global-pull-secret-syncer-fxwgx\" (UID: \"723bdbde-5646-4818-b4d4-06690f364a5a\") " pod="kube-system/global-pull-secret-syncer-fxwgx" Apr 20 07:56:27.319889 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:27.319768 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/723bdbde-5646-4818-b4d4-06690f364a5a-kubelet-config\") pod \"global-pull-secret-syncer-fxwgx\" (UID: \"723bdbde-5646-4818-b4d4-06690f364a5a\") " pod="kube-system/global-pull-secret-syncer-fxwgx" Apr 20 07:56:27.421107 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:27.421074 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/723bdbde-5646-4818-b4d4-06690f364a5a-original-pull-secret\") pod \"global-pull-secret-syncer-fxwgx\" (UID: \"723bdbde-5646-4818-b4d4-06690f364a5a\") " pod="kube-system/global-pull-secret-syncer-fxwgx" Apr 20 07:56:27.421295 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:27.421115 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/723bdbde-5646-4818-b4d4-06690f364a5a-kubelet-config\") pod \"global-pull-secret-syncer-fxwgx\" (UID: \"723bdbde-5646-4818-b4d4-06690f364a5a\") " pod="kube-system/global-pull-secret-syncer-fxwgx" Apr 20 07:56:27.421295 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:27.421194 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/723bdbde-5646-4818-b4d4-06690f364a5a-dbus\") pod \"global-pull-secret-syncer-fxwgx\" (UID: \"723bdbde-5646-4818-b4d4-06690f364a5a\") " pod="kube-system/global-pull-secret-syncer-fxwgx" Apr 20 07:56:27.421412 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:27.421293 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/723bdbde-5646-4818-b4d4-06690f364a5a-kubelet-config\") pod \"global-pull-secret-syncer-fxwgx\" (UID: \"723bdbde-5646-4818-b4d4-06690f364a5a\") " pod="kube-system/global-pull-secret-syncer-fxwgx" Apr 20 07:56:27.421412 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:27.421361 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/723bdbde-5646-4818-b4d4-06690f364a5a-dbus\") pod \"global-pull-secret-syncer-fxwgx\" (UID: \"723bdbde-5646-4818-b4d4-06690f364a5a\") " pod="kube-system/global-pull-secret-syncer-fxwgx" Apr 20 07:56:27.424003 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:27.423976 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/723bdbde-5646-4818-b4d4-06690f364a5a-original-pull-secret\") pod \"global-pull-secret-syncer-fxwgx\" (UID: \"723bdbde-5646-4818-b4d4-06690f364a5a\") " pod="kube-system/global-pull-secret-syncer-fxwgx" Apr 20 07:56:27.572098 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:27.572008 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fxwgx" Apr 20 07:56:27.689512 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:27.689377 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fxwgx"] Apr 20 07:56:27.694613 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:56:27.694582 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod723bdbde_5646_4818_b4d4_06690f364a5a.slice/crio-8f16ff38bf6b98a93e2cea281e1545f812b1f7723ea5eddf2b5178eed68d6bae WatchSource:0}: Error finding container 8f16ff38bf6b98a93e2cea281e1545f812b1f7723ea5eddf2b5178eed68d6bae: Status 404 returned error can't find the container with id 8f16ff38bf6b98a93e2cea281e1545f812b1f7723ea5eddf2b5178eed68d6bae Apr 20 07:56:27.929932 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:27.929845 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fxwgx" event={"ID":"723bdbde-5646-4818-b4d4-06690f364a5a","Type":"ContainerStarted","Data":"8f16ff38bf6b98a93e2cea281e1545f812b1f7723ea5eddf2b5178eed68d6bae"} Apr 20 07:56:31.942919 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:31.942828 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fxwgx" event={"ID":"723bdbde-5646-4818-b4d4-06690f364a5a","Type":"ContainerStarted","Data":"23e21e36120e12bad7a0d09d4a59d49433c4d7dd56a6f04a87c9f69d596cc62b"} Apr 20 07:56:31.957100 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:56:31.957055 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-fxwgx" podStartSLOduration=1.095084438 podStartE2EDuration="4.957040436s" podCreationTimestamp="2026-04-20 07:56:27 +0000 UTC" firstStartedPulling="2026-04-20 07:56:27.696171538 +0000 UTC m=+371.609801559" lastFinishedPulling="2026-04-20 07:56:31.558127536 +0000 UTC m=+375.471757557" observedRunningTime="2026-04-20 07:56:31.956501411 +0000 UTC m=+375.870131455" watchObservedRunningTime="2026-04-20 07:56:31.957040436 +0000 UTC m=+375.870670485" Apr 20 07:57:22.303685 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:22.303650 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-wmkk6"] Apr 20 07:57:22.307281 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:22.307263 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-wmkk6" Apr 20 07:57:22.310075 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:22.310051 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 07:57:22.311364 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:22.311340 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-dm2pn\"" Apr 20 07:57:22.311364 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:22.311347 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 07:57:22.316856 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:22.316835 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-wmkk6"] Apr 20 07:57:22.377366 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:22.377334 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b582b281-cb27-421f-be72-35e9fe58e8a9-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-wmkk6\" (UID: \"b582b281-cb27-421f-be72-35e9fe58e8a9\") " pod="cert-manager/cert-manager-webhook-597b96b99b-wmkk6" Apr 20 07:57:22.377544 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:22.377394 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdhzp\" (UniqueName: \"kubernetes.io/projected/b582b281-cb27-421f-be72-35e9fe58e8a9-kube-api-access-qdhzp\") pod \"cert-manager-webhook-597b96b99b-wmkk6\" (UID: \"b582b281-cb27-421f-be72-35e9fe58e8a9\") " pod="cert-manager/cert-manager-webhook-597b96b99b-wmkk6" Apr 20 07:57:22.478131 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:22.478092 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b582b281-cb27-421f-be72-35e9fe58e8a9-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-wmkk6\" (UID: \"b582b281-cb27-421f-be72-35e9fe58e8a9\") " pod="cert-manager/cert-manager-webhook-597b96b99b-wmkk6" Apr 20 07:57:22.478332 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:22.478184 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qdhzp\" (UniqueName: \"kubernetes.io/projected/b582b281-cb27-421f-be72-35e9fe58e8a9-kube-api-access-qdhzp\") pod \"cert-manager-webhook-597b96b99b-wmkk6\" (UID: \"b582b281-cb27-421f-be72-35e9fe58e8a9\") " pod="cert-manager/cert-manager-webhook-597b96b99b-wmkk6" Apr 20 07:57:22.487261 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:22.487237 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b582b281-cb27-421f-be72-35e9fe58e8a9-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-wmkk6\" (UID: \"b582b281-cb27-421f-be72-35e9fe58e8a9\") " pod="cert-manager/cert-manager-webhook-597b96b99b-wmkk6" Apr 20 07:57:22.487389 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:22.487303 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdhzp\" (UniqueName: \"kubernetes.io/projected/b582b281-cb27-421f-be72-35e9fe58e8a9-kube-api-access-qdhzp\") pod \"cert-manager-webhook-597b96b99b-wmkk6\" (UID: \"b582b281-cb27-421f-be72-35e9fe58e8a9\") " pod="cert-manager/cert-manager-webhook-597b96b99b-wmkk6" Apr 20 
07:57:22.624962 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:22.624877 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-wmkk6" Apr 20 07:57:22.750120 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:22.750091 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-wmkk6"] Apr 20 07:57:22.752977 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:57:22.752949 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb582b281_cb27_421f_be72_35e9fe58e8a9.slice/crio-af28650bf89c40c3e49e7f909b1a5caddb5e8b57572fe53d30e432747858d3ca WatchSource:0}: Error finding container af28650bf89c40c3e49e7f909b1a5caddb5e8b57572fe53d30e432747858d3ca: Status 404 returned error can't find the container with id af28650bf89c40c3e49e7f909b1a5caddb5e8b57572fe53d30e432747858d3ca Apr 20 07:57:23.100276 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:23.100235 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-wmkk6" event={"ID":"b582b281-cb27-421f-be72-35e9fe58e8a9","Type":"ContainerStarted","Data":"af28650bf89c40c3e49e7f909b1a5caddb5e8b57572fe53d30e432747858d3ca"} Apr 20 07:57:26.112106 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:26.112020 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-wmkk6" event={"ID":"b582b281-cb27-421f-be72-35e9fe58e8a9","Type":"ContainerStarted","Data":"5bb624d5864619facf47897a382675b3b4f548ba534b12ccac43dd98be4a4fa8"} Apr 20 07:57:26.112106 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:26.112078 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-wmkk6" Apr 20 07:57:26.127817 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:26.127771 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-wmkk6" podStartSLOduration=1.108375069 podStartE2EDuration="4.127757451s" podCreationTimestamp="2026-04-20 07:57:22 +0000 UTC" firstStartedPulling="2026-04-20 07:57:22.755024955 +0000 UTC m=+426.668654980" lastFinishedPulling="2026-04-20 07:57:25.774407329 +0000 UTC m=+429.688037362" observedRunningTime="2026-04-20 07:57:26.127381821 +0000 UTC m=+430.041011864" watchObservedRunningTime="2026-04-20 07:57:26.127757451 +0000 UTC m=+430.041387494" Apr 20 07:57:32.118031 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:32.117959 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-wmkk6" Apr 20 07:57:53.508944 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:53.508899 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-687c889b9-kgmn5"] Apr 20 07:57:53.512092 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:53.512071 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-kgmn5" Apr 20 07:57:53.514850 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:53.514824 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 07:57:53.514991 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:53.514887 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 07:57:53.514991 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:53.514934 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 07:57:53.515200 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:53.515184 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-9x2zw\"" Apr 20 07:57:53.515273 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:53.515208 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 07:57:53.524375 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:53.524355 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-687c889b9-kgmn5"] Apr 20 07:57:53.643895 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:53.643857 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0972f1b7-083d-4e02-bbba-f354c5c4e05f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-687c889b9-kgmn5\" (UID: \"0972f1b7-083d-4e02-bbba-f354c5c4e05f\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-kgmn5" Apr 20 07:57:53.644073 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:53.643920 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6ss6\" (UniqueName: \"kubernetes.io/projected/0972f1b7-083d-4e02-bbba-f354c5c4e05f-kube-api-access-j6ss6\") pod \"opendatahub-operator-controller-manager-687c889b9-kgmn5\" (UID: \"0972f1b7-083d-4e02-bbba-f354c5c4e05f\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-kgmn5" Apr 20 07:57:53.644073 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:53.643956 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0972f1b7-083d-4e02-bbba-f354c5c4e05f-webhook-cert\") pod \"opendatahub-operator-controller-manager-687c889b9-kgmn5\" (UID: \"0972f1b7-083d-4e02-bbba-f354c5c4e05f\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-kgmn5" Apr 20 07:57:53.745248 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:53.745206 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0972f1b7-083d-4e02-bbba-f354c5c4e05f-webhook-cert\") pod \"opendatahub-operator-controller-manager-687c889b9-kgmn5\" (UID: \"0972f1b7-083d-4e02-bbba-f354c5c4e05f\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-kgmn5" Apr 20 07:57:53.745421 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:53.745269 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/0972f1b7-083d-4e02-bbba-f354c5c4e05f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-687c889b9-kgmn5\" (UID: \"0972f1b7-083d-4e02-bbba-f354c5c4e05f\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-kgmn5" Apr 20 07:57:53.745421 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:53.745324 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6ss6\" (UniqueName: \"kubernetes.io/projected/0972f1b7-083d-4e02-bbba-f354c5c4e05f-kube-api-access-j6ss6\") pod \"opendatahub-operator-controller-manager-687c889b9-kgmn5\" (UID: \"0972f1b7-083d-4e02-bbba-f354c5c4e05f\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-kgmn5" Apr 20 07:57:53.747736 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:53.747701 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0972f1b7-083d-4e02-bbba-f354c5c4e05f-webhook-cert\") pod \"opendatahub-operator-controller-manager-687c889b9-kgmn5\" (UID: \"0972f1b7-083d-4e02-bbba-f354c5c4e05f\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-kgmn5" Apr 20 07:57:53.747736 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:53.747717 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0972f1b7-083d-4e02-bbba-f354c5c4e05f-apiservice-cert\") pod \"opendatahub-operator-controller-manager-687c889b9-kgmn5\" (UID: \"0972f1b7-083d-4e02-bbba-f354c5c4e05f\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-kgmn5" Apr 20 07:57:53.756197 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:53.756173 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6ss6\" (UniqueName: \"kubernetes.io/projected/0972f1b7-083d-4e02-bbba-f354c5c4e05f-kube-api-access-j6ss6\") pod \"opendatahub-operator-controller-manager-687c889b9-kgmn5\" (UID: \"0972f1b7-083d-4e02-bbba-f354c5c4e05f\") " pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-kgmn5" Apr 20 07:57:53.822853 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:53.822769 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-kgmn5" Apr 20 07:57:53.958312 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:53.958281 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-687c889b9-kgmn5"] Apr 20 07:57:53.962448 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:57:53.962419 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0972f1b7_083d_4e02_bbba_f354c5c4e05f.slice/crio-55fcfff14e1c67ef0eece916f07657036e4ae4f7fb115c6c89e000f79e3f7f21 WatchSource:0}: Error finding container 55fcfff14e1c67ef0eece916f07657036e4ae4f7fb115c6c89e000f79e3f7f21: Status 404 returned error can't find the container with id 55fcfff14e1c67ef0eece916f07657036e4ae4f7fb115c6c89e000f79e3f7f21 Apr 20 07:57:54.200973 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:54.200888 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-kgmn5" event={"ID":"0972f1b7-083d-4e02-bbba-f354c5c4e05f","Type":"ContainerStarted","Data":"55fcfff14e1c67ef0eece916f07657036e4ae4f7fb115c6c89e000f79e3f7f21"} Apr 20 07:57:57.214045 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:57.214013 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-kgmn5" event={"ID":"0972f1b7-083d-4e02-bbba-f354c5c4e05f","Type":"ContainerStarted","Data":"54145ec2a7865bc23678c7c70fc2c95f52286a5f09fa1368e8ca03907d47a60b"} Apr 20 07:57:57.214551 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:57.214077 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-kgmn5" Apr 20 07:57:57.237817 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:57:57.237769 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-kgmn5" podStartSLOduration=1.85009251 podStartE2EDuration="4.237753748s" podCreationTimestamp="2026-04-20 07:57:53 +0000 UTC" firstStartedPulling="2026-04-20 07:57:53.964224892 +0000 UTC m=+457.877854913" lastFinishedPulling="2026-04-20 07:57:56.351886126 +0000 UTC m=+460.265516151" observedRunningTime="2026-04-20 07:57:57.235888475 +0000 UTC m=+461.149518518" watchObservedRunningTime="2026-04-20 07:57:57.237753748 +0000 UTC m=+461.151383790" Apr 20 07:58:08.220307 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:08.220272 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-687c889b9-kgmn5" Apr 20 07:58:08.850251 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:08.850217 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb"] Apr 20 07:58:08.853317 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:08.853301 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb" Apr 20 07:58:08.856101 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:08.856077 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 07:58:08.857266 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:08.857246 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 07:58:08.857266 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:08.857257 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 07:58:08.857449 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:08.857287 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 07:58:08.857523 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:08.857443 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:58:08.857595 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:08.857577 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-rd9xb\"" Apr 20 07:58:08.863403 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:08.863383 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb"] Apr 20 07:58:08.973080 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:08.973044 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a75ba04-f150-4f98-9d3e-b2f056ad7cad-cert\") pod \"lws-controller-manager-56d8f7c9b7-2vgfb\" (UID: \"5a75ba04-f150-4f98-9d3e-b2f056ad7cad\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb" Apr 20 07:58:08.973080 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:08.973090 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5a75ba04-f150-4f98-9d3e-b2f056ad7cad-metrics-cert\") pod \"lws-controller-manager-56d8f7c9b7-2vgfb\" (UID: \"5a75ba04-f150-4f98-9d3e-b2f056ad7cad\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb" Apr 20 07:58:08.973320 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:08.973135 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5a75ba04-f150-4f98-9d3e-b2f056ad7cad-manager-config\") pod \"lws-controller-manager-56d8f7c9b7-2vgfb\" (UID: \"5a75ba04-f150-4f98-9d3e-b2f056ad7cad\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb" Apr 20 07:58:08.973320 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:08.973235 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqbck\" (UniqueName: \"kubernetes.io/projected/5a75ba04-f150-4f98-9d3e-b2f056ad7cad-kube-api-access-lqbck\") pod \"lws-controller-manager-56d8f7c9b7-2vgfb\" (UID: \"5a75ba04-f150-4f98-9d3e-b2f056ad7cad\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb" Apr 20 07:58:09.074556 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:09.074518 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5a75ba04-f150-4f98-9d3e-b2f056ad7cad-metrics-cert\") pod \"lws-controller-manager-56d8f7c9b7-2vgfb\" (UID: \"5a75ba04-f150-4f98-9d3e-b2f056ad7cad\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb" Apr 20 07:58:09.074705 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:09.074569 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5a75ba04-f150-4f98-9d3e-b2f056ad7cad-manager-config\") pod \"lws-controller-manager-56d8f7c9b7-2vgfb\" (UID: \"5a75ba04-f150-4f98-9d3e-b2f056ad7cad\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb" Apr 20 07:58:09.074705 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:09.074601 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqbck\" (UniqueName: \"kubernetes.io/projected/5a75ba04-f150-4f98-9d3e-b2f056ad7cad-kube-api-access-lqbck\") pod \"lws-controller-manager-56d8f7c9b7-2vgfb\" (UID: \"5a75ba04-f150-4f98-9d3e-b2f056ad7cad\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb" Apr 20 07:58:09.074705 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:09.074645 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a75ba04-f150-4f98-9d3e-b2f056ad7cad-cert\") pod \"lws-controller-manager-56d8f7c9b7-2vgfb\" (UID: \"5a75ba04-f150-4f98-9d3e-b2f056ad7cad\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb" Apr 20 07:58:09.075349 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:09.075324 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5a75ba04-f150-4f98-9d3e-b2f056ad7cad-manager-config\") pod \"lws-controller-manager-56d8f7c9b7-2vgfb\" (UID: \"5a75ba04-f150-4f98-9d3e-b2f056ad7cad\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb" Apr 20 07:58:09.077058 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:09.077030 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5a75ba04-f150-4f98-9d3e-b2f056ad7cad-cert\") pod \"lws-controller-manager-56d8f7c9b7-2vgfb\" (UID: \"5a75ba04-f150-4f98-9d3e-b2f056ad7cad\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb" Apr 20 07:58:09.077201 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:09.077061 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5a75ba04-f150-4f98-9d3e-b2f056ad7cad-metrics-cert\") pod \"lws-controller-manager-56d8f7c9b7-2vgfb\" (UID: \"5a75ba04-f150-4f98-9d3e-b2f056ad7cad\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb" Apr 20 07:58:09.087554 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:09.087532 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqbck\" (UniqueName: \"kubernetes.io/projected/5a75ba04-f150-4f98-9d3e-b2f056ad7cad-kube-api-access-lqbck\") pod \"lws-controller-manager-56d8f7c9b7-2vgfb\" (UID: \"5a75ba04-f150-4f98-9d3e-b2f056ad7cad\") " pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb" Apr 20 07:58:09.163838 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:09.163763 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb" Apr 20 07:58:09.298436 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:09.298404 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb"] Apr 20 07:58:09.301885 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:58:09.301861 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a75ba04_f150_4f98_9d3e_b2f056ad7cad.slice/crio-6ad9b06dc15ef2b142823876f062baab64fd4dda7dabd00c6f06599957cfe0a0 WatchSource:0}: Error finding container 6ad9b06dc15ef2b142823876f062baab64fd4dda7dabd00c6f06599957cfe0a0: Status 404 returned error can't find the container with id 6ad9b06dc15ef2b142823876f062baab64fd4dda7dabd00c6f06599957cfe0a0 Apr 20 07:58:10.264220 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:10.264183 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb" event={"ID":"5a75ba04-f150-4f98-9d3e-b2f056ad7cad","Type":"ContainerStarted","Data":"6ad9b06dc15ef2b142823876f062baab64fd4dda7dabd00c6f06599957cfe0a0"} Apr 20 07:58:14.279110 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:14.279071 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb" event={"ID":"5a75ba04-f150-4f98-9d3e-b2f056ad7cad","Type":"ContainerStarted","Data":"2a58ae6a70d400cdd62e8a9b814978432cffc5350972e9f59a8cb5b0ffff38eb"} Apr 20 07:58:14.279622 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:14.279194 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb" Apr 20 07:58:14.296611 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:14.296564 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb" podStartSLOduration=2.057536136 podStartE2EDuration="6.29655097s" podCreationTimestamp="2026-04-20 07:58:08 +0000 UTC" firstStartedPulling="2026-04-20 07:58:09.303619799 +0000 UTC m=+473.217249820" lastFinishedPulling="2026-04-20 07:58:13.542634632 +0000 UTC m=+477.456264654" observedRunningTime="2026-04-20 07:58:14.294131874 +0000 UTC m=+478.207761916" watchObservedRunningTime="2026-04-20 07:58:14.29655097 +0000 UTC m=+478.210181094" Apr 20 07:58:25.285813 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:25.285776 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-56d8f7c9b7-2vgfb" Apr 20 07:58:45.092673 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.092631 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f"] Apr 20 07:58:45.096198 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.096175 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.098880 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.098858 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 07:58:45.099001 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.098858 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-k4jtp\"" Apr 20 07:58:45.105123 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.105098 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f"] Apr 20 07:58:45.181922 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.181884 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/806e8845-f708-4670-8e6f-195d03dd6803-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.182079 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.181929 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/806e8845-f708-4670-8e6f-195d03dd6803-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.182079 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.181956 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/806e8845-f708-4670-8e6f-195d03dd6803-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.182079 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.182003 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/806e8845-f708-4670-8e6f-195d03dd6803-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.182079 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.182023 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvlcj\" (UniqueName: \"kubernetes.io/projected/806e8845-f708-4670-8e6f-195d03dd6803-kube-api-access-jvlcj\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.182079 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.182052 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/806e8845-f708-4670-8e6f-195d03dd6803-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.182079 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.182073 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/806e8845-f708-4670-8e6f-195d03dd6803-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.182317 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.182090 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/806e8845-f708-4670-8e6f-195d03dd6803-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.182317 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.182169 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/806e8845-f708-4670-8e6f-195d03dd6803-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.283597 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.283562 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/806e8845-f708-4670-8e6f-195d03dd6803-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.283597 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.283603 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvlcj\" (UniqueName: \"kubernetes.io/projected/806e8845-f708-4670-8e6f-195d03dd6803-kube-api-access-jvlcj\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.283792 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.283629 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/806e8845-f708-4670-8e6f-195d03dd6803-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.283792 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.283758 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/806e8845-f708-4670-8e6f-195d03dd6803-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.283899 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.283808 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/806e8845-f708-4670-8e6f-195d03dd6803-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.283899 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.283860 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/806e8845-f708-4670-8e6f-195d03dd6803-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.284029 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.283949 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/806e8845-f708-4670-8e6f-195d03dd6803-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.284029 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.283943 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/806e8845-f708-4670-8e6f-195d03dd6803-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.284164 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.284028 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/806e8845-f708-4670-8e6f-195d03dd6803-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.284164 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.284057 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/806e8845-f708-4670-8e6f-195d03dd6803-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.284164 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.284107 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/806e8845-f708-4670-8e6f-195d03dd6803-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: 
\"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.284326 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.284252 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/806e8845-f708-4670-8e6f-195d03dd6803-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.284453 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.284426 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/806e8845-f708-4670-8e6f-195d03dd6803-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.284707 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.284681 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/806e8845-f708-4670-8e6f-195d03dd6803-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.286623 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.286596 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/806e8845-f708-4670-8e6f-195d03dd6803-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.286774 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.286756 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/806e8845-f708-4670-8e6f-195d03dd6803-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.291310 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.291285 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/806e8845-f708-4670-8e6f-195d03dd6803-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.291387 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.291310 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvlcj\" (UniqueName: \"kubernetes.io/projected/806e8845-f708-4670-8e6f-195d03dd6803-kube-api-access-jvlcj\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fg825f\" (UID: \"806e8845-f708-4670-8e6f-195d03dd6803\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.408654 ip-10-0-133-161 kubenswrapper[2572]: I0420 
07:58:45.408559 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:45.532547 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:45.532512 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f"] Apr 20 07:58:45.536027 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:58:45.535998 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod806e8845_f708_4670_8e6f_195d03dd6803.slice/crio-03445988906e4bb3f62844f942d1425a0ef0f597124abf1958d0d73ab76c1b8f WatchSource:0}: Error finding container 03445988906e4bb3f62844f942d1425a0ef0f597124abf1958d0d73ab76c1b8f: Status 404 returned error can't find the container with id 03445988906e4bb3f62844f942d1425a0ef0f597124abf1958d0d73ab76c1b8f Apr 20 07:58:46.393660 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:46.393600 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" event={"ID":"806e8845-f708-4670-8e6f-195d03dd6803","Type":"ContainerStarted","Data":"03445988906e4bb3f62844f942d1425a0ef0f597124abf1958d0d73ab76c1b8f"} Apr 20 07:58:47.922480 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:47.922439 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 20 07:58:47.922776 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:47.922515 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 20 07:58:47.922776 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:47.922549 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 20 07:58:48.401651 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:48.401619 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" event={"ID":"806e8845-f708-4670-8e6f-195d03dd6803","Type":"ContainerStarted","Data":"c31d26df2d7806e8389c4d32048d612b89d5ec70c7f80548827102850f863e88"} Apr 20 07:58:48.409437 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:48.409418 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:48.414225 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:48.414197 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:48.422408 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:48.422361 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" podStartSLOduration=1.037952006 podStartE2EDuration="3.422348059s" podCreationTimestamp="2026-04-20 07:58:45 +0000 UTC" firstStartedPulling="2026-04-20 07:58:45.537784209 +0000 UTC m=+509.451414234" lastFinishedPulling="2026-04-20 07:58:47.92218026 +0000 UTC m=+511.835810287" observedRunningTime="2026-04-20 
07:58:48.419983814 +0000 UTC m=+512.333613857" watchObservedRunningTime="2026-04-20 07:58:48.422348059 +0000 UTC m=+512.335978203" Apr 20 07:58:49.405212 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:49.405185 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:58:49.406213 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:58:49.406195 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fg825f" Apr 20 07:59:11.203762 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:11.202214 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-86459"] Apr 20 07:59:11.207328 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:11.207303 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-86459" Apr 20 07:59:11.210380 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:11.210358 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-6qcwb\"" Apr 20 07:59:11.210481 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:11.210359 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 07:59:11.211702 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:11.211685 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 07:59:11.213723 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:11.213699 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-86459"] Apr 20 07:59:11.308581 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:11.308536 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vjlq\" (UniqueName: \"kubernetes.io/projected/b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b-kube-api-access-4vjlq\") pod \"kuadrant-operator-catalog-86459\" (UID: \"b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b\") " pod="kuadrant-system/kuadrant-operator-catalog-86459" Apr 20 07:59:11.409150 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:11.409114 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vjlq\" (UniqueName: \"kubernetes.io/projected/b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b-kube-api-access-4vjlq\") pod \"kuadrant-operator-catalog-86459\" (UID: \"b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b\") " pod="kuadrant-system/kuadrant-operator-catalog-86459" Apr 20 07:59:11.417972 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:11.417944 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vjlq\" (UniqueName: \"kubernetes.io/projected/b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b-kube-api-access-4vjlq\") pod \"kuadrant-operator-catalog-86459\" (UID: \"b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b\") " pod="kuadrant-system/kuadrant-operator-catalog-86459" Apr 20 07:59:11.518933 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:11.518898 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-86459" Apr 20 07:59:11.574975 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:11.574945 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-86459"] Apr 20 07:59:11.639079 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:11.639053 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-86459"] Apr 20 07:59:11.641093 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:59:11.641065 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb30dbf88_bb0f_43ea_9e4d_1d7bbe096a1b.slice/crio-ea88bb73ac37841fbc2df9292fd2b66752a20859c59a1977f0f788178422e0f1 WatchSource:0}: Error finding container ea88bb73ac37841fbc2df9292fd2b66752a20859c59a1977f0f788178422e0f1: Status 404 returned error can't find the container with id ea88bb73ac37841fbc2df9292fd2b66752a20859c59a1977f0f788178422e0f1 Apr 20 07:59:11.782948 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:11.782855 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-t2984"] Apr 20 07:59:11.787553 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:11.787529 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-t2984" Apr 20 07:59:11.792713 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:11.792681 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-t2984"] Apr 20 07:59:11.811835 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:11.811809 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v74l6\" (UniqueName: \"kubernetes.io/projected/6695d048-f5af-437e-955c-d4c90e9d091d-kube-api-access-v74l6\") pod \"kuadrant-operator-catalog-t2984\" (UID: \"6695d048-f5af-437e-955c-d4c90e9d091d\") " pod="kuadrant-system/kuadrant-operator-catalog-t2984" Apr 20 07:59:11.912956 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:11.912921 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v74l6\" (UniqueName: \"kubernetes.io/projected/6695d048-f5af-437e-955c-d4c90e9d091d-kube-api-access-v74l6\") pod \"kuadrant-operator-catalog-t2984\" (UID: \"6695d048-f5af-437e-955c-d4c90e9d091d\") " pod="kuadrant-system/kuadrant-operator-catalog-t2984" Apr 20 07:59:11.921390 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:11.921369 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v74l6\" (UniqueName: \"kubernetes.io/projected/6695d048-f5af-437e-955c-d4c90e9d091d-kube-api-access-v74l6\") pod \"kuadrant-operator-catalog-t2984\" (UID: \"6695d048-f5af-437e-955c-d4c90e9d091d\") " pod="kuadrant-system/kuadrant-operator-catalog-t2984" Apr 20 07:59:12.098625 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:12.098536 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-t2984" Apr 20 07:59:12.222042 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:12.221984 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-t2984"] Apr 20 07:59:12.224413 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:59:12.224384 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6695d048_f5af_437e_955c_d4c90e9d091d.slice/crio-176001bd8dec4ca8fcdb95ba3c42cd15dba4436054cd68666ca0655f735f0ed0 WatchSource:0}: Error finding container 176001bd8dec4ca8fcdb95ba3c42cd15dba4436054cd68666ca0655f735f0ed0: Status 404 returned error can't find the container with id 176001bd8dec4ca8fcdb95ba3c42cd15dba4436054cd68666ca0655f735f0ed0 Apr 20 07:59:12.484900 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:12.484866 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-t2984" event={"ID":"6695d048-f5af-437e-955c-d4c90e9d091d","Type":"ContainerStarted","Data":"176001bd8dec4ca8fcdb95ba3c42cd15dba4436054cd68666ca0655f735f0ed0"} Apr 20 07:59:12.485804 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:12.485775 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-86459" event={"ID":"b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b","Type":"ContainerStarted","Data":"ea88bb73ac37841fbc2df9292fd2b66752a20859c59a1977f0f788178422e0f1"} Apr 20 07:59:14.494115 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:14.494064 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-86459" event={"ID":"b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b","Type":"ContainerStarted","Data":"04b5e11a3e495158d10d2aafc74e63c8504f72988aacfd3ebaf99d794dc1e2cd"} Apr 20 07:59:14.494115 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:14.494097 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-86459" podUID="b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b" containerName="registry-server" containerID="cri-o://04b5e11a3e495158d10d2aafc74e63c8504f72988aacfd3ebaf99d794dc1e2cd" gracePeriod=2 Apr 20 07:59:14.495351 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:14.495330 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-t2984" event={"ID":"6695d048-f5af-437e-955c-d4c90e9d091d","Type":"ContainerStarted","Data":"0dd3741e7e014ad6903d8f4b7948fecca81ef996b0aef9bd84a27e5786230aa3"} Apr 20 07:59:14.508778 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:14.508726 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-86459" podStartSLOduration=1.475172822 podStartE2EDuration="3.508709713s" podCreationTimestamp="2026-04-20 07:59:11 +0000 UTC" firstStartedPulling="2026-04-20 07:59:11.642433595 +0000 UTC m=+535.556063615" lastFinishedPulling="2026-04-20 07:59:13.675970484 +0000 UTC m=+537.589600506" observedRunningTime="2026-04-20 07:59:14.508684752 +0000 UTC m=+538.422314794" watchObservedRunningTime="2026-04-20 07:59:14.508709713 +0000 UTC m=+538.422339757" Apr 20 07:59:14.522674 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:14.522624 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-t2984" podStartSLOduration=2.071713267 podStartE2EDuration="3.52261067s" podCreationTimestamp="2026-04-20 
07:59:11 +0000 UTC" firstStartedPulling="2026-04-20 07:59:12.225957458 +0000 UTC m=+536.139587479" lastFinishedPulling="2026-04-20 07:59:13.676854848 +0000 UTC m=+537.590484882" observedRunningTime="2026-04-20 07:59:14.521888483 +0000 UTC m=+538.435518526" watchObservedRunningTime="2026-04-20 07:59:14.52261067 +0000 UTC m=+538.436240713" Apr 20 07:59:14.734658 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:14.734635 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-86459" Apr 20 07:59:14.834799 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:14.834717 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vjlq\" (UniqueName: \"kubernetes.io/projected/b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b-kube-api-access-4vjlq\") pod \"b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b\" (UID: \"b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b\") " Apr 20 07:59:14.836863 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:14.836820 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b-kube-api-access-4vjlq" (OuterVolumeSpecName: "kube-api-access-4vjlq") pod "b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b" (UID: "b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b"). InnerVolumeSpecName "kube-api-access-4vjlq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:59:14.935759 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:14.935723 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4vjlq\" (UniqueName: \"kubernetes.io/projected/b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b-kube-api-access-4vjlq\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 07:59:15.499906 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:15.499872 2572 generic.go:358] "Generic (PLEG): container finished" podID="b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b" containerID="04b5e11a3e495158d10d2aafc74e63c8504f72988aacfd3ebaf99d794dc1e2cd" exitCode=0 Apr 20 07:59:15.500329 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:15.499931 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-86459" Apr 20 07:59:15.500329 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:15.499963 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-86459" event={"ID":"b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b","Type":"ContainerDied","Data":"04b5e11a3e495158d10d2aafc74e63c8504f72988aacfd3ebaf99d794dc1e2cd"} Apr 20 07:59:15.500329 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:15.499998 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-86459" event={"ID":"b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b","Type":"ContainerDied","Data":"ea88bb73ac37841fbc2df9292fd2b66752a20859c59a1977f0f788178422e0f1"} Apr 20 07:59:15.500329 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:15.500015 2572 scope.go:117] "RemoveContainer" containerID="04b5e11a3e495158d10d2aafc74e63c8504f72988aacfd3ebaf99d794dc1e2cd" Apr 20 07:59:15.515233 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:15.515212 2572 scope.go:117] "RemoveContainer" containerID="04b5e11a3e495158d10d2aafc74e63c8504f72988aacfd3ebaf99d794dc1e2cd" Apr 20 07:59:15.515526 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:59:15.515507 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04b5e11a3e495158d10d2aafc74e63c8504f72988aacfd3ebaf99d794dc1e2cd\": container with ID starting with 04b5e11a3e495158d10d2aafc74e63c8504f72988aacfd3ebaf99d794dc1e2cd not found: ID does not exist" containerID="04b5e11a3e495158d10d2aafc74e63c8504f72988aacfd3ebaf99d794dc1e2cd" Apr 20 07:59:15.515586 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:15.515536 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04b5e11a3e495158d10d2aafc74e63c8504f72988aacfd3ebaf99d794dc1e2cd"} err="failed to get container status \"04b5e11a3e495158d10d2aafc74e63c8504f72988aacfd3ebaf99d794dc1e2cd\": rpc error: code = NotFound desc = could not find container \"04b5e11a3e495158d10d2aafc74e63c8504f72988aacfd3ebaf99d794dc1e2cd\": container with ID starting with 04b5e11a3e495158d10d2aafc74e63c8504f72988aacfd3ebaf99d794dc1e2cd not found: ID does not exist" Apr 20 07:59:15.525075 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:15.525049 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-86459"] Apr 20 07:59:15.528472 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:15.528445 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-86459"] Apr 20 07:59:16.646563 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:16.646532 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b" path="/var/lib/kubelet/pods/b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b/volumes" Apr 20 07:59:22.099237 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:22.099179 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-t2984" Apr 20 07:59:22.099779 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:22.099338 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-t2984" Apr 20 07:59:22.121067 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:22.121033 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-t2984" Apr 20 
07:59:22.546320 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:22.546296 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-t2984" Apr 20 07:59:29.045293 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.045259 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c5fff674-2tc24"] Apr 20 07:59:29.045696 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.045605 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b" containerName="registry-server" Apr 20 07:59:29.045696 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.045617 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b" containerName="registry-server" Apr 20 07:59:29.045696 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.045673 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="b30dbf88-bb0f-43ea-9e4d-1d7bbe096a1b" containerName="registry-server" Apr 20 07:59:29.054308 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.054285 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.059989 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.059959 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c5fff674-2tc24"] Apr 20 07:59:29.158614 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.158575 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0f25198-a444-44da-8838-ebd671c31732-oauth-serving-cert\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.158794 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.158629 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0f25198-a444-44da-8838-ebd671c31732-console-serving-cert\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.158794 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.158663 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0f25198-a444-44da-8838-ebd671c31732-trusted-ca-bundle\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.158794 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.158755 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0f25198-a444-44da-8838-ebd671c31732-console-config\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.158914 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.158812 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0f25198-a444-44da-8838-ebd671c31732-console-oauth-config\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " 
pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.158914 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.158841 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx5d7\" (UniqueName: \"kubernetes.io/projected/c0f25198-a444-44da-8838-ebd671c31732-kube-api-access-xx5d7\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.158914 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.158870 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0f25198-a444-44da-8838-ebd671c31732-service-ca\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.260164 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.260102 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0f25198-a444-44da-8838-ebd671c31732-console-config\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.260342 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.260176 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0f25198-a444-44da-8838-ebd671c31732-console-oauth-config\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.260342 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.260205 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xx5d7\" (UniqueName: \"kubernetes.io/projected/c0f25198-a444-44da-8838-ebd671c31732-kube-api-access-xx5d7\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.260342 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.260242 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0f25198-a444-44da-8838-ebd671c31732-service-ca\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.260342 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.260293 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0f25198-a444-44da-8838-ebd671c31732-oauth-serving-cert\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.260342 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.260327 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0f25198-a444-44da-8838-ebd671c31732-console-serving-cert\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.260612 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.260364 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c0f25198-a444-44da-8838-ebd671c31732-trusted-ca-bundle\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.260962 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.260933 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0f25198-a444-44da-8838-ebd671c31732-console-config\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.261101 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.260987 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0f25198-a444-44da-8838-ebd671c31732-service-ca\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.261237 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.261215 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0f25198-a444-44da-8838-ebd671c31732-oauth-serving-cert\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.261313 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.261294 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0f25198-a444-44da-8838-ebd671c31732-trusted-ca-bundle\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.262763 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.262743 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0f25198-a444-44da-8838-ebd671c31732-console-oauth-config\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.262954 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.262937 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0f25198-a444-44da-8838-ebd671c31732-console-serving-cert\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.267525 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.267506 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx5d7\" (UniqueName: \"kubernetes.io/projected/c0f25198-a444-44da-8838-ebd671c31732-kube-api-access-xx5d7\") pod \"console-c5fff674-2tc24\" (UID: \"c0f25198-a444-44da-8838-ebd671c31732\") " pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.364252 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.364169 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:29.486347 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.486321 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c5fff674-2tc24"] Apr 20 07:59:29.488458 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:59:29.488424 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0f25198_a444_44da_8838_ebd671c31732.slice/crio-7318b848fa0dfbd8339a37a15bdb8cc92bf7513092081c9fa7527f1217f93469 WatchSource:0}: Error finding container 7318b848fa0dfbd8339a37a15bdb8cc92bf7513092081c9fa7527f1217f93469: Status 404 returned error can't find the container with id 7318b848fa0dfbd8339a37a15bdb8cc92bf7513092081c9fa7527f1217f93469 Apr 20 07:59:29.551237 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:29.551211 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5fff674-2tc24" event={"ID":"c0f25198-a444-44da-8838-ebd671c31732","Type":"ContainerStarted","Data":"7318b848fa0dfbd8339a37a15bdb8cc92bf7513092081c9fa7527f1217f93469"} Apr 20 07:59:30.555993 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:30.555961 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c5fff674-2tc24" event={"ID":"c0f25198-a444-44da-8838-ebd671c31732","Type":"ContainerStarted","Data":"be10f47d2970b93eb4e05878860a6e8dee8ec3d5794b8ce9287b86ab5599f91c"} Apr 20 07:59:30.576250 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:30.576199 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c5fff674-2tc24" podStartSLOduration=1.5761830570000002 podStartE2EDuration="1.576183057s" podCreationTimestamp="2026-04-20 07:59:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:59:30.574454487 +0000 UTC m=+554.488084543" watchObservedRunningTime="2026-04-20 07:59:30.576183057 +0000 UTC m=+554.489813099" Apr 20 07:59:39.365006 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:39.364971 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:39.365407 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:39.365041 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:39.369699 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:39.369676 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:39.588588 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:39.588558 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c5fff674-2tc24" Apr 20 07:59:39.645169 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:39.645070 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55b5d5c58d-k48ss"] Apr 20 07:59:44.309036 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:44.309000 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8"] Apr 20 07:59:44.317237 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:44.317213 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" Apr 20 07:59:44.320729 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:44.320703 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-6dggc\"" Apr 20 07:59:44.325801 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:44.325783 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8"] Apr 20 07:59:44.397118 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:44.397076 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7b7c173e-13e1-46a9-b092-4f0379a57fb0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-448m8\" (UID: \"7b7c173e-13e1-46a9-b092-4f0379a57fb0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" Apr 20 07:59:44.397334 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:44.397165 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x9tl\" (UniqueName: \"kubernetes.io/projected/7b7c173e-13e1-46a9-b092-4f0379a57fb0-kube-api-access-2x9tl\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-448m8\" (UID: \"7b7c173e-13e1-46a9-b092-4f0379a57fb0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" Apr 20 07:59:44.498362 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:44.498325 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7b7c173e-13e1-46a9-b092-4f0379a57fb0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-448m8\" (UID: \"7b7c173e-13e1-46a9-b092-4f0379a57fb0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" Apr 20 07:59:44.498559 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:44.498392 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2x9tl\" (UniqueName: \"kubernetes.io/projected/7b7c173e-13e1-46a9-b092-4f0379a57fb0-kube-api-access-2x9tl\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-448m8\" (UID: \"7b7c173e-13e1-46a9-b092-4f0379a57fb0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" Apr 20 07:59:44.498773 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:44.498747 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7b7c173e-13e1-46a9-b092-4f0379a57fb0-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-448m8\" (UID: \"7b7c173e-13e1-46a9-b092-4f0379a57fb0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" Apr 20 07:59:44.512596 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:44.512568 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x9tl\" (UniqueName: \"kubernetes.io/projected/7b7c173e-13e1-46a9-b092-4f0379a57fb0-kube-api-access-2x9tl\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-448m8\" (UID: \"7b7c173e-13e1-46a9-b092-4f0379a57fb0\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" Apr 20 07:59:44.629274 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:44.629193 2572 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" Apr 20 07:59:44.765382 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:44.765356 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8"] Apr 20 07:59:44.767126 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:59:44.767088 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b7c173e_13e1_46a9_b092_4f0379a57fb0.slice/crio-0ff1e29c57aeaefef3b0904ced0e81eb1eeeeb357e723c84309e2888c8d207e9 WatchSource:0}: Error finding container 0ff1e29c57aeaefef3b0904ced0e81eb1eeeeb357e723c84309e2888c8d207e9: Status 404 returned error can't find the container with id 0ff1e29c57aeaefef3b0904ced0e81eb1eeeeb357e723c84309e2888c8d207e9 Apr 20 07:59:45.605860 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:45.605822 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" event={"ID":"7b7c173e-13e1-46a9-b092-4f0379a57fb0","Type":"ContainerStarted","Data":"0ff1e29c57aeaefef3b0904ced0e81eb1eeeeb357e723c84309e2888c8d207e9"} Apr 20 07:59:49.450285 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:49.450207 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-g6z8n"] Apr 20 07:59:49.453574 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:49.453558 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-g6z8n" Apr 20 07:59:49.456611 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:49.456593 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 20 07:59:49.456677 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:49.456594 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-xv5pv\"" Apr 20 07:59:49.467050 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:49.467022 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-g6z8n"] Apr 20 07:59:49.550412 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:49.550363 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmn45\" (UniqueName: \"kubernetes.io/projected/5950397d-fe46-4d7f-9ca1-584e4f8ee9f7-kube-api-access-cmn45\") pod \"dns-operator-controller-manager-648d5c98bc-g6z8n\" (UID: \"5950397d-fe46-4d7f-9ca1-584e4f8ee9f7\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-g6z8n" Apr 20 07:59:49.623062 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:49.623029 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" event={"ID":"7b7c173e-13e1-46a9-b092-4f0379a57fb0","Type":"ContainerStarted","Data":"f489b939931f04b9f6a8d5cfd5778ef78b5ac337776f0616db0ee28610b700d7"} Apr 20 07:59:49.623271 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:49.623170 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" Apr 20 07:59:49.644520 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:49.644445 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" podStartSLOduration=1.259888371 podStartE2EDuration="5.644425041s" podCreationTimestamp="2026-04-20 07:59:44 +0000 UTC" firstStartedPulling="2026-04-20 07:59:44.769633904 +0000 UTC m=+568.683263924" lastFinishedPulling="2026-04-20 07:59:49.154170569 +0000 UTC m=+573.067800594" observedRunningTime="2026-04-20 07:59:49.642338589 +0000 UTC m=+573.555968632" watchObservedRunningTime="2026-04-20 07:59:49.644425041 +0000 UTC m=+573.558055085" Apr 20 07:59:49.651704 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:49.651671 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmn45\" (UniqueName: \"kubernetes.io/projected/5950397d-fe46-4d7f-9ca1-584e4f8ee9f7-kube-api-access-cmn45\") pod \"dns-operator-controller-manager-648d5c98bc-g6z8n\" (UID: \"5950397d-fe46-4d7f-9ca1-584e4f8ee9f7\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-g6z8n" Apr 20 07:59:49.660489 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:49.660456 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmn45\" (UniqueName: \"kubernetes.io/projected/5950397d-fe46-4d7f-9ca1-584e4f8ee9f7-kube-api-access-cmn45\") pod \"dns-operator-controller-manager-648d5c98bc-g6z8n\" (UID: \"5950397d-fe46-4d7f-9ca1-584e4f8ee9f7\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-g6z8n" Apr 20 07:59:49.764122 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:49.764087 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-g6z8n" Apr 20 07:59:49.893859 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:49.893805 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-g6z8n"] Apr 20 07:59:49.896048 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:59:49.896018 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5950397d_fe46_4d7f_9ca1_584e4f8ee9f7.slice/crio-4627de3361293def0046a5c6bceb3c3fbaace722493dd9192017458baa78cc48 WatchSource:0}: Error finding container 4627de3361293def0046a5c6bceb3c3fbaace722493dd9192017458baa78cc48: Status 404 returned error can't find the container with id 4627de3361293def0046a5c6bceb3c3fbaace722493dd9192017458baa78cc48 Apr 20 07:59:50.628359 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:50.628319 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-g6z8n" event={"ID":"5950397d-fe46-4d7f-9ca1-584e4f8ee9f7","Type":"ContainerStarted","Data":"4627de3361293def0046a5c6bceb3c3fbaace722493dd9192017458baa78cc48"} Apr 20 07:59:52.169342 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:52.169259 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q22cn"] Apr 20 07:59:52.172963 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:52.172939 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q22cn" Apr 20 07:59:52.175729 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:52.175709 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 20 07:59:52.175836 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:52.175712 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-7v55j\"" Apr 20 07:59:52.175836 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:52.175712 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 20 07:59:52.182209 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:52.182183 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q22cn"] Apr 20 07:59:52.279567 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:52.279523 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0750385-e266-4f01-b87d-6b1800b78342-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-q22cn\" (UID: \"d0750385-e266-4f01-b87d-6b1800b78342\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q22cn" Apr 20 07:59:52.279567 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:52.279565 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d0750385-e266-4f01-b87d-6b1800b78342-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-q22cn\" (UID: \"d0750385-e266-4f01-b87d-6b1800b78342\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q22cn" Apr 20 07:59:52.279782 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:52.279594 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcfxm\" (UniqueName: \"kubernetes.io/projected/d0750385-e266-4f01-b87d-6b1800b78342-kube-api-access-wcfxm\") pod \"kuadrant-console-plugin-6cb54b5c86-q22cn\" (UID: \"d0750385-e266-4f01-b87d-6b1800b78342\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q22cn" Apr 20 07:59:52.381154 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:52.381114 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0750385-e266-4f01-b87d-6b1800b78342-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-q22cn\" (UID: \"d0750385-e266-4f01-b87d-6b1800b78342\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q22cn" Apr 20 07:59:52.381302 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:52.381174 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d0750385-e266-4f01-b87d-6b1800b78342-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-q22cn\" (UID: \"d0750385-e266-4f01-b87d-6b1800b78342\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q22cn" Apr 20 07:59:52.381302 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:52.381204 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcfxm\" (UniqueName: \"kubernetes.io/projected/d0750385-e266-4f01-b87d-6b1800b78342-kube-api-access-wcfxm\") pod \"kuadrant-console-plugin-6cb54b5c86-q22cn\" (UID: 
\"d0750385-e266-4f01-b87d-6b1800b78342\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q22cn" Apr 20 07:59:52.381302 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:59:52.381281 2572 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 20 07:59:52.381445 ip-10-0-133-161 kubenswrapper[2572]: E0420 07:59:52.381358 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0750385-e266-4f01-b87d-6b1800b78342-plugin-serving-cert podName:d0750385-e266-4f01-b87d-6b1800b78342 nodeName:}" failed. No retries permitted until 2026-04-20 07:59:52.881336766 +0000 UTC m=+576.794966791 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/d0750385-e266-4f01-b87d-6b1800b78342-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-q22cn" (UID: "d0750385-e266-4f01-b87d-6b1800b78342") : secret "plugin-serving-cert" not found Apr 20 07:59:52.381840 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:52.381821 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d0750385-e266-4f01-b87d-6b1800b78342-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-q22cn\" (UID: \"d0750385-e266-4f01-b87d-6b1800b78342\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q22cn" Apr 20 07:59:52.390867 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:52.390843 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcfxm\" (UniqueName: \"kubernetes.io/projected/d0750385-e266-4f01-b87d-6b1800b78342-kube-api-access-wcfxm\") pod \"kuadrant-console-plugin-6cb54b5c86-q22cn\" (UID: \"d0750385-e266-4f01-b87d-6b1800b78342\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q22cn" Apr 20 07:59:52.637823 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:52.637784 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-g6z8n" event={"ID":"5950397d-fe46-4d7f-9ca1-584e4f8ee9f7","Type":"ContainerStarted","Data":"742273177a690ff500cc13dbefeef948893c6fb5c9605240e94f4e78139b36e5"} Apr 20 07:59:52.637992 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:52.637910 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-g6z8n" Apr 20 07:59:52.660277 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:52.660198 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-g6z8n" podStartSLOduration=1.676846018 podStartE2EDuration="3.660183161s" podCreationTimestamp="2026-04-20 07:59:49 +0000 UTC" firstStartedPulling="2026-04-20 07:59:49.897874995 +0000 UTC m=+573.811505015" lastFinishedPulling="2026-04-20 07:59:51.881212134 +0000 UTC m=+575.794842158" observedRunningTime="2026-04-20 07:59:52.659200418 +0000 UTC m=+576.572830461" watchObservedRunningTime="2026-04-20 07:59:52.660183161 +0000 UTC m=+576.573813204" Apr 20 07:59:52.885428 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:52.885370 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0750385-e266-4f01-b87d-6b1800b78342-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-q22cn\" (UID: \"d0750385-e266-4f01-b87d-6b1800b78342\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q22cn" Apr 20 07:59:52.887791 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:52.887770 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/d0750385-e266-4f01-b87d-6b1800b78342-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-q22cn\" (UID: \"d0750385-e266-4f01-b87d-6b1800b78342\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q22cn" Apr 20 07:59:53.084001 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:53.083966 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q22cn" Apr 20 07:59:53.205974 ip-10-0-133-161 kubenswrapper[2572]: W0420 07:59:53.205942 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0750385_e266_4f01_b87d_6b1800b78342.slice/crio-0155896ce3f1df319f7579790e9953df6176524edecb5f4d03768837202455d1 WatchSource:0}: Error finding container 0155896ce3f1df319f7579790e9953df6176524edecb5f4d03768837202455d1: Status 404 returned error can't find the container with id 0155896ce3f1df319f7579790e9953df6176524edecb5f4d03768837202455d1 Apr 20 07:59:53.211226 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:53.211208 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q22cn"] Apr 20 07:59:53.641869 ip-10-0-133-161 kubenswrapper[2572]: I0420 07:59:53.641833 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q22cn" event={"ID":"d0750385-e266-4f01-b87d-6b1800b78342","Type":"ContainerStarted","Data":"0155896ce3f1df319f7579790e9953df6176524edecb5f4d03768837202455d1"} Apr 20 08:00:00.631445 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:00.631416 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" Apr 20 08:00:02.333930 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:02.333891 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8"] Apr 20 08:00:02.334435 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:02.334169 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" podUID="7b7c173e-13e1-46a9-b092-4f0379a57fb0" containerName="manager" containerID="cri-o://f489b939931f04b9f6a8d5cfd5778ef78b5ac337776f0616db0ee28610b700d7" gracePeriod=2 Apr 20 08:00:02.342616 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:02.342582 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8"] Apr 20 08:00:02.353788 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:02.353754 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd"] Apr 20 08:00:02.354288 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:02.354269 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b7c173e-13e1-46a9-b092-4f0379a57fb0" containerName="manager" Apr 20 08:00:02.354374 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:02.354292 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b7c173e-13e1-46a9-b092-4f0379a57fb0" containerName="manager" Apr 20 
08:00:02.354423 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:02.354414 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b7c173e-13e1-46a9-b092-4f0379a57fb0" containerName="manager" Apr 20 08:00:02.360012 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:02.359961 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd" Apr 20 08:00:02.362580 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:02.362483 2572 status_manager.go:895] "Failed to get status for pod" podUID="7b7c173e-13e1-46a9-b092-4f0379a57fb0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-448m8\" is forbidden: User \"system:node:ip-10-0-133-161.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-161.ec2.internal' and this object" Apr 20 08:00:02.370510 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:02.369908 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd"] Apr 20 08:00:02.475131 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:02.475083 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c8e884a2-1bec-4439-8d6e-4d6a101a8123-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-txprd\" (UID: \"c8e884a2-1bec-4439-8d6e-4d6a101a8123\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd" Apr 20 08:00:02.475341 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:02.475277 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4htk5\" (UniqueName: \"kubernetes.io/projected/c8e884a2-1bec-4439-8d6e-4d6a101a8123-kube-api-access-4htk5\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-txprd\" (UID: \"c8e884a2-1bec-4439-8d6e-4d6a101a8123\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd" Apr 20 08:00:02.576300 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:02.576256 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c8e884a2-1bec-4439-8d6e-4d6a101a8123-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-txprd\" (UID: \"c8e884a2-1bec-4439-8d6e-4d6a101a8123\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd" Apr 20 08:00:02.576498 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:02.576384 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4htk5\" (UniqueName: \"kubernetes.io/projected/c8e884a2-1bec-4439-8d6e-4d6a101a8123-kube-api-access-4htk5\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-txprd\" (UID: \"c8e884a2-1bec-4439-8d6e-4d6a101a8123\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd" Apr 20 08:00:02.576686 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:02.576662 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c8e884a2-1bec-4439-8d6e-4d6a101a8123-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-txprd\" (UID: 
\"c8e884a2-1bec-4439-8d6e-4d6a101a8123\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd" Apr 20 08:00:02.589054 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:02.588969 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4htk5\" (UniqueName: \"kubernetes.io/projected/c8e884a2-1bec-4439-8d6e-4d6a101a8123-kube-api-access-4htk5\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-txprd\" (UID: \"c8e884a2-1bec-4439-8d6e-4d6a101a8123\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd" Apr 20 08:00:02.739517 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:02.739471 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd" Apr 20 08:00:03.645409 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:03.645376 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-g6z8n" Apr 20 08:00:04.666924 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:04.666853 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-55b5d5c58d-k48ss" podUID="36381530-e8a0-4b66-87f4-c815e4685fbf" containerName="console" containerID="cri-o://94678a55169bc555a0068f9fc6ec1c2a7744096a71e8e76b89e5a74c1a746b54" gracePeriod=15 Apr 20 08:00:04.829473 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:04.829417 2572 patch_prober.go:28] interesting pod/console-55b5d5c58d-k48ss container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.134.0.26:8443/health\": dial tcp 10.134.0.26:8443: connect: connection refused" start-of-body= Apr 20 08:00:04.829647 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:04.829491 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-55b5d5c58d-k48ss" podUID="36381530-e8a0-4b66-87f4-c815e4685fbf" containerName="console" probeResult="failure" output="Get \"https://10.134.0.26:8443/health\": dial tcp 10.134.0.26:8443: connect: connection refused" Apr 20 08:00:14.742962 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.742824 2572 generic.go:358] "Generic (PLEG): container finished" podID="7b7c173e-13e1-46a9-b092-4f0379a57fb0" containerID="f489b939931f04b9f6a8d5cfd5778ef78b5ac337776f0616db0ee28610b700d7" exitCode=0 Apr 20 08:00:14.745544 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.745357 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55b5d5c58d-k48ss_36381530-e8a0-4b66-87f4-c815e4685fbf/console/0.log" Apr 20 08:00:14.745544 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.745396 2572 generic.go:358] "Generic (PLEG): container finished" podID="36381530-e8a0-4b66-87f4-c815e4685fbf" containerID="94678a55169bc555a0068f9fc6ec1c2a7744096a71e8e76b89e5a74c1a746b54" exitCode=2 Apr 20 08:00:14.745544 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.745475 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55b5d5c58d-k48ss" event={"ID":"36381530-e8a0-4b66-87f4-c815e4685fbf","Type":"ContainerDied","Data":"94678a55169bc555a0068f9fc6ec1c2a7744096a71e8e76b89e5a74c1a746b54"} Apr 20 08:00:14.797949 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.797927 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55b5d5c58d-k48ss_36381530-e8a0-4b66-87f4-c815e4685fbf/console/0.log" Apr 20 08:00:14.798082 
ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.798003 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 08:00:14.803950 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.803924 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd"] Apr 20 08:00:14.805068 ip-10-0-133-161 kubenswrapper[2572]: W0420 08:00:14.805043 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8e884a2_1bec_4439_8d6e_4d6a101a8123.slice/crio-47a0587d0ddaf223a2d4c8b7ca8920dd9288e93b9510b3969ff209453180a8f9 WatchSource:0}: Error finding container 47a0587d0ddaf223a2d4c8b7ca8920dd9288e93b9510b3969ff209453180a8f9: Status 404 returned error can't find the container with id 47a0587d0ddaf223a2d4c8b7ca8920dd9288e93b9510b3969ff209453180a8f9 Apr 20 08:00:14.813161 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.813125 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" Apr 20 08:00:14.822936 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.822904 2572 status_manager.go:895] "Failed to get status for pod" podUID="7b7c173e-13e1-46a9-b092-4f0379a57fb0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-448m8\" is forbidden: User \"system:node:ip-10-0-133-161.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-161.ec2.internal' and this object" Apr 20 08:00:14.895090 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.895063 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-oauth-serving-cert\") pod \"36381530-e8a0-4b66-87f4-c815e4685fbf\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " Apr 20 08:00:14.895225 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.895113 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdplr\" (UniqueName: \"kubernetes.io/projected/36381530-e8a0-4b66-87f4-c815e4685fbf-kube-api-access-sdplr\") pod \"36381530-e8a0-4b66-87f4-c815e4685fbf\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " Apr 20 08:00:14.895225 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.895178 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-service-ca\") pod \"36381530-e8a0-4b66-87f4-c815e4685fbf\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " Apr 20 08:00:14.895330 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.895224 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-trusted-ca-bundle\") pod \"36381530-e8a0-4b66-87f4-c815e4685fbf\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " Apr 20 08:00:14.895330 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.895243 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-console-config\") 
pod \"36381530-e8a0-4b66-87f4-c815e4685fbf\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " Apr 20 08:00:14.895330 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.895273 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36381530-e8a0-4b66-87f4-c815e4685fbf-console-serving-cert\") pod \"36381530-e8a0-4b66-87f4-c815e4685fbf\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " Apr 20 08:00:14.895636 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.895585 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "36381530-e8a0-4b66-87f4-c815e4685fbf" (UID: "36381530-e8a0-4b66-87f4-c815e4685fbf"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 08:00:14.895636 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.895623 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-console-config" (OuterVolumeSpecName: "console-config") pod "36381530-e8a0-4b66-87f4-c815e4685fbf" (UID: "36381530-e8a0-4b66-87f4-c815e4685fbf"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 08:00:14.895818 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.895634 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-service-ca" (OuterVolumeSpecName: "service-ca") pod "36381530-e8a0-4b66-87f4-c815e4685fbf" (UID: "36381530-e8a0-4b66-87f4-c815e4685fbf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 08:00:14.895818 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.895653 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "36381530-e8a0-4b66-87f4-c815e4685fbf" (UID: "36381530-e8a0-4b66-87f4-c815e4685fbf"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 08:00:14.895818 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.895677 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7b7c173e-13e1-46a9-b092-4f0379a57fb0-extensions-socket-volume\") pod \"7b7c173e-13e1-46a9-b092-4f0379a57fb0\" (UID: \"7b7c173e-13e1-46a9-b092-4f0379a57fb0\") " Apr 20 08:00:14.895818 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.895744 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36381530-e8a0-4b66-87f4-c815e4685fbf-console-oauth-config\") pod \"36381530-e8a0-4b66-87f4-c815e4685fbf\" (UID: \"36381530-e8a0-4b66-87f4-c815e4685fbf\") " Apr 20 08:00:14.895818 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.895776 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x9tl\" (UniqueName: \"kubernetes.io/projected/7b7c173e-13e1-46a9-b092-4f0379a57fb0-kube-api-access-2x9tl\") pod \"7b7c173e-13e1-46a9-b092-4f0379a57fb0\" (UID: \"7b7c173e-13e1-46a9-b092-4f0379a57fb0\") " Apr 20 08:00:14.896457 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.896207 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-service-ca\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:00:14.896457 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.896230 2572 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-trusted-ca-bundle\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:00:14.896457 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.896246 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-console-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:00:14.896457 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.896261 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/36381530-e8a0-4b66-87f4-c815e4685fbf-oauth-serving-cert\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:00:14.896457 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.896307 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b7c173e-13e1-46a9-b092-4f0379a57fb0-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "7b7c173e-13e1-46a9-b092-4f0379a57fb0" (UID: "7b7c173e-13e1-46a9-b092-4f0379a57fb0"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 08:00:14.897709 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.897685 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36381530-e8a0-4b66-87f4-c815e4685fbf-kube-api-access-sdplr" (OuterVolumeSpecName: "kube-api-access-sdplr") pod "36381530-e8a0-4b66-87f4-c815e4685fbf" (UID: "36381530-e8a0-4b66-87f4-c815e4685fbf"). InnerVolumeSpecName "kube-api-access-sdplr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 08:00:14.898026 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.897995 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36381530-e8a0-4b66-87f4-c815e4685fbf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "36381530-e8a0-4b66-87f4-c815e4685fbf" (UID: "36381530-e8a0-4b66-87f4-c815e4685fbf"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 08:00:14.898105 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.898039 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b7c173e-13e1-46a9-b092-4f0379a57fb0-kube-api-access-2x9tl" (OuterVolumeSpecName: "kube-api-access-2x9tl") pod "7b7c173e-13e1-46a9-b092-4f0379a57fb0" (UID: "7b7c173e-13e1-46a9-b092-4f0379a57fb0"). InnerVolumeSpecName "kube-api-access-2x9tl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 08:00:14.898105 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.898070 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36381530-e8a0-4b66-87f4-c815e4685fbf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "36381530-e8a0-4b66-87f4-c815e4685fbf" (UID: "36381530-e8a0-4b66-87f4-c815e4685fbf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 08:00:14.996980 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.996931 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sdplr\" (UniqueName: \"kubernetes.io/projected/36381530-e8a0-4b66-87f4-c815e4685fbf-kube-api-access-sdplr\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:00:14.996980 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.996970 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/36381530-e8a0-4b66-87f4-c815e4685fbf-console-serving-cert\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:00:14.996980 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.996980 2572 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/7b7c173e-13e1-46a9-b092-4f0379a57fb0-extensions-socket-volume\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:00:14.996980 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.996991 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/36381530-e8a0-4b66-87f4-c815e4685fbf-console-oauth-config\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:00:14.997306 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:14.997001 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2x9tl\" (UniqueName: \"kubernetes.io/projected/7b7c173e-13e1-46a9-b092-4f0379a57fb0-kube-api-access-2x9tl\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:00:15.750080 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:15.750040 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q22cn" event={"ID":"d0750385-e266-4f01-b87d-6b1800b78342","Type":"ContainerStarted","Data":"96b91efa3f919b920edc9b5c98bff110fdb2c9327f0f2ea57c0b27f922527ca4"} Apr 20 08:00:15.751570 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:15.751542 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd" event={"ID":"c8e884a2-1bec-4439-8d6e-4d6a101a8123","Type":"ContainerStarted","Data":"7bbd4c086a0f9298e21482651288ca5bfd257975bc2def6d71106edccedfde33"} Apr 20 08:00:15.751704 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:15.751575 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd" event={"ID":"c8e884a2-1bec-4439-8d6e-4d6a101a8123","Type":"ContainerStarted","Data":"47a0587d0ddaf223a2d4c8b7ca8920dd9288e93b9510b3969ff209453180a8f9"} Apr 20 08:00:15.751704 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:15.751659 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd" Apr 20 08:00:15.752680 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:15.752662 2572 scope.go:117] "RemoveContainer" containerID="f489b939931f04b9f6a8d5cfd5778ef78b5ac337776f0616db0ee28610b700d7" Apr 20 08:00:15.752760 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:15.752666 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" Apr 20 08:00:15.753962 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:15.753942 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-55b5d5c58d-k48ss_36381530-e8a0-4b66-87f4-c815e4685fbf/console/0.log" Apr 20 08:00:15.754065 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:15.754021 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-55b5d5c58d-k48ss" event={"ID":"36381530-e8a0-4b66-87f4-c815e4685fbf","Type":"ContainerDied","Data":"0dce5dea4a8d61dd4bdbd1182b7a43e14b50e9df9968594488de48b9f75dfc64"} Apr 20 08:00:15.754125 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:15.754072 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-55b5d5c58d-k48ss" Apr 20 08:00:15.762307 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:15.762232 2572 scope.go:117] "RemoveContainer" containerID="94678a55169bc555a0068f9fc6ec1c2a7744096a71e8e76b89e5a74c1a746b54" Apr 20 08:00:15.766062 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:15.766018 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-q22cn" podStartSLOduration=2.262875287 podStartE2EDuration="23.766004678s" podCreationTimestamp="2026-04-20 07:59:52 +0000 UTC" firstStartedPulling="2026-04-20 07:59:53.207302067 +0000 UTC m=+577.120932093" lastFinishedPulling="2026-04-20 08:00:14.710431449 +0000 UTC m=+598.624061484" observedRunningTime="2026-04-20 08:00:15.765180069 +0000 UTC m=+599.678810114" watchObservedRunningTime="2026-04-20 08:00:15.766004678 +0000 UTC m=+599.679634723" Apr 20 08:00:15.767451 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:15.767422 2572 status_manager.go:895] "Failed to get status for pod" podUID="7b7c173e-13e1-46a9-b092-4f0379a57fb0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-448m8\" is forbidden: User \"system:node:ip-10-0-133-161.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-161.ec2.internal' and this object" Apr 20 08:00:15.791405 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:15.791347 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd" podStartSLOduration=13.7913283 podStartE2EDuration="13.7913283s" podCreationTimestamp="2026-04-20 08:00:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 08:00:15.789638219 +0000 UTC m=+599.703268262" watchObservedRunningTime="2026-04-20 08:00:15.7913283 +0000 UTC m=+599.704958345" Apr 20 08:00:15.803555 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:15.803525 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-55b5d5c58d-k48ss"] Apr 20 08:00:15.806820 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:15.806788 2572 status_manager.go:895] "Failed to get status for pod" podUID="7b7c173e-13e1-46a9-b092-4f0379a57fb0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-448m8\" is forbidden: User \"system:node:ip-10-0-133-161.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-161.ec2.internal' and this object" Apr 20 08:00:15.807006 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:15.806985 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-55b5d5c58d-k48ss"] Apr 20 08:00:16.580656 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:16.580631 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/1.log" Apr 20 08:00:16.580835 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:16.580781 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/1.log" Apr 20 08:00:16.584814 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:16.584796 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 08:00:16.584814 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:16.584814 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 08:00:16.646133 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:16.646099 2572 status_manager.go:895] "Failed to get status for pod" podUID="7b7c173e-13e1-46a9-b092-4f0379a57fb0" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-448m8" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-448m8\" is forbidden: User \"system:node:ip-10-0-133-161.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-161.ec2.internal' and this object" Apr 20 08:00:16.646884 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:16.646863 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36381530-e8a0-4b66-87f4-c815e4685fbf" path="/var/lib/kubelet/pods/36381530-e8a0-4b66-87f4-c815e4685fbf/volumes" Apr 20 08:00:16.647335 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:16.647319 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b7c173e-13e1-46a9-b092-4f0379a57fb0" path="/var/lib/kubelet/pods/7b7c173e-13e1-46a9-b092-4f0379a57fb0/volumes" Apr 20 08:00:20.330171 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:20.330122 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd"] Apr 20 08:00:20.330620 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:20.330374 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd" podUID="c8e884a2-1bec-4439-8d6e-4d6a101a8123" containerName="manager" containerID="cri-o://7bbd4c086a0f9298e21482651288ca5bfd257975bc2def6d71106edccedfde33" gracePeriod=10 Apr 20 08:00:20.332382 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:20.332356 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd" Apr 20 08:00:20.579649 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:20.579625 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd" Apr 20 08:00:20.746873 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:20.746832 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4htk5\" (UniqueName: \"kubernetes.io/projected/c8e884a2-1bec-4439-8d6e-4d6a101a8123-kube-api-access-4htk5\") pod \"c8e884a2-1bec-4439-8d6e-4d6a101a8123\" (UID: \"c8e884a2-1bec-4439-8d6e-4d6a101a8123\") " Apr 20 08:00:20.747046 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:20.746931 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c8e884a2-1bec-4439-8d6e-4d6a101a8123-extensions-socket-volume\") pod \"c8e884a2-1bec-4439-8d6e-4d6a101a8123\" (UID: \"c8e884a2-1bec-4439-8d6e-4d6a101a8123\") " Apr 20 08:00:20.747390 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:20.747354 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8e884a2-1bec-4439-8d6e-4d6a101a8123-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "c8e884a2-1bec-4439-8d6e-4d6a101a8123" (UID: "c8e884a2-1bec-4439-8d6e-4d6a101a8123"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 08:00:20.748979 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:20.748945 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8e884a2-1bec-4439-8d6e-4d6a101a8123-kube-api-access-4htk5" (OuterVolumeSpecName: "kube-api-access-4htk5") pod "c8e884a2-1bec-4439-8d6e-4d6a101a8123" (UID: "c8e884a2-1bec-4439-8d6e-4d6a101a8123"). InnerVolumeSpecName "kube-api-access-4htk5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 08:00:20.772080 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:20.772045 2572 generic.go:358] "Generic (PLEG): container finished" podID="c8e884a2-1bec-4439-8d6e-4d6a101a8123" containerID="7bbd4c086a0f9298e21482651288ca5bfd257975bc2def6d71106edccedfde33" exitCode=0 Apr 20 08:00:20.772220 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:20.772109 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd" Apr 20 08:00:20.772220 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:20.772132 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd" event={"ID":"c8e884a2-1bec-4439-8d6e-4d6a101a8123","Type":"ContainerDied","Data":"7bbd4c086a0f9298e21482651288ca5bfd257975bc2def6d71106edccedfde33"} Apr 20 08:00:20.772220 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:20.772182 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd" event={"ID":"c8e884a2-1bec-4439-8d6e-4d6a101a8123","Type":"ContainerDied","Data":"47a0587d0ddaf223a2d4c8b7ca8920dd9288e93b9510b3969ff209453180a8f9"} Apr 20 08:00:20.772220 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:20.772196 2572 scope.go:117] "RemoveContainer" containerID="7bbd4c086a0f9298e21482651288ca5bfd257975bc2def6d71106edccedfde33" Apr 20 08:00:20.780836 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:20.780816 2572 scope.go:117] "RemoveContainer" containerID="7bbd4c086a0f9298e21482651288ca5bfd257975bc2def6d71106edccedfde33" Apr 20 08:00:20.781105 ip-10-0-133-161 kubenswrapper[2572]: E0420 08:00:20.781087 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bbd4c086a0f9298e21482651288ca5bfd257975bc2def6d71106edccedfde33\": container with ID starting with 7bbd4c086a0f9298e21482651288ca5bfd257975bc2def6d71106edccedfde33 not found: ID does not exist" containerID="7bbd4c086a0f9298e21482651288ca5bfd257975bc2def6d71106edccedfde33" Apr 20 08:00:20.781191 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:20.781113 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bbd4c086a0f9298e21482651288ca5bfd257975bc2def6d71106edccedfde33"} err="failed to get container status \"7bbd4c086a0f9298e21482651288ca5bfd257975bc2def6d71106edccedfde33\": rpc error: code = NotFound desc = could not find container \"7bbd4c086a0f9298e21482651288ca5bfd257975bc2def6d71106edccedfde33\": container with ID starting with 7bbd4c086a0f9298e21482651288ca5bfd257975bc2def6d71106edccedfde33 not found: ID does not exist" Apr 20 08:00:20.793566 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:20.793536 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd"] Apr 20 08:00:20.797207 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:20.797181 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-txprd"] Apr 20 08:00:20.847764 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:20.847729 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4htk5\" (UniqueName: \"kubernetes.io/projected/c8e884a2-1bec-4439-8d6e-4d6a101a8123-kube-api-access-4htk5\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:00:20.847764 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:20.847760 2572 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/c8e884a2-1bec-4439-8d6e-4d6a101a8123-extensions-socket-volume\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:00:22.646487 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:22.646456 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c8e884a2-1bec-4439-8d6e-4d6a101a8123" path="/var/lib/kubelet/pods/c8e884a2-1bec-4439-8d6e-4d6a101a8123/volumes" Apr 20 08:00:36.553566 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.553467 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg"] Apr 20 08:00:36.554105 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.553870 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8e884a2-1bec-4439-8d6e-4d6a101a8123" containerName="manager" Apr 20 08:00:36.554105 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.553884 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8e884a2-1bec-4439-8d6e-4d6a101a8123" containerName="manager" Apr 20 08:00:36.554105 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.553904 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36381530-e8a0-4b66-87f4-c815e4685fbf" containerName="console" Apr 20 08:00:36.554105 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.553909 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="36381530-e8a0-4b66-87f4-c815e4685fbf" containerName="console" Apr 20 08:00:36.554105 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.553959 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8e884a2-1bec-4439-8d6e-4d6a101a8123" containerName="manager" Apr 20 08:00:36.554105 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.553967 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="36381530-e8a0-4b66-87f4-c815e4685fbf" containerName="console" Apr 20 08:00:36.613208 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.613172 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg"] Apr 20 08:00:36.613366 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.613267 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.615776 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.615758 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-dl8c2\"" Apr 20 08:00:36.685874 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.685832 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/4d018cfd-d796-4b7a-8057-0fab9fde2781-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.686054 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.685881 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/4d018cfd-d796-4b7a-8057-0fab9fde2781-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.686054 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.685961 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/4d018cfd-d796-4b7a-8057-0fab9fde2781-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.686054 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.685987 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/4d018cfd-d796-4b7a-8057-0fab9fde2781-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.686054 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.686014 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4d018cfd-d796-4b7a-8057-0fab9fde2781-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.686365 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.686059 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/4d018cfd-d796-4b7a-8057-0fab9fde2781-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.686365 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.686103 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-459t6\" (UniqueName: 
\"kubernetes.io/projected/4d018cfd-d796-4b7a-8057-0fab9fde2781-kube-api-access-459t6\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.686365 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.686172 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4d018cfd-d796-4b7a-8057-0fab9fde2781-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.686365 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.686240 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/4d018cfd-d796-4b7a-8057-0fab9fde2781-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.787280 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.787238 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-459t6\" (UniqueName: \"kubernetes.io/projected/4d018cfd-d796-4b7a-8057-0fab9fde2781-kube-api-access-459t6\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.787471 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.787297 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4d018cfd-d796-4b7a-8057-0fab9fde2781-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.787471 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.787342 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/4d018cfd-d796-4b7a-8057-0fab9fde2781-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.787471 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.787395 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/4d018cfd-d796-4b7a-8057-0fab9fde2781-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.787471 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.787419 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/4d018cfd-d796-4b7a-8057-0fab9fde2781-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.787663 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.787484 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/4d018cfd-d796-4b7a-8057-0fab9fde2781-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.787663 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.787511 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/4d018cfd-d796-4b7a-8057-0fab9fde2781-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.787663 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.787541 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4d018cfd-d796-4b7a-8057-0fab9fde2781-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.787663 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.787565 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/4d018cfd-d796-4b7a-8057-0fab9fde2781-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.787849 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.787824 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/4d018cfd-d796-4b7a-8057-0fab9fde2781-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.787912 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.787866 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/4d018cfd-d796-4b7a-8057-0fab9fde2781-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.787999 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.787977 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/4d018cfd-d796-4b7a-8057-0fab9fde2781-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.788101 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.788083 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: 
\"kubernetes.io/empty-dir/4d018cfd-d796-4b7a-8057-0fab9fde2781-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.788180 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.788163 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/4d018cfd-d796-4b7a-8057-0fab9fde2781-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.789871 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.789848 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/4d018cfd-d796-4b7a-8057-0fab9fde2781-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.790027 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.790011 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4d018cfd-d796-4b7a-8057-0fab9fde2781-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.800231 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.800202 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-459t6\" (UniqueName: \"kubernetes.io/projected/4d018cfd-d796-4b7a-8057-0fab9fde2781-kube-api-access-459t6\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.800336 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.800203 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4d018cfd-d796-4b7a-8057-0fab9fde2781-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-cvmwg\" (UID: \"4d018cfd-d796-4b7a-8057-0fab9fde2781\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:36.923712 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:36.923635 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:37.049691 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:37.049653 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg"] Apr 20 08:00:37.053046 ip-10-0-133-161 kubenswrapper[2572]: W0420 08:00:37.053018 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d018cfd_d796_4b7a_8057_0fab9fde2781.slice/crio-6595bfae4d96289c81161e37d9ce4b396053d8426af5331c099ac3b736014049 WatchSource:0}: Error finding container 6595bfae4d96289c81161e37d9ce4b396053d8426af5331c099ac3b736014049: Status 404 returned error can't find the container with id 6595bfae4d96289c81161e37d9ce4b396053d8426af5331c099ac3b736014049 Apr 20 08:00:37.055218 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:37.055172 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 20 08:00:37.055290 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:37.055248 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 20 08:00:37.055290 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:37.055278 2572 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236216Ki","pods":"250"} Apr 20 08:00:37.832434 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:37.832395 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" event={"ID":"4d018cfd-d796-4b7a-8057-0fab9fde2781","Type":"ContainerStarted","Data":"ddd3bad70cbeab829e18086f20e9765ae050f72cd8165d21c63c826fb115fffb"} Apr 20 08:00:37.832434 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:37.832435 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" event={"ID":"4d018cfd-d796-4b7a-8057-0fab9fde2781","Type":"ContainerStarted","Data":"6595bfae4d96289c81161e37d9ce4b396053d8426af5331c099ac3b736014049"} Apr 20 08:00:37.851215 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:37.851167 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" podStartSLOduration=1.851152192 podStartE2EDuration="1.851152192s" podCreationTimestamp="2026-04-20 08:00:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 08:00:37.848301316 +0000 UTC m=+621.761931359" watchObservedRunningTime="2026-04-20 08:00:37.851152192 +0000 UTC m=+621.764782229" Apr 20 08:00:37.923942 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:37.923898 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:38.929352 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:38.929322 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:39.838968 ip-10-0-133-161 
kubenswrapper[2572]: I0420 08:00:39.838942 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:39.839978 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:39.839961 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-cvmwg" Apr 20 08:00:40.863749 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:40.863662 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7wwgn"] Apr 20 08:00:41.377522 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.377477 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7wwgn"] Apr 20 08:00:41.377522 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.377518 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7wwgn"] Apr 20 08:00:41.377833 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.377643 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-7wwgn" Apr 20 08:00:41.380197 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.380176 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 20 08:00:41.428289 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.428253 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vh6g\" (UniqueName: \"kubernetes.io/projected/cc5c932f-1946-4794-b647-034fcda4c8c8-kube-api-access-8vh6g\") pod \"limitador-limitador-7d549b5b-7wwgn\" (UID: \"cc5c932f-1946-4794-b647-034fcda4c8c8\") " pod="kuadrant-system/limitador-limitador-7d549b5b-7wwgn" Apr 20 08:00:41.428460 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.428371 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/cc5c932f-1946-4794-b647-034fcda4c8c8-config-file\") pod \"limitador-limitador-7d549b5b-7wwgn\" (UID: \"cc5c932f-1946-4794-b647-034fcda4c8c8\") " pod="kuadrant-system/limitador-limitador-7d549b5b-7wwgn" Apr 20 08:00:41.529586 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.529551 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/cc5c932f-1946-4794-b647-034fcda4c8c8-config-file\") pod \"limitador-limitador-7d549b5b-7wwgn\" (UID: \"cc5c932f-1946-4794-b647-034fcda4c8c8\") " pod="kuadrant-system/limitador-limitador-7d549b5b-7wwgn" Apr 20 08:00:41.529733 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.529625 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vh6g\" (UniqueName: \"kubernetes.io/projected/cc5c932f-1946-4794-b647-034fcda4c8c8-kube-api-access-8vh6g\") pod \"limitador-limitador-7d549b5b-7wwgn\" (UID: \"cc5c932f-1946-4794-b647-034fcda4c8c8\") " pod="kuadrant-system/limitador-limitador-7d549b5b-7wwgn" Apr 20 08:00:41.530217 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.530194 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/cc5c932f-1946-4794-b647-034fcda4c8c8-config-file\") pod \"limitador-limitador-7d549b5b-7wwgn\" (UID: 
\"cc5c932f-1946-4794-b647-034fcda4c8c8\") " pod="kuadrant-system/limitador-limitador-7d549b5b-7wwgn" Apr 20 08:00:41.537023 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.536989 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vh6g\" (UniqueName: \"kubernetes.io/projected/cc5c932f-1946-4794-b647-034fcda4c8c8-kube-api-access-8vh6g\") pod \"limitador-limitador-7d549b5b-7wwgn\" (UID: \"cc5c932f-1946-4794-b647-034fcda4c8c8\") " pod="kuadrant-system/limitador-limitador-7d549b5b-7wwgn" Apr 20 08:00:41.662925 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.662848 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-2f64v"] Apr 20 08:00:41.688464 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.688437 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-7wwgn" Apr 20 08:00:41.707946 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.707906 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-2f64v"] Apr 20 08:00:41.708075 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.708028 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-2f64v" Apr 20 08:00:41.710823 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.710797 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-lzczx\"" Apr 20 08:00:41.731503 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.731469 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb27q\" (UniqueName: \"kubernetes.io/projected/a4dd4324-5857-4864-b658-fcd02f939398-kube-api-access-vb27q\") pod \"authorino-f99f4b5cd-2f64v\" (UID: \"a4dd4324-5857-4864-b658-fcd02f939398\") " pod="kuadrant-system/authorino-f99f4b5cd-2f64v" Apr 20 08:00:41.779796 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.779758 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-424xg"] Apr 20 08:00:41.798375 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.798337 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-424xg"] Apr 20 08:00:41.798524 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.798472 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-424xg" Apr 20 08:00:41.816610 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.816586 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7wwgn"] Apr 20 08:00:41.819009 ip-10-0-133-161 kubenswrapper[2572]: W0420 08:00:41.818982 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc5c932f_1946_4794_b647_034fcda4c8c8.slice/crio-36577f0df8a1d2f826ae57f7725fd92daac77e224619078275f4dae6c1f701c2 WatchSource:0}: Error finding container 36577f0df8a1d2f826ae57f7725fd92daac77e224619078275f4dae6c1f701c2: Status 404 returned error can't find the container with id 36577f0df8a1d2f826ae57f7725fd92daac77e224619078275f4dae6c1f701c2 Apr 20 08:00:41.832421 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.832394 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vb27q\" (UniqueName: \"kubernetes.io/projected/a4dd4324-5857-4864-b658-fcd02f939398-kube-api-access-vb27q\") pod \"authorino-f99f4b5cd-2f64v\" (UID: \"a4dd4324-5857-4864-b658-fcd02f939398\") " pod="kuadrant-system/authorino-f99f4b5cd-2f64v" Apr 20 08:00:41.832509 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.832491 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r6td\" (UniqueName: \"kubernetes.io/projected/5702edf2-5fe5-4868-b122-9999b961b526-kube-api-access-9r6td\") pod \"authorino-7498df8756-424xg\" (UID: \"5702edf2-5fe5-4868-b122-9999b961b526\") " pod="kuadrant-system/authorino-7498df8756-424xg" Apr 20 08:00:41.839975 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.839951 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb27q\" (UniqueName: \"kubernetes.io/projected/a4dd4324-5857-4864-b658-fcd02f939398-kube-api-access-vb27q\") pod \"authorino-f99f4b5cd-2f64v\" (UID: \"a4dd4324-5857-4864-b658-fcd02f939398\") " pod="kuadrant-system/authorino-f99f4b5cd-2f64v" Apr 20 08:00:41.849526 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.849501 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-7wwgn" event={"ID":"cc5c932f-1946-4794-b647-034fcda4c8c8","Type":"ContainerStarted","Data":"36577f0df8a1d2f826ae57f7725fd92daac77e224619078275f4dae6c1f701c2"} Apr 20 08:00:41.933676 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.933582 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9r6td\" (UniqueName: \"kubernetes.io/projected/5702edf2-5fe5-4868-b122-9999b961b526-kube-api-access-9r6td\") pod \"authorino-7498df8756-424xg\" (UID: \"5702edf2-5fe5-4868-b122-9999b961b526\") " pod="kuadrant-system/authorino-7498df8756-424xg" Apr 20 08:00:41.941002 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:41.940978 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r6td\" (UniqueName: \"kubernetes.io/projected/5702edf2-5fe5-4868-b122-9999b961b526-kube-api-access-9r6td\") pod \"authorino-7498df8756-424xg\" (UID: \"5702edf2-5fe5-4868-b122-9999b961b526\") " pod="kuadrant-system/authorino-7498df8756-424xg" Apr 20 08:00:42.022084 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:42.022051 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-2f64v" Apr 20 08:00:42.108363 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:42.108327 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-424xg" Apr 20 08:00:42.149485 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:42.149402 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-2f64v"] Apr 20 08:00:42.152923 ip-10-0-133-161 kubenswrapper[2572]: W0420 08:00:42.152893 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4dd4324_5857_4864_b658_fcd02f939398.slice/crio-551f6127116fdf9d68b42506f100b2308cbfac66b5097e7c250060c18b068033 WatchSource:0}: Error finding container 551f6127116fdf9d68b42506f100b2308cbfac66b5097e7c250060c18b068033: Status 404 returned error can't find the container with id 551f6127116fdf9d68b42506f100b2308cbfac66b5097e7c250060c18b068033 Apr 20 08:00:42.238786 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:42.238685 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-424xg"] Apr 20 08:00:42.240891 ip-10-0-133-161 kubenswrapper[2572]: W0420 08:00:42.240863 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5702edf2_5fe5_4868_b122_9999b961b526.slice/crio-6272104d87c68b8ae16dd75a10164fd14f20eb0790eb69481b30bf785070fded WatchSource:0}: Error finding container 6272104d87c68b8ae16dd75a10164fd14f20eb0790eb69481b30bf785070fded: Status 404 returned error can't find the container with id 6272104d87c68b8ae16dd75a10164fd14f20eb0790eb69481b30bf785070fded Apr 20 08:00:42.855631 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:42.855591 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-424xg" event={"ID":"5702edf2-5fe5-4868-b122-9999b961b526","Type":"ContainerStarted","Data":"6272104d87c68b8ae16dd75a10164fd14f20eb0790eb69481b30bf785070fded"} Apr 20 08:00:42.857106 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:42.857078 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-2f64v" event={"ID":"a4dd4324-5857-4864-b658-fcd02f939398","Type":"ContainerStarted","Data":"551f6127116fdf9d68b42506f100b2308cbfac66b5097e7c250060c18b068033"} Apr 20 08:00:46.879351 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:46.879310 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-424xg" event={"ID":"5702edf2-5fe5-4868-b122-9999b961b526","Type":"ContainerStarted","Data":"ce754e751f9d82ddfd30a9deb185ecdce2a4385ce183ac0c806608f45fa6faf5"} Apr 20 08:00:46.880723 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:46.880688 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-2f64v" event={"ID":"a4dd4324-5857-4864-b658-fcd02f939398","Type":"ContainerStarted","Data":"44172e4ea8f25b5006d62e42b0f4be3be6e1bc3f1495ac3001e312dc37c8df5b"} Apr 20 08:00:46.882011 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:46.881993 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-7wwgn" event={"ID":"cc5c932f-1946-4794-b647-034fcda4c8c8","Type":"ContainerStarted","Data":"16fa085c0c240aa152ee10c02644352de7a0a4cc0d6f803b9f9413a732d87ccd"} Apr 20 08:00:46.882199 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:46.882184 
2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-7wwgn" Apr 20 08:00:46.894044 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:46.893995 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-424xg" podStartSLOduration=2.327590133 podStartE2EDuration="5.893981924s" podCreationTimestamp="2026-04-20 08:00:41 +0000 UTC" firstStartedPulling="2026-04-20 08:00:42.24230578 +0000 UTC m=+626.155935801" lastFinishedPulling="2026-04-20 08:00:45.808697571 +0000 UTC m=+629.722327592" observedRunningTime="2026-04-20 08:00:46.892925104 +0000 UTC m=+630.806555147" watchObservedRunningTime="2026-04-20 08:00:46.893981924 +0000 UTC m=+630.807611968" Apr 20 08:00:46.906462 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:46.906412 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-2f64v" podStartSLOduration=2.184753468 podStartE2EDuration="5.906392913s" podCreationTimestamp="2026-04-20 08:00:41 +0000 UTC" firstStartedPulling="2026-04-20 08:00:42.154275519 +0000 UTC m=+626.067905540" lastFinishedPulling="2026-04-20 08:00:45.875914961 +0000 UTC m=+629.789544985" observedRunningTime="2026-04-20 08:00:46.905754148 +0000 UTC m=+630.819384193" watchObservedRunningTime="2026-04-20 08:00:46.906392913 +0000 UTC m=+630.820022958" Apr 20 08:00:46.922106 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:46.922057 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-7wwgn" podStartSLOduration=2.871698859 podStartE2EDuration="6.922039508s" podCreationTimestamp="2026-04-20 08:00:40 +0000 UTC" firstStartedPulling="2026-04-20 08:00:41.820957709 +0000 UTC m=+625.734587730" lastFinishedPulling="2026-04-20 08:00:45.871298352 +0000 UTC m=+629.784928379" observedRunningTime="2026-04-20 08:00:46.920224887 +0000 UTC m=+630.833854930" watchObservedRunningTime="2026-04-20 08:00:46.922039508 +0000 UTC m=+630.835669551" Apr 20 08:00:46.932869 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:46.932835 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-2f64v"] Apr 20 08:00:48.889776 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:48.889737 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-2f64v" podUID="a4dd4324-5857-4864-b658-fcd02f939398" containerName="authorino" containerID="cri-o://44172e4ea8f25b5006d62e42b0f4be3be6e1bc3f1495ac3001e312dc37c8df5b" gracePeriod=30 Apr 20 08:00:49.133590 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:49.133567 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-2f64v" Apr 20 08:00:49.200771 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:49.200679 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb27q\" (UniqueName: \"kubernetes.io/projected/a4dd4324-5857-4864-b658-fcd02f939398-kube-api-access-vb27q\") pod \"a4dd4324-5857-4864-b658-fcd02f939398\" (UID: \"a4dd4324-5857-4864-b658-fcd02f939398\") " Apr 20 08:00:49.202754 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:49.202727 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4dd4324-5857-4864-b658-fcd02f939398-kube-api-access-vb27q" (OuterVolumeSpecName: "kube-api-access-vb27q") pod "a4dd4324-5857-4864-b658-fcd02f939398" (UID: "a4dd4324-5857-4864-b658-fcd02f939398"). InnerVolumeSpecName "kube-api-access-vb27q". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 08:00:49.301779 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:49.301744 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vb27q\" (UniqueName: \"kubernetes.io/projected/a4dd4324-5857-4864-b658-fcd02f939398-kube-api-access-vb27q\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:00:49.894606 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:49.894573 2572 generic.go:358] "Generic (PLEG): container finished" podID="a4dd4324-5857-4864-b658-fcd02f939398" containerID="44172e4ea8f25b5006d62e42b0f4be3be6e1bc3f1495ac3001e312dc37c8df5b" exitCode=0 Apr 20 08:00:49.895035 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:49.894621 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-2f64v" Apr 20 08:00:49.895035 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:49.894637 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-2f64v" event={"ID":"a4dd4324-5857-4864-b658-fcd02f939398","Type":"ContainerDied","Data":"44172e4ea8f25b5006d62e42b0f4be3be6e1bc3f1495ac3001e312dc37c8df5b"} Apr 20 08:00:49.895035 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:49.894682 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-2f64v" event={"ID":"a4dd4324-5857-4864-b658-fcd02f939398","Type":"ContainerDied","Data":"551f6127116fdf9d68b42506f100b2308cbfac66b5097e7c250060c18b068033"} Apr 20 08:00:49.895035 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:49.894699 2572 scope.go:117] "RemoveContainer" containerID="44172e4ea8f25b5006d62e42b0f4be3be6e1bc3f1495ac3001e312dc37c8df5b" Apr 20 08:00:49.903057 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:49.903040 2572 scope.go:117] "RemoveContainer" containerID="44172e4ea8f25b5006d62e42b0f4be3be6e1bc3f1495ac3001e312dc37c8df5b" Apr 20 08:00:49.903346 ip-10-0-133-161 kubenswrapper[2572]: E0420 08:00:49.903323 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44172e4ea8f25b5006d62e42b0f4be3be6e1bc3f1495ac3001e312dc37c8df5b\": container with ID starting with 44172e4ea8f25b5006d62e42b0f4be3be6e1bc3f1495ac3001e312dc37c8df5b not found: ID does not exist" containerID="44172e4ea8f25b5006d62e42b0f4be3be6e1bc3f1495ac3001e312dc37c8df5b" Apr 20 08:00:49.903436 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:49.903351 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44172e4ea8f25b5006d62e42b0f4be3be6e1bc3f1495ac3001e312dc37c8df5b"} 
err="failed to get container status \"44172e4ea8f25b5006d62e42b0f4be3be6e1bc3f1495ac3001e312dc37c8df5b\": rpc error: code = NotFound desc = could not find container \"44172e4ea8f25b5006d62e42b0f4be3be6e1bc3f1495ac3001e312dc37c8df5b\": container with ID starting with 44172e4ea8f25b5006d62e42b0f4be3be6e1bc3f1495ac3001e312dc37c8df5b not found: ID does not exist" Apr 20 08:00:49.915080 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:49.915053 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-2f64v"] Apr 20 08:00:49.919215 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:49.919191 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-2f64v"] Apr 20 08:00:50.646438 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:50.646404 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4dd4324-5857-4864-b658-fcd02f939398" path="/var/lib/kubelet/pods/a4dd4324-5857-4864-b658-fcd02f939398/volumes" Apr 20 08:00:56.324907 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:56.324869 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7wwgn"] Apr 20 08:00:56.325327 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:56.325129 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-7wwgn" podUID="cc5c932f-1946-4794-b647-034fcda4c8c8" containerName="limitador" containerID="cri-o://16fa085c0c240aa152ee10c02644352de7a0a4cc0d6f803b9f9413a732d87ccd" gracePeriod=30 Apr 20 08:00:56.325824 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:56.325739 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-7wwgn" Apr 20 08:00:56.873985 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:56.873960 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-7wwgn" Apr 20 08:00:56.920648 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:56.920561 2572 generic.go:358] "Generic (PLEG): container finished" podID="cc5c932f-1946-4794-b647-034fcda4c8c8" containerID="16fa085c0c240aa152ee10c02644352de7a0a4cc0d6f803b9f9413a732d87ccd" exitCode=0 Apr 20 08:00:56.920648 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:56.920622 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-7wwgn" Apr 20 08:00:56.920648 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:56.920618 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-7wwgn" event={"ID":"cc5c932f-1946-4794-b647-034fcda4c8c8","Type":"ContainerDied","Data":"16fa085c0c240aa152ee10c02644352de7a0a4cc0d6f803b9f9413a732d87ccd"} Apr 20 08:00:56.920870 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:56.920657 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-7wwgn" event={"ID":"cc5c932f-1946-4794-b647-034fcda4c8c8","Type":"ContainerDied","Data":"36577f0df8a1d2f826ae57f7725fd92daac77e224619078275f4dae6c1f701c2"} Apr 20 08:00:56.920870 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:56.920672 2572 scope.go:117] "RemoveContainer" containerID="16fa085c0c240aa152ee10c02644352de7a0a4cc0d6f803b9f9413a732d87ccd" Apr 20 08:00:56.928826 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:56.928811 2572 scope.go:117] "RemoveContainer" containerID="16fa085c0c240aa152ee10c02644352de7a0a4cc0d6f803b9f9413a732d87ccd" Apr 20 08:00:56.929077 ip-10-0-133-161 kubenswrapper[2572]: E0420 08:00:56.929057 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16fa085c0c240aa152ee10c02644352de7a0a4cc0d6f803b9f9413a732d87ccd\": container with ID starting with 16fa085c0c240aa152ee10c02644352de7a0a4cc0d6f803b9f9413a732d87ccd not found: ID does not exist" containerID="16fa085c0c240aa152ee10c02644352de7a0a4cc0d6f803b9f9413a732d87ccd" Apr 20 08:00:56.929155 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:56.929091 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16fa085c0c240aa152ee10c02644352de7a0a4cc0d6f803b9f9413a732d87ccd"} err="failed to get container status \"16fa085c0c240aa152ee10c02644352de7a0a4cc0d6f803b9f9413a732d87ccd\": rpc error: code = NotFound desc = could not find container \"16fa085c0c240aa152ee10c02644352de7a0a4cc0d6f803b9f9413a732d87ccd\": container with ID starting with 16fa085c0c240aa152ee10c02644352de7a0a4cc0d6f803b9f9413a732d87ccd not found: ID does not exist" Apr 20 08:00:56.964673 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:56.964635 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/cc5c932f-1946-4794-b647-034fcda4c8c8-config-file\") pod \"cc5c932f-1946-4794-b647-034fcda4c8c8\" (UID: \"cc5c932f-1946-4794-b647-034fcda4c8c8\") " Apr 20 08:00:56.964820 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:56.964698 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vh6g\" (UniqueName: \"kubernetes.io/projected/cc5c932f-1946-4794-b647-034fcda4c8c8-kube-api-access-8vh6g\") pod \"cc5c932f-1946-4794-b647-034fcda4c8c8\" (UID: \"cc5c932f-1946-4794-b647-034fcda4c8c8\") " Apr 20 08:00:56.964998 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:56.964971 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc5c932f-1946-4794-b647-034fcda4c8c8-config-file" (OuterVolumeSpecName: "config-file") pod "cc5c932f-1946-4794-b647-034fcda4c8c8" (UID: "cc5c932f-1946-4794-b647-034fcda4c8c8"). InnerVolumeSpecName "config-file". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 08:00:56.966736 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:56.966716 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc5c932f-1946-4794-b647-034fcda4c8c8-kube-api-access-8vh6g" (OuterVolumeSpecName: "kube-api-access-8vh6g") pod "cc5c932f-1946-4794-b647-034fcda4c8c8" (UID: "cc5c932f-1946-4794-b647-034fcda4c8c8"). InnerVolumeSpecName "kube-api-access-8vh6g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 08:00:57.071764 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:57.066656 2572 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/cc5c932f-1946-4794-b647-034fcda4c8c8-config-file\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:00:57.071764 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:57.066695 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8vh6g\" (UniqueName: \"kubernetes.io/projected/cc5c932f-1946-4794-b647-034fcda4c8c8-kube-api-access-8vh6g\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:00:57.242454 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:57.242423 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7wwgn"] Apr 20 08:00:57.245458 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:57.245437 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-7wwgn"] Apr 20 08:00:58.646486 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:00:58.646443 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc5c932f-1946-4794-b647-034fcda4c8c8" path="/var/lib/kubelet/pods/cc5c932f-1946-4794-b647-034fcda4c8c8/volumes" Apr 20 08:01:01.990638 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:01.990604 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-h7nm2"] Apr 20 08:01:01.991018 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:01.990957 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cc5c932f-1946-4794-b647-034fcda4c8c8" containerName="limitador" Apr 20 08:01:01.991018 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:01.990968 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc5c932f-1946-4794-b647-034fcda4c8c8" containerName="limitador" Apr 20 08:01:01.991018 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:01.990985 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a4dd4324-5857-4864-b658-fcd02f939398" containerName="authorino" Apr 20 08:01:01.991018 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:01.990991 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4dd4324-5857-4864-b658-fcd02f939398" containerName="authorino" Apr 20 08:01:01.991186 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:01.991068 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="a4dd4324-5857-4864-b658-fcd02f939398" containerName="authorino" Apr 20 08:01:01.991186 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:01.991077 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="cc5c932f-1946-4794-b647-034fcda4c8c8" containerName="limitador" Apr 20 08:01:01.995267 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:01.995248 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-h7nm2" Apr 20 08:01:01.998660 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:01.998462 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 20 08:01:01.999073 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:01.999050 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-df248\"" Apr 20 08:01:02.000752 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:02.000725 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-h7nm2"] Apr 20 08:01:02.114387 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:02.114352 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnvj5\" (UniqueName: \"kubernetes.io/projected/0a17bf98-d8da-42ea-a485-2468e701bd28-kube-api-access-fnvj5\") pod \"postgres-868db5846d-h7nm2\" (UID: \"0a17bf98-d8da-42ea-a485-2468e701bd28\") " pod="opendatahub/postgres-868db5846d-h7nm2" Apr 20 08:01:02.114605 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:02.114511 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0a17bf98-d8da-42ea-a485-2468e701bd28-data\") pod \"postgres-868db5846d-h7nm2\" (UID: \"0a17bf98-d8da-42ea-a485-2468e701bd28\") " pod="opendatahub/postgres-868db5846d-h7nm2" Apr 20 08:01:02.215501 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:02.215467 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnvj5\" (UniqueName: \"kubernetes.io/projected/0a17bf98-d8da-42ea-a485-2468e701bd28-kube-api-access-fnvj5\") pod \"postgres-868db5846d-h7nm2\" (UID: \"0a17bf98-d8da-42ea-a485-2468e701bd28\") " pod="opendatahub/postgres-868db5846d-h7nm2" Apr 20 08:01:02.215697 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:02.215603 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0a17bf98-d8da-42ea-a485-2468e701bd28-data\") pod \"postgres-868db5846d-h7nm2\" (UID: \"0a17bf98-d8da-42ea-a485-2468e701bd28\") " pod="opendatahub/postgres-868db5846d-h7nm2" Apr 20 08:01:02.215997 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:02.215975 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0a17bf98-d8da-42ea-a485-2468e701bd28-data\") pod \"postgres-868db5846d-h7nm2\" (UID: \"0a17bf98-d8da-42ea-a485-2468e701bd28\") " pod="opendatahub/postgres-868db5846d-h7nm2" Apr 20 08:01:02.223557 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:02.223530 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnvj5\" (UniqueName: \"kubernetes.io/projected/0a17bf98-d8da-42ea-a485-2468e701bd28-kube-api-access-fnvj5\") pod \"postgres-868db5846d-h7nm2\" (UID: \"0a17bf98-d8da-42ea-a485-2468e701bd28\") " pod="opendatahub/postgres-868db5846d-h7nm2" Apr 20 08:01:02.308428 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:02.308331 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-h7nm2" Apr 20 08:01:02.649630 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:02.649605 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-h7nm2"] Apr 20 08:01:02.651497 ip-10-0-133-161 kubenswrapper[2572]: W0420 08:01:02.651468 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a17bf98_d8da_42ea_a485_2468e701bd28.slice/crio-198b0c57923b691d965ab195e9801b248715c8cd01e3b804a714c91a9f453f19 WatchSource:0}: Error finding container 198b0c57923b691d965ab195e9801b248715c8cd01e3b804a714c91a9f453f19: Status 404 returned error can't find the container with id 198b0c57923b691d965ab195e9801b248715c8cd01e3b804a714c91a9f453f19 Apr 20 08:01:02.652870 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:02.652853 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 08:01:02.943801 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:02.943716 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-h7nm2" event={"ID":"0a17bf98-d8da-42ea-a485-2468e701bd28","Type":"ContainerStarted","Data":"198b0c57923b691d965ab195e9801b248715c8cd01e3b804a714c91a9f453f19"} Apr 20 08:01:07.968113 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:07.968074 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-h7nm2" event={"ID":"0a17bf98-d8da-42ea-a485-2468e701bd28","Type":"ContainerStarted","Data":"a6c9ffe5d253546a04fd02ecf4beaab6a81108178180a9ccf3d0fc3f74d6333c"} Apr 20 08:01:07.968606 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:07.968200 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-h7nm2" Apr 20 08:01:07.984691 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:07.984637 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-h7nm2" podStartSLOduration=2.213645635 podStartE2EDuration="6.984622765s" podCreationTimestamp="2026-04-20 08:01:01 +0000 UTC" firstStartedPulling="2026-04-20 08:01:02.652975716 +0000 UTC m=+646.566605737" lastFinishedPulling="2026-04-20 08:01:07.423952846 +0000 UTC m=+651.337582867" observedRunningTime="2026-04-20 08:01:07.982673254 +0000 UTC m=+651.896303297" watchObservedRunningTime="2026-04-20 08:01:07.984622765 +0000 UTC m=+651.898252807" Apr 20 08:01:14.000470 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.000441 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-h7nm2" Apr 20 08:01:14.524151 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.524113 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-c4tfj"] Apr 20 08:01:14.530114 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.530090 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-c4tfj" Apr 20 08:01:14.533636 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.533612 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-c4tfj"] Apr 20 08:01:14.544174 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.544127 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss5m5\" (UniqueName: \"kubernetes.io/projected/e494a913-88e4-4717-a6cb-c2dc6a9a665f-kube-api-access-ss5m5\") pod \"authorino-8b475cf9f-c4tfj\" (UID: \"e494a913-88e4-4717-a6cb-c2dc6a9a665f\") " pod="kuadrant-system/authorino-8b475cf9f-c4tfj" Apr 20 08:01:14.644780 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.644750 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ss5m5\" (UniqueName: \"kubernetes.io/projected/e494a913-88e4-4717-a6cb-c2dc6a9a665f-kube-api-access-ss5m5\") pod \"authorino-8b475cf9f-c4tfj\" (UID: \"e494a913-88e4-4717-a6cb-c2dc6a9a665f\") " pod="kuadrant-system/authorino-8b475cf9f-c4tfj" Apr 20 08:01:14.652639 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.652611 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss5m5\" (UniqueName: \"kubernetes.io/projected/e494a913-88e4-4717-a6cb-c2dc6a9a665f-kube-api-access-ss5m5\") pod \"authorino-8b475cf9f-c4tfj\" (UID: \"e494a913-88e4-4717-a6cb-c2dc6a9a665f\") " pod="kuadrant-system/authorino-8b475cf9f-c4tfj" Apr 20 08:01:14.758664 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.758624 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-c4tfj"] Apr 20 08:01:14.758895 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.758884 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-c4tfj" Apr 20 08:01:14.784186 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.784104 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-6866f9c7f8-6nt7s"] Apr 20 08:01:14.789520 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.789499 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6866f9c7f8-6nt7s" Apr 20 08:01:14.794352 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.794328 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6866f9c7f8-6nt7s"] Apr 20 08:01:14.846149 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.846104 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptc8h\" (UniqueName: \"kubernetes.io/projected/19ce5c94-ae1c-4044-b222-1c786f255b38-kube-api-access-ptc8h\") pod \"authorino-6866f9c7f8-6nt7s\" (UID: \"19ce5c94-ae1c-4044-b222-1c786f255b38\") " pod="kuadrant-system/authorino-6866f9c7f8-6nt7s" Apr 20 08:01:14.875290 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.875259 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6866f9c7f8-6nt7s"] Apr 20 08:01:14.875598 ip-10-0-133-161 kubenswrapper[2572]: E0420 08:01:14.875572 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-ptc8h], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-6866f9c7f8-6nt7s" podUID="19ce5c94-ae1c-4044-b222-1c786f255b38" Apr 20 08:01:14.887325 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.887294 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-c4tfj"] Apr 20 08:01:14.890519 ip-10-0-133-161 kubenswrapper[2572]: W0420 08:01:14.890496 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode494a913_88e4_4717_a6cb_c2dc6a9a665f.slice/crio-9334a0aefaa610e5489bbdf0ad9c759c50f08cbe62d641706037bfd94f8e5423 WatchSource:0}: Error finding container 9334a0aefaa610e5489bbdf0ad9c759c50f08cbe62d641706037bfd94f8e5423: Status 404 returned error can't find the container with id 9334a0aefaa610e5489bbdf0ad9c759c50f08cbe62d641706037bfd94f8e5423 Apr 20 08:01:14.904760 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.904727 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-6c74bc5c86-btw69"] Apr 20 08:01:14.908642 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.908625 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6c74bc5c86-btw69" Apr 20 08:01:14.911125 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.911096 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 20 08:01:14.916365 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.916344 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6c74bc5c86-btw69"] Apr 20 08:01:14.946664 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.946635 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptc8h\" (UniqueName: \"kubernetes.io/projected/19ce5c94-ae1c-4044-b222-1c786f255b38-kube-api-access-ptc8h\") pod \"authorino-6866f9c7f8-6nt7s\" (UID: \"19ce5c94-ae1c-4044-b222-1c786f255b38\") " pod="kuadrant-system/authorino-6866f9c7f8-6nt7s" Apr 20 08:01:14.946801 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.946691 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8f35128c-80a7-42fb-af31-c1c4dafe8044-tls-cert\") pod \"authorino-6c74bc5c86-btw69\" (UID: \"8f35128c-80a7-42fb-af31-c1c4dafe8044\") " pod="kuadrant-system/authorino-6c74bc5c86-btw69" Apr 20 08:01:14.946801 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.946720 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5qnv\" (UniqueName: \"kubernetes.io/projected/8f35128c-80a7-42fb-af31-c1c4dafe8044-kube-api-access-b5qnv\") pod \"authorino-6c74bc5c86-btw69\" (UID: \"8f35128c-80a7-42fb-af31-c1c4dafe8044\") " pod="kuadrant-system/authorino-6c74bc5c86-btw69" Apr 20 08:01:14.954117 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.954092 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptc8h\" (UniqueName: \"kubernetes.io/projected/19ce5c94-ae1c-4044-b222-1c786f255b38-kube-api-access-ptc8h\") pod \"authorino-6866f9c7f8-6nt7s\" (UID: \"19ce5c94-ae1c-4044-b222-1c786f255b38\") " pod="kuadrant-system/authorino-6866f9c7f8-6nt7s" Apr 20 08:01:14.994243 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.994208 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-c4tfj" event={"ID":"e494a913-88e4-4717-a6cb-c2dc6a9a665f","Type":"ContainerStarted","Data":"9334a0aefaa610e5489bbdf0ad9c759c50f08cbe62d641706037bfd94f8e5423"} Apr 20 08:01:14.994399 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.994288 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-6866f9c7f8-6nt7s" Apr 20 08:01:14.999491 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:14.999468 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6866f9c7f8-6nt7s" Apr 20 08:01:15.047203 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:15.047103 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ptc8h\" (UniqueName: \"kubernetes.io/projected/19ce5c94-ae1c-4044-b222-1c786f255b38-kube-api-access-ptc8h\") pod \"19ce5c94-ae1c-4044-b222-1c786f255b38\" (UID: \"19ce5c94-ae1c-4044-b222-1c786f255b38\") " Apr 20 08:01:15.047629 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:15.047293 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8f35128c-80a7-42fb-af31-c1c4dafe8044-tls-cert\") pod \"authorino-6c74bc5c86-btw69\" (UID: \"8f35128c-80a7-42fb-af31-c1c4dafe8044\") " pod="kuadrant-system/authorino-6c74bc5c86-btw69" Apr 20 08:01:15.047629 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:15.047341 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b5qnv\" (UniqueName: \"kubernetes.io/projected/8f35128c-80a7-42fb-af31-c1c4dafe8044-kube-api-access-b5qnv\") pod \"authorino-6c74bc5c86-btw69\" (UID: \"8f35128c-80a7-42fb-af31-c1c4dafe8044\") " pod="kuadrant-system/authorino-6c74bc5c86-btw69" Apr 20 08:01:15.049265 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:15.049244 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19ce5c94-ae1c-4044-b222-1c786f255b38-kube-api-access-ptc8h" (OuterVolumeSpecName: "kube-api-access-ptc8h") pod "19ce5c94-ae1c-4044-b222-1c786f255b38" (UID: "19ce5c94-ae1c-4044-b222-1c786f255b38"). InnerVolumeSpecName "kube-api-access-ptc8h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 08:01:15.049602 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:15.049585 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8f35128c-80a7-42fb-af31-c1c4dafe8044-tls-cert\") pod \"authorino-6c74bc5c86-btw69\" (UID: \"8f35128c-80a7-42fb-af31-c1c4dafe8044\") " pod="kuadrant-system/authorino-6c74bc5c86-btw69" Apr 20 08:01:15.054995 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:15.054972 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5qnv\" (UniqueName: \"kubernetes.io/projected/8f35128c-80a7-42fb-af31-c1c4dafe8044-kube-api-access-b5qnv\") pod \"authorino-6c74bc5c86-btw69\" (UID: \"8f35128c-80a7-42fb-af31-c1c4dafe8044\") " pod="kuadrant-system/authorino-6c74bc5c86-btw69" Apr 20 08:01:15.147988 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:15.147953 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ptc8h\" (UniqueName: \"kubernetes.io/projected/19ce5c94-ae1c-4044-b222-1c786f255b38-kube-api-access-ptc8h\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:01:15.219017 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:15.218984 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6c74bc5c86-btw69" Apr 20 08:01:15.337655 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:15.337631 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-6c74bc5c86-btw69"] Apr 20 08:01:15.339412 ip-10-0-133-161 kubenswrapper[2572]: W0420 08:01:15.339384 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f35128c_80a7_42fb_af31_c1c4dafe8044.slice/crio-2f013dd2c020471618f4fea383aba726fc15fedee601909dd419a1e32837ce74 WatchSource:0}: Error finding container 2f013dd2c020471618f4fea383aba726fc15fedee601909dd419a1e32837ce74: Status 404 returned error can't find the container with id 2f013dd2c020471618f4fea383aba726fc15fedee601909dd419a1e32837ce74 Apr 20 08:01:15.999757 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:15.999707 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6c74bc5c86-btw69" event={"ID":"8f35128c-80a7-42fb-af31-c1c4dafe8044","Type":"ContainerStarted","Data":"f3244b6c243e33385d7cac29adf942ffd607477431b5fa7299292a5fe0475617"} Apr 20 08:01:15.999940 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:15.999765 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6c74bc5c86-btw69" event={"ID":"8f35128c-80a7-42fb-af31-c1c4dafe8044","Type":"ContainerStarted","Data":"2f013dd2c020471618f4fea383aba726fc15fedee601909dd419a1e32837ce74"} Apr 20 08:01:16.001108 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.001079 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-c4tfj" event={"ID":"e494a913-88e4-4717-a6cb-c2dc6a9a665f","Type":"ContainerStarted","Data":"a472cd07e578bd609c8e5da141ab2c1706f52b2f56bb7d48fa40068dcf98b42a"} Apr 20 08:01:16.001261 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.001115 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6866f9c7f8-6nt7s" Apr 20 08:01:16.001261 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.001132 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-c4tfj" podUID="e494a913-88e4-4717-a6cb-c2dc6a9a665f" containerName="authorino" containerID="cri-o://a472cd07e578bd609c8e5da141ab2c1706f52b2f56bb7d48fa40068dcf98b42a" gracePeriod=30 Apr 20 08:01:16.014076 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.014029 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-6c74bc5c86-btw69" podStartSLOduration=1.5476169450000001 podStartE2EDuration="2.014016784s" podCreationTimestamp="2026-04-20 08:01:14 +0000 UTC" firstStartedPulling="2026-04-20 08:01:15.340712856 +0000 UTC m=+659.254342877" lastFinishedPulling="2026-04-20 08:01:15.807112695 +0000 UTC m=+659.720742716" observedRunningTime="2026-04-20 08:01:16.0138972 +0000 UTC m=+659.927527244" watchObservedRunningTime="2026-04-20 08:01:16.014016784 +0000 UTC m=+659.927646827" Apr 20 08:01:16.028305 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.028237 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-c4tfj" podStartSLOduration=1.435713096 podStartE2EDuration="2.028216598s" podCreationTimestamp="2026-04-20 08:01:14 +0000 UTC" firstStartedPulling="2026-04-20 08:01:14.891749978 +0000 UTC m=+658.805379999" lastFinishedPulling="2026-04-20 08:01:15.484253467 +0000 UTC m=+659.397883501" observedRunningTime="2026-04-20 08:01:16.026176786 +0000 UTC m=+659.939806829" watchObservedRunningTime="2026-04-20 08:01:16.028216598 +0000 UTC m=+659.941846644" Apr 20 08:01:16.045884 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.045845 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-424xg"] Apr 20 08:01:16.046157 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.046100 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-424xg" podUID="5702edf2-5fe5-4868-b122-9999b961b526" containerName="authorino" containerID="cri-o://ce754e751f9d82ddfd30a9deb185ecdce2a4385ce183ac0c806608f45fa6faf5" gracePeriod=30 Apr 20 08:01:16.057755 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.057727 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6866f9c7f8-6nt7s"] Apr 20 08:01:16.062998 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.062967 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-6866f9c7f8-6nt7s"] Apr 20 08:01:16.308645 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.308621 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-c4tfj" Apr 20 08:01:16.313300 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.313282 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-424xg" Apr 20 08:01:16.459862 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.459773 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r6td\" (UniqueName: \"kubernetes.io/projected/5702edf2-5fe5-4868-b122-9999b961b526-kube-api-access-9r6td\") pod \"5702edf2-5fe5-4868-b122-9999b961b526\" (UID: \"5702edf2-5fe5-4868-b122-9999b961b526\") " Apr 20 08:01:16.459862 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.459832 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss5m5\" (UniqueName: \"kubernetes.io/projected/e494a913-88e4-4717-a6cb-c2dc6a9a665f-kube-api-access-ss5m5\") pod \"e494a913-88e4-4717-a6cb-c2dc6a9a665f\" (UID: \"e494a913-88e4-4717-a6cb-c2dc6a9a665f\") " Apr 20 08:01:16.461917 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.461892 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5702edf2-5fe5-4868-b122-9999b961b526-kube-api-access-9r6td" (OuterVolumeSpecName: "kube-api-access-9r6td") pod "5702edf2-5fe5-4868-b122-9999b961b526" (UID: "5702edf2-5fe5-4868-b122-9999b961b526"). InnerVolumeSpecName "kube-api-access-9r6td". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 08:01:16.461917 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.461902 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e494a913-88e4-4717-a6cb-c2dc6a9a665f-kube-api-access-ss5m5" (OuterVolumeSpecName: "kube-api-access-ss5m5") pod "e494a913-88e4-4717-a6cb-c2dc6a9a665f" (UID: "e494a913-88e4-4717-a6cb-c2dc6a9a665f"). InnerVolumeSpecName "kube-api-access-ss5m5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 08:01:16.560577 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.560550 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ss5m5\" (UniqueName: \"kubernetes.io/projected/e494a913-88e4-4717-a6cb-c2dc6a9a665f-kube-api-access-ss5m5\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:01:16.560577 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.560576 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9r6td\" (UniqueName: \"kubernetes.io/projected/5702edf2-5fe5-4868-b122-9999b961b526-kube-api-access-9r6td\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:01:16.647385 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.647355 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19ce5c94-ae1c-4044-b222-1c786f255b38" path="/var/lib/kubelet/pods/19ce5c94-ae1c-4044-b222-1c786f255b38/volumes" Apr 20 08:01:16.817989 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.817951 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-x276k"] Apr 20 08:01:16.818431 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.818412 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e494a913-88e4-4717-a6cb-c2dc6a9a665f" containerName="authorino" Apr 20 08:01:16.818522 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.818433 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="e494a913-88e4-4717-a6cb-c2dc6a9a665f" containerName="authorino" Apr 20 08:01:16.818522 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.818475 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="5702edf2-5fe5-4868-b122-9999b961b526" containerName="authorino" Apr 20 08:01:16.818522 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.818485 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="5702edf2-5fe5-4868-b122-9999b961b526" containerName="authorino" Apr 20 08:01:16.818673 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.818579 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="5702edf2-5fe5-4868-b122-9999b961b526" containerName="authorino" Apr 20 08:01:16.818673 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.818592 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="e494a913-88e4-4717-a6cb-c2dc6a9a665f" containerName="authorino" Apr 20 08:01:16.907361 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.907323 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-x276k"] Apr 20 08:01:16.907556 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.907480 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-x276k" Apr 20 08:01:16.910500 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.910474 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-nrd72\"" Apr 20 08:01:16.957687 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.957655 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-986bffdc5-vnbgx"] Apr 20 08:01:16.966372 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.966343 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-986bffdc5-vnbgx" Apr 20 08:01:16.969885 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:16.969857 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-986bffdc5-vnbgx"] Apr 20 08:01:17.005451 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.005421 2572 generic.go:358] "Generic (PLEG): container finished" podID="5702edf2-5fe5-4868-b122-9999b961b526" containerID="ce754e751f9d82ddfd30a9deb185ecdce2a4385ce183ac0c806608f45fa6faf5" exitCode=0 Apr 20 08:01:17.005607 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.005498 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-424xg" Apr 20 08:01:17.005607 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.005503 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-424xg" event={"ID":"5702edf2-5fe5-4868-b122-9999b961b526","Type":"ContainerDied","Data":"ce754e751f9d82ddfd30a9deb185ecdce2a4385ce183ac0c806608f45fa6faf5"} Apr 20 08:01:17.005607 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.005539 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-424xg" event={"ID":"5702edf2-5fe5-4868-b122-9999b961b526","Type":"ContainerDied","Data":"6272104d87c68b8ae16dd75a10164fd14f20eb0790eb69481b30bf785070fded"} Apr 20 08:01:17.005607 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.005557 2572 scope.go:117] "RemoveContainer" containerID="ce754e751f9d82ddfd30a9deb185ecdce2a4385ce183ac0c806608f45fa6faf5" Apr 20 08:01:17.006718 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.006685 2572 generic.go:358] "Generic (PLEG): container finished" podID="e494a913-88e4-4717-a6cb-c2dc6a9a665f" containerID="a472cd07e578bd609c8e5da141ab2c1706f52b2f56bb7d48fa40068dcf98b42a" exitCode=0 Apr 20 08:01:17.006803 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.006729 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-c4tfj" event={"ID":"e494a913-88e4-4717-a6cb-c2dc6a9a665f","Type":"ContainerDied","Data":"a472cd07e578bd609c8e5da141ab2c1706f52b2f56bb7d48fa40068dcf98b42a"} Apr 20 08:01:17.006803 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.006761 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-c4tfj" event={"ID":"e494a913-88e4-4717-a6cb-c2dc6a9a665f","Type":"ContainerDied","Data":"9334a0aefaa610e5489bbdf0ad9c759c50f08cbe62d641706037bfd94f8e5423"} Apr 20 08:01:17.006803 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.006770 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-c4tfj" Apr 20 08:01:17.013956 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.013942 2572 scope.go:117] "RemoveContainer" containerID="ce754e751f9d82ddfd30a9deb185ecdce2a4385ce183ac0c806608f45fa6faf5" Apr 20 08:01:17.014209 ip-10-0-133-161 kubenswrapper[2572]: E0420 08:01:17.014190 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce754e751f9d82ddfd30a9deb185ecdce2a4385ce183ac0c806608f45fa6faf5\": container with ID starting with ce754e751f9d82ddfd30a9deb185ecdce2a4385ce183ac0c806608f45fa6faf5 not found: ID does not exist" containerID="ce754e751f9d82ddfd30a9deb185ecdce2a4385ce183ac0c806608f45fa6faf5" Apr 20 08:01:17.014287 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.014214 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce754e751f9d82ddfd30a9deb185ecdce2a4385ce183ac0c806608f45fa6faf5"} err="failed to get container status \"ce754e751f9d82ddfd30a9deb185ecdce2a4385ce183ac0c806608f45fa6faf5\": rpc error: code = NotFound desc = could not find container \"ce754e751f9d82ddfd30a9deb185ecdce2a4385ce183ac0c806608f45fa6faf5\": container with ID starting with ce754e751f9d82ddfd30a9deb185ecdce2a4385ce183ac0c806608f45fa6faf5 not found: ID does not exist" Apr 20 08:01:17.014287 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.014231 2572 scope.go:117] "RemoveContainer" containerID="a472cd07e578bd609c8e5da141ab2c1706f52b2f56bb7d48fa40068dcf98b42a" Apr 20 08:01:17.021199 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.021122 2572 scope.go:117] "RemoveContainer" containerID="a472cd07e578bd609c8e5da141ab2c1706f52b2f56bb7d48fa40068dcf98b42a" Apr 20 08:01:17.021391 ip-10-0-133-161 kubenswrapper[2572]: E0420 08:01:17.021375 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a472cd07e578bd609c8e5da141ab2c1706f52b2f56bb7d48fa40068dcf98b42a\": container with ID starting with a472cd07e578bd609c8e5da141ab2c1706f52b2f56bb7d48fa40068dcf98b42a not found: ID does not exist" containerID="a472cd07e578bd609c8e5da141ab2c1706f52b2f56bb7d48fa40068dcf98b42a" Apr 20 08:01:17.021442 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.021397 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a472cd07e578bd609c8e5da141ab2c1706f52b2f56bb7d48fa40068dcf98b42a"} err="failed to get container status \"a472cd07e578bd609c8e5da141ab2c1706f52b2f56bb7d48fa40068dcf98b42a\": rpc error: code = NotFound desc = could not find container \"a472cd07e578bd609c8e5da141ab2c1706f52b2f56bb7d48fa40068dcf98b42a\": container with ID starting with a472cd07e578bd609c8e5da141ab2c1706f52b2f56bb7d48fa40068dcf98b42a not found: ID does not exist" Apr 20 08:01:17.024237 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.024216 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-424xg"] Apr 20 08:01:17.028000 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.027982 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-424xg"] Apr 20 08:01:17.037278 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.037244 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-c4tfj"] Apr 20 08:01:17.039609 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.039591 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kuadrant-system/authorino-8b475cf9f-c4tfj"] Apr 20 08:01:17.065786 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.065762 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cckw\" (UniqueName: \"kubernetes.io/projected/2d86c04d-3d5b-47ca-a0df-581cdbc5eb36-kube-api-access-9cckw\") pod \"maas-controller-6d4c8f55f9-x276k\" (UID: \"2d86c04d-3d5b-47ca-a0df-581cdbc5eb36\") " pod="opendatahub/maas-controller-6d4c8f55f9-x276k" Apr 20 08:01:17.066077 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.065879 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqkxs\" (UniqueName: \"kubernetes.io/projected/8cf038c2-95a7-44b4-b9e7-aa1d370386ac-kube-api-access-bqkxs\") pod \"maas-controller-986bffdc5-vnbgx\" (UID: \"8cf038c2-95a7-44b4-b9e7-aa1d370386ac\") " pod="opendatahub/maas-controller-986bffdc5-vnbgx" Apr 20 08:01:17.083713 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.083645 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-x276k"] Apr 20 08:01:17.083878 ip-10-0-133-161 kubenswrapper[2572]: E0420 08:01:17.083859 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-9cckw], unattached volumes=[], failed to process volumes=[]: context canceled" pod="opendatahub/maas-controller-6d4c8f55f9-x276k" podUID="2d86c04d-3d5b-47ca-a0df-581cdbc5eb36" Apr 20 08:01:17.112290 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.112258 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-9bc4dd454-jl5hf"] Apr 20 08:01:17.159135 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.159088 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-9bc4dd454-jl5hf"] Apr 20 08:01:17.159316 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.159284 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-9bc4dd454-jl5hf" Apr 20 08:01:17.167657 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.167628 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqkxs\" (UniqueName: \"kubernetes.io/projected/8cf038c2-95a7-44b4-b9e7-aa1d370386ac-kube-api-access-bqkxs\") pod \"maas-controller-986bffdc5-vnbgx\" (UID: \"8cf038c2-95a7-44b4-b9e7-aa1d370386ac\") " pod="opendatahub/maas-controller-986bffdc5-vnbgx" Apr 20 08:01:17.167809 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.167691 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cckw\" (UniqueName: \"kubernetes.io/projected/2d86c04d-3d5b-47ca-a0df-581cdbc5eb36-kube-api-access-9cckw\") pod \"maas-controller-6d4c8f55f9-x276k\" (UID: \"2d86c04d-3d5b-47ca-a0df-581cdbc5eb36\") " pod="opendatahub/maas-controller-6d4c8f55f9-x276k" Apr 20 08:01:17.178894 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.178867 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqkxs\" (UniqueName: \"kubernetes.io/projected/8cf038c2-95a7-44b4-b9e7-aa1d370386ac-kube-api-access-bqkxs\") pod \"maas-controller-986bffdc5-vnbgx\" (UID: \"8cf038c2-95a7-44b4-b9e7-aa1d370386ac\") " pod="opendatahub/maas-controller-986bffdc5-vnbgx" Apr 20 08:01:17.179000 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.178867 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cckw\" (UniqueName: \"kubernetes.io/projected/2d86c04d-3d5b-47ca-a0df-581cdbc5eb36-kube-api-access-9cckw\") pod \"maas-controller-6d4c8f55f9-x276k\" (UID: \"2d86c04d-3d5b-47ca-a0df-581cdbc5eb36\") " pod="opendatahub/maas-controller-6d4c8f55f9-x276k" Apr 20 08:01:17.268282 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.268246 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x25bz\" (UniqueName: \"kubernetes.io/projected/fae7b42c-035a-4791-bbec-db48e6223c0f-kube-api-access-x25bz\") pod \"maas-controller-9bc4dd454-jl5hf\" (UID: \"fae7b42c-035a-4791-bbec-db48e6223c0f\") " pod="opendatahub/maas-controller-9bc4dd454-jl5hf" Apr 20 08:01:17.278194 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.278170 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-986bffdc5-vnbgx" Apr 20 08:01:17.369894 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.369837 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x25bz\" (UniqueName: \"kubernetes.io/projected/fae7b42c-035a-4791-bbec-db48e6223c0f-kube-api-access-x25bz\") pod \"maas-controller-9bc4dd454-jl5hf\" (UID: \"fae7b42c-035a-4791-bbec-db48e6223c0f\") " pod="opendatahub/maas-controller-9bc4dd454-jl5hf" Apr 20 08:01:17.378033 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.378003 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x25bz\" (UniqueName: \"kubernetes.io/projected/fae7b42c-035a-4791-bbec-db48e6223c0f-kube-api-access-x25bz\") pod \"maas-controller-9bc4dd454-jl5hf\" (UID: \"fae7b42c-035a-4791-bbec-db48e6223c0f\") " pod="opendatahub/maas-controller-9bc4dd454-jl5hf" Apr 20 08:01:17.397256 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.397214 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-986bffdc5-vnbgx"] Apr 20 08:01:17.399270 ip-10-0-133-161 kubenswrapper[2572]: W0420 08:01:17.399236 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cf038c2_95a7_44b4_b9e7_aa1d370386ac.slice/crio-1f7b66bda8ade65ecd50973fae85cf05f6c0a04e96065a50fc5c39de26488527 WatchSource:0}: Error finding container 1f7b66bda8ade65ecd50973fae85cf05f6c0a04e96065a50fc5c39de26488527: Status 404 returned error can't find the container with id 1f7b66bda8ade65ecd50973fae85cf05f6c0a04e96065a50fc5c39de26488527 Apr 20 08:01:17.469612 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.469579 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-9bc4dd454-jl5hf" Apr 20 08:01:17.592268 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:17.592239 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-9bc4dd454-jl5hf"] Apr 20 08:01:17.594897 ip-10-0-133-161 kubenswrapper[2572]: W0420 08:01:17.594874 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfae7b42c_035a_4791_bbec_db48e6223c0f.slice/crio-f36e77e1fe23d741d8ff04ffb87a9d203b8fbc1232a4039afbb97cd6096f4410 WatchSource:0}: Error finding container f36e77e1fe23d741d8ff04ffb87a9d203b8fbc1232a4039afbb97cd6096f4410: Status 404 returned error can't find the container with id f36e77e1fe23d741d8ff04ffb87a9d203b8fbc1232a4039afbb97cd6096f4410 Apr 20 08:01:18.014518 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:18.014457 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-9bc4dd454-jl5hf" event={"ID":"fae7b42c-035a-4791-bbec-db48e6223c0f","Type":"ContainerStarted","Data":"f36e77e1fe23d741d8ff04ffb87a9d203b8fbc1232a4039afbb97cd6096f4410"} Apr 20 08:01:18.016194 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:18.016167 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-986bffdc5-vnbgx" event={"ID":"8cf038c2-95a7-44b4-b9e7-aa1d370386ac","Type":"ContainerStarted","Data":"1f7b66bda8ade65ecd50973fae85cf05f6c0a04e96065a50fc5c39de26488527"} Apr 20 08:01:18.016313 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:18.016250 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-x276k" Apr 20 08:01:18.021231 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:18.021212 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-x276k" Apr 20 08:01:18.178537 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:18.178499 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cckw\" (UniqueName: \"kubernetes.io/projected/2d86c04d-3d5b-47ca-a0df-581cdbc5eb36-kube-api-access-9cckw\") pod \"2d86c04d-3d5b-47ca-a0df-581cdbc5eb36\" (UID: \"2d86c04d-3d5b-47ca-a0df-581cdbc5eb36\") " Apr 20 08:01:18.181228 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:18.181179 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d86c04d-3d5b-47ca-a0df-581cdbc5eb36-kube-api-access-9cckw" (OuterVolumeSpecName: "kube-api-access-9cckw") pod "2d86c04d-3d5b-47ca-a0df-581cdbc5eb36" (UID: "2d86c04d-3d5b-47ca-a0df-581cdbc5eb36"). InnerVolumeSpecName "kube-api-access-9cckw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 08:01:18.279897 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:18.279668 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9cckw\" (UniqueName: \"kubernetes.io/projected/2d86c04d-3d5b-47ca-a0df-581cdbc5eb36-kube-api-access-9cckw\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:01:18.649905 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:18.649802 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5702edf2-5fe5-4868-b122-9999b961b526" path="/var/lib/kubelet/pods/5702edf2-5fe5-4868-b122-9999b961b526/volumes" Apr 20 08:01:18.650382 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:18.650357 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e494a913-88e4-4717-a6cb-c2dc6a9a665f" path="/var/lib/kubelet/pods/e494a913-88e4-4717-a6cb-c2dc6a9a665f/volumes" Apr 20 08:01:19.021067 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:19.021037 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-x276k" Apr 20 08:01:19.050251 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:19.050192 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-x276k"] Apr 20 08:01:19.053080 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:19.053041 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-x276k"] Apr 20 08:01:20.658359 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:20.658260 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d86c04d-3d5b-47ca-a0df-581cdbc5eb36" path="/var/lib/kubelet/pods/2d86c04d-3d5b-47ca-a0df-581cdbc5eb36/volumes" Apr 20 08:01:21.030858 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:21.030816 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-9bc4dd454-jl5hf" event={"ID":"fae7b42c-035a-4791-bbec-db48e6223c0f","Type":"ContainerStarted","Data":"d7e1a8dc36575f99ae75b3d061a87024dc3c8e0126653c981b6ccc5123d88174"} Apr 20 08:01:21.031035 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:21.030925 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-9bc4dd454-jl5hf" Apr 20 08:01:21.032271 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:21.032240 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-986bffdc5-vnbgx" event={"ID":"8cf038c2-95a7-44b4-b9e7-aa1d370386ac","Type":"ContainerStarted","Data":"49efd7aa452db91b0a08c5a96869a4ad20b81dc9c7b7db6595091fa5202215e5"} Apr 20 08:01:21.032411 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:21.032367 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-986bffdc5-vnbgx" Apr 20 08:01:21.049372 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:21.049323 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-9bc4dd454-jl5hf" podStartSLOduration=1.352245836 podStartE2EDuration="4.049309907s" podCreationTimestamp="2026-04-20 08:01:17 +0000 UTC" firstStartedPulling="2026-04-20 08:01:17.596293448 +0000 UTC m=+661.509923468" lastFinishedPulling="2026-04-20 08:01:20.293357501 +0000 UTC m=+664.206987539" observedRunningTime="2026-04-20 08:01:21.046971147 +0000 UTC m=+664.960601183" watchObservedRunningTime="2026-04-20 08:01:21.049309907 +0000 UTC m=+664.962939998" Apr 20 08:01:21.062586 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:21.062542 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-986bffdc5-vnbgx" podStartSLOduration=2.176595756 podStartE2EDuration="5.062527596s" podCreationTimestamp="2026-04-20 08:01:16 +0000 UTC" firstStartedPulling="2026-04-20 08:01:17.400609093 +0000 UTC m=+661.314239114" lastFinishedPulling="2026-04-20 08:01:20.286540922 +0000 UTC m=+664.200170954" observedRunningTime="2026-04-20 08:01:21.062013318 +0000 UTC m=+664.975643362" watchObservedRunningTime="2026-04-20 08:01:21.062527596 +0000 UTC m=+664.976157640" Apr 20 08:01:22.613746 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:22.613716 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-5bdf69bf7c-jjqn5"] Apr 20 08:01:22.617420 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:22.617403 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-5bdf69bf7c-jjqn5" Apr 20 08:01:22.619729 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:22.619711 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\"" Apr 20 08:01:22.619842 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:22.619712 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\"" Apr 20 08:01:22.627829 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:22.627806 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5bdf69bf7c-jjqn5"] Apr 20 08:01:22.718975 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:22.718935 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8g8h\" (UniqueName: \"kubernetes.io/projected/98803190-fd71-4b81-8fe2-dd174cec1668-kube-api-access-n8g8h\") pod \"maas-api-5bdf69bf7c-jjqn5\" (UID: \"98803190-fd71-4b81-8fe2-dd174cec1668\") " pod="opendatahub/maas-api-5bdf69bf7c-jjqn5" Apr 20 08:01:22.718975 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:22.718975 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/98803190-fd71-4b81-8fe2-dd174cec1668-maas-api-tls\") pod \"maas-api-5bdf69bf7c-jjqn5\" (UID: \"98803190-fd71-4b81-8fe2-dd174cec1668\") " pod="opendatahub/maas-api-5bdf69bf7c-jjqn5" Apr 20 08:01:22.819685 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:22.819648 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8g8h\" (UniqueName: \"kubernetes.io/projected/98803190-fd71-4b81-8fe2-dd174cec1668-kube-api-access-n8g8h\") pod \"maas-api-5bdf69bf7c-jjqn5\" (UID: \"98803190-fd71-4b81-8fe2-dd174cec1668\") " pod="opendatahub/maas-api-5bdf69bf7c-jjqn5" Apr 20 08:01:22.819685 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:22.819690 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/98803190-fd71-4b81-8fe2-dd174cec1668-maas-api-tls\") pod \"maas-api-5bdf69bf7c-jjqn5\" (UID: \"98803190-fd71-4b81-8fe2-dd174cec1668\") " pod="opendatahub/maas-api-5bdf69bf7c-jjqn5" Apr 20 08:01:22.819898 ip-10-0-133-161 kubenswrapper[2572]: E0420 08:01:22.819839 2572 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found Apr 20 08:01:22.819936 ip-10-0-133-161 kubenswrapper[2572]: E0420 08:01:22.819901 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98803190-fd71-4b81-8fe2-dd174cec1668-maas-api-tls podName:98803190-fd71-4b81-8fe2-dd174cec1668 nodeName:}" failed. No retries permitted until 2026-04-20 08:01:23.31988285 +0000 UTC m=+667.233512891 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/98803190-fd71-4b81-8fe2-dd174cec1668-maas-api-tls") pod "maas-api-5bdf69bf7c-jjqn5" (UID: "98803190-fd71-4b81-8fe2-dd174cec1668") : secret "maas-api-serving-cert" not found Apr 20 08:01:22.828837 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:22.828807 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8g8h\" (UniqueName: \"kubernetes.io/projected/98803190-fd71-4b81-8fe2-dd174cec1668-kube-api-access-n8g8h\") pod \"maas-api-5bdf69bf7c-jjqn5\" (UID: \"98803190-fd71-4b81-8fe2-dd174cec1668\") " pod="opendatahub/maas-api-5bdf69bf7c-jjqn5" Apr 20 08:01:23.324980 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:23.324945 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/98803190-fd71-4b81-8fe2-dd174cec1668-maas-api-tls\") pod \"maas-api-5bdf69bf7c-jjqn5\" (UID: \"98803190-fd71-4b81-8fe2-dd174cec1668\") " pod="opendatahub/maas-api-5bdf69bf7c-jjqn5" Apr 20 08:01:23.327506 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:23.327477 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/98803190-fd71-4b81-8fe2-dd174cec1668-maas-api-tls\") pod \"maas-api-5bdf69bf7c-jjqn5\" (UID: \"98803190-fd71-4b81-8fe2-dd174cec1668\") " pod="opendatahub/maas-api-5bdf69bf7c-jjqn5" Apr 20 08:01:23.528415 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:23.528374 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-5bdf69bf7c-jjqn5" Apr 20 08:01:23.659436 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:23.659393 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-5bdf69bf7c-jjqn5"] Apr 20 08:01:23.662268 ip-10-0-133-161 kubenswrapper[2572]: W0420 08:01:23.662235 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98803190_fd71_4b81_8fe2_dd174cec1668.slice/crio-5c3657f41e37d45b60a3f587e691bdc0e86e70208a25eba42bcc660af9613b24 WatchSource:0}: Error finding container 5c3657f41e37d45b60a3f587e691bdc0e86e70208a25eba42bcc660af9613b24: Status 404 returned error can't find the container with id 5c3657f41e37d45b60a3f587e691bdc0e86e70208a25eba42bcc660af9613b24 Apr 20 08:01:24.044102 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:24.044068 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5bdf69bf7c-jjqn5" event={"ID":"98803190-fd71-4b81-8fe2-dd174cec1668","Type":"ContainerStarted","Data":"5c3657f41e37d45b60a3f587e691bdc0e86e70208a25eba42bcc660af9613b24"} Apr 20 08:01:26.065562 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:26.065521 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5bdf69bf7c-jjqn5" event={"ID":"98803190-fd71-4b81-8fe2-dd174cec1668","Type":"ContainerStarted","Data":"8d1cc3938177270a5e612635db09b407cc7958416298fa6b8f7e07c838e2be0d"} Apr 20 08:01:26.065918 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:26.065589 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-5bdf69bf7c-jjqn5" Apr 20 08:01:26.081829 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:26.081781 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-5bdf69bf7c-jjqn5" podStartSLOduration=2.562589747 podStartE2EDuration="4.081767193s" 
podCreationTimestamp="2026-04-20 08:01:22 +0000 UTC" firstStartedPulling="2026-04-20 08:01:23.663474106 +0000 UTC m=+667.577104126" lastFinishedPulling="2026-04-20 08:01:25.182651538 +0000 UTC m=+669.096281572" observedRunningTime="2026-04-20 08:01:26.080474202 +0000 UTC m=+669.994104245" watchObservedRunningTime="2026-04-20 08:01:26.081767193 +0000 UTC m=+669.995397286" Apr 20 08:01:32.042391 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.042357 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-986bffdc5-vnbgx" Apr 20 08:01:32.042758 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.042417 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-9bc4dd454-jl5hf" Apr 20 08:01:32.073634 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.073606 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-5bdf69bf7c-jjqn5" Apr 20 08:01:32.093222 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.093188 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-986bffdc5-vnbgx"] Apr 20 08:01:32.093412 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.093376 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-986bffdc5-vnbgx" podUID="8cf038c2-95a7-44b4-b9e7-aa1d370386ac" containerName="manager" containerID="cri-o://49efd7aa452db91b0a08c5a96869a4ad20b81dc9c7b7db6595091fa5202215e5" gracePeriod=10 Apr 20 08:01:32.332390 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.332362 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-986bffdc5-vnbgx" Apr 20 08:01:32.386430 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.386396 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-76b95bd89-9vk2b"] Apr 20 08:01:32.386821 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.386801 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8cf038c2-95a7-44b4-b9e7-aa1d370386ac" containerName="manager" Apr 20 08:01:32.386821 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.386818 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cf038c2-95a7-44b4-b9e7-aa1d370386ac" containerName="manager" Apr 20 08:01:32.387009 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.386888 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8cf038c2-95a7-44b4-b9e7-aa1d370386ac" containerName="manager" Apr 20 08:01:32.390030 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.390012 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-76b95bd89-9vk2b" Apr 20 08:01:32.402326 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.402298 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-76b95bd89-9vk2b"] Apr 20 08:01:32.431902 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.431873 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqkxs\" (UniqueName: \"kubernetes.io/projected/8cf038c2-95a7-44b4-b9e7-aa1d370386ac-kube-api-access-bqkxs\") pod \"8cf038c2-95a7-44b4-b9e7-aa1d370386ac\" (UID: \"8cf038c2-95a7-44b4-b9e7-aa1d370386ac\") " Apr 20 08:01:32.432053 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.432031 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5f6z\" (UniqueName: \"kubernetes.io/projected/ed2b8d52-e1df-4543-8789-a216f6335f45-kube-api-access-s5f6z\") pod \"maas-controller-76b95bd89-9vk2b\" (UID: \"ed2b8d52-e1df-4543-8789-a216f6335f45\") " pod="opendatahub/maas-controller-76b95bd89-9vk2b" Apr 20 08:01:32.434091 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.434063 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cf038c2-95a7-44b4-b9e7-aa1d370386ac-kube-api-access-bqkxs" (OuterVolumeSpecName: "kube-api-access-bqkxs") pod "8cf038c2-95a7-44b4-b9e7-aa1d370386ac" (UID: "8cf038c2-95a7-44b4-b9e7-aa1d370386ac"). InnerVolumeSpecName "kube-api-access-bqkxs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 08:01:32.533114 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.533084 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5f6z\" (UniqueName: \"kubernetes.io/projected/ed2b8d52-e1df-4543-8789-a216f6335f45-kube-api-access-s5f6z\") pod \"maas-controller-76b95bd89-9vk2b\" (UID: \"ed2b8d52-e1df-4543-8789-a216f6335f45\") " pod="opendatahub/maas-controller-76b95bd89-9vk2b" Apr 20 08:01:32.533282 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.533181 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bqkxs\" (UniqueName: \"kubernetes.io/projected/8cf038c2-95a7-44b4-b9e7-aa1d370386ac-kube-api-access-bqkxs\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:01:32.541333 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.541311 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5f6z\" (UniqueName: \"kubernetes.io/projected/ed2b8d52-e1df-4543-8789-a216f6335f45-kube-api-access-s5f6z\") pod \"maas-controller-76b95bd89-9vk2b\" (UID: \"ed2b8d52-e1df-4543-8789-a216f6335f45\") " pod="opendatahub/maas-controller-76b95bd89-9vk2b" Apr 20 08:01:32.701311 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.701277 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-76b95bd89-9vk2b" Apr 20 08:01:32.827194 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:32.827172 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-76b95bd89-9vk2b"] Apr 20 08:01:32.829610 ip-10-0-133-161 kubenswrapper[2572]: W0420 08:01:32.829583 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded2b8d52_e1df_4543_8789_a216f6335f45.slice/crio-0e6d6a5306c1de09a063bd53a337626905afb75d6cc0e17a6ac9b920da68df1f WatchSource:0}: Error finding container 0e6d6a5306c1de09a063bd53a337626905afb75d6cc0e17a6ac9b920da68df1f: Status 404 returned error can't find the container with id 0e6d6a5306c1de09a063bd53a337626905afb75d6cc0e17a6ac9b920da68df1f Apr 20 08:01:33.090695 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:33.090652 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-76b95bd89-9vk2b" event={"ID":"ed2b8d52-e1df-4543-8789-a216f6335f45","Type":"ContainerStarted","Data":"0e6d6a5306c1de09a063bd53a337626905afb75d6cc0e17a6ac9b920da68df1f"} Apr 20 08:01:33.091678 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:33.091655 2572 generic.go:358] "Generic (PLEG): container finished" podID="8cf038c2-95a7-44b4-b9e7-aa1d370386ac" containerID="49efd7aa452db91b0a08c5a96869a4ad20b81dc9c7b7db6595091fa5202215e5" exitCode=0 Apr 20 08:01:33.091761 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:33.091689 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-986bffdc5-vnbgx" event={"ID":"8cf038c2-95a7-44b4-b9e7-aa1d370386ac","Type":"ContainerDied","Data":"49efd7aa452db91b0a08c5a96869a4ad20b81dc9c7b7db6595091fa5202215e5"} Apr 20 08:01:33.091761 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:33.091708 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-986bffdc5-vnbgx" event={"ID":"8cf038c2-95a7-44b4-b9e7-aa1d370386ac","Type":"ContainerDied","Data":"1f7b66bda8ade65ecd50973fae85cf05f6c0a04e96065a50fc5c39de26488527"} Apr 20 08:01:33.091761 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:33.091715 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-986bffdc5-vnbgx" Apr 20 08:01:33.091761 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:33.091723 2572 scope.go:117] "RemoveContainer" containerID="49efd7aa452db91b0a08c5a96869a4ad20b81dc9c7b7db6595091fa5202215e5" Apr 20 08:01:33.099991 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:33.099817 2572 scope.go:117] "RemoveContainer" containerID="49efd7aa452db91b0a08c5a96869a4ad20b81dc9c7b7db6595091fa5202215e5" Apr 20 08:01:33.100068 ip-10-0-133-161 kubenswrapper[2572]: E0420 08:01:33.100049 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49efd7aa452db91b0a08c5a96869a4ad20b81dc9c7b7db6595091fa5202215e5\": container with ID starting with 49efd7aa452db91b0a08c5a96869a4ad20b81dc9c7b7db6595091fa5202215e5 not found: ID does not exist" containerID="49efd7aa452db91b0a08c5a96869a4ad20b81dc9c7b7db6595091fa5202215e5" Apr 20 08:01:33.100110 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:33.100076 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49efd7aa452db91b0a08c5a96869a4ad20b81dc9c7b7db6595091fa5202215e5"} err="failed to get container status \"49efd7aa452db91b0a08c5a96869a4ad20b81dc9c7b7db6595091fa5202215e5\": rpc error: code = NotFound desc = could not find container \"49efd7aa452db91b0a08c5a96869a4ad20b81dc9c7b7db6595091fa5202215e5\": container with ID starting with 49efd7aa452db91b0a08c5a96869a4ad20b81dc9c7b7db6595091fa5202215e5 not found: ID does not exist" Apr 20 08:01:33.107704 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:33.107680 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-986bffdc5-vnbgx"] Apr 20 08:01:33.111183 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:33.111163 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-986bffdc5-vnbgx"] Apr 20 08:01:34.097647 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:34.097613 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-76b95bd89-9vk2b" event={"ID":"ed2b8d52-e1df-4543-8789-a216f6335f45","Type":"ContainerStarted","Data":"bc645dcdc955353a2108a2964bc1b4c0a18978cbca36ad13224e0b7b90723e5e"} Apr 20 08:01:34.098042 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:34.097723 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-76b95bd89-9vk2b" Apr 20 08:01:34.113300 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:34.113257 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-76b95bd89-9vk2b" podStartSLOduration=1.7983915179999999 podStartE2EDuration="2.113243399s" podCreationTimestamp="2026-04-20 08:01:32 +0000 UTC" firstStartedPulling="2026-04-20 08:01:32.83102255 +0000 UTC m=+676.744652570" lastFinishedPulling="2026-04-20 08:01:33.14587443 +0000 UTC m=+677.059504451" observedRunningTime="2026-04-20 08:01:34.11168896 +0000 UTC m=+678.025319004" watchObservedRunningTime="2026-04-20 08:01:34.113243399 +0000 UTC m=+678.026873441" Apr 20 08:01:34.645877 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:34.645840 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cf038c2-95a7-44b4-b9e7-aa1d370386ac" path="/var/lib/kubelet/pods/8cf038c2-95a7-44b4-b9e7-aa1d370386ac/volumes" Apr 20 08:01:45.107300 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:45.107265 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="opendatahub/maas-controller-76b95bd89-9vk2b" Apr 20 08:01:45.145757 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:45.145729 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-9bc4dd454-jl5hf"] Apr 20 08:01:45.145974 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:45.145954 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-9bc4dd454-jl5hf" podUID="fae7b42c-035a-4791-bbec-db48e6223c0f" containerName="manager" containerID="cri-o://d7e1a8dc36575f99ae75b3d061a87024dc3c8e0126653c981b6ccc5123d88174" gracePeriod=10 Apr 20 08:01:45.392397 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:45.392375 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-9bc4dd454-jl5hf" Apr 20 08:01:45.453419 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:45.453389 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x25bz\" (UniqueName: \"kubernetes.io/projected/fae7b42c-035a-4791-bbec-db48e6223c0f-kube-api-access-x25bz\") pod \"fae7b42c-035a-4791-bbec-db48e6223c0f\" (UID: \"fae7b42c-035a-4791-bbec-db48e6223c0f\") " Apr 20 08:01:45.455404 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:45.455377 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fae7b42c-035a-4791-bbec-db48e6223c0f-kube-api-access-x25bz" (OuterVolumeSpecName: "kube-api-access-x25bz") pod "fae7b42c-035a-4791-bbec-db48e6223c0f" (UID: "fae7b42c-035a-4791-bbec-db48e6223c0f"). InnerVolumeSpecName "kube-api-access-x25bz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 08:01:45.554378 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:45.554344 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x25bz\" (UniqueName: \"kubernetes.io/projected/fae7b42c-035a-4791-bbec-db48e6223c0f-kube-api-access-x25bz\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:01:46.141035 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:46.140997 2572 generic.go:358] "Generic (PLEG): container finished" podID="fae7b42c-035a-4791-bbec-db48e6223c0f" containerID="d7e1a8dc36575f99ae75b3d061a87024dc3c8e0126653c981b6ccc5123d88174" exitCode=0 Apr 20 08:01:46.141503 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:46.141089 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-9bc4dd454-jl5hf" Apr 20 08:01:46.141503 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:46.141086 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-9bc4dd454-jl5hf" event={"ID":"fae7b42c-035a-4791-bbec-db48e6223c0f","Type":"ContainerDied","Data":"d7e1a8dc36575f99ae75b3d061a87024dc3c8e0126653c981b6ccc5123d88174"} Apr 20 08:01:46.141503 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:46.141132 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-9bc4dd454-jl5hf" event={"ID":"fae7b42c-035a-4791-bbec-db48e6223c0f","Type":"ContainerDied","Data":"f36e77e1fe23d741d8ff04ffb87a9d203b8fbc1232a4039afbb97cd6096f4410"} Apr 20 08:01:46.141503 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:46.141176 2572 scope.go:117] "RemoveContainer" containerID="d7e1a8dc36575f99ae75b3d061a87024dc3c8e0126653c981b6ccc5123d88174" Apr 20 08:01:46.149921 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:46.149716 2572 scope.go:117] "RemoveContainer" containerID="d7e1a8dc36575f99ae75b3d061a87024dc3c8e0126653c981b6ccc5123d88174" Apr 20 08:01:46.149985 ip-10-0-133-161 kubenswrapper[2572]: E0420 08:01:46.149965 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7e1a8dc36575f99ae75b3d061a87024dc3c8e0126653c981b6ccc5123d88174\": container with ID starting with d7e1a8dc36575f99ae75b3d061a87024dc3c8e0126653c981b6ccc5123d88174 not found: ID does not exist" containerID="d7e1a8dc36575f99ae75b3d061a87024dc3c8e0126653c981b6ccc5123d88174" Apr 20 08:01:46.150026 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:46.149995 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7e1a8dc36575f99ae75b3d061a87024dc3c8e0126653c981b6ccc5123d88174"} err="failed to get container status \"d7e1a8dc36575f99ae75b3d061a87024dc3c8e0126653c981b6ccc5123d88174\": rpc error: code = NotFound desc = could not find container \"d7e1a8dc36575f99ae75b3d061a87024dc3c8e0126653c981b6ccc5123d88174\": container with ID starting with d7e1a8dc36575f99ae75b3d061a87024dc3c8e0126653c981b6ccc5123d88174 not found: ID does not exist" Apr 20 08:01:46.162154 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:46.162114 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-9bc4dd454-jl5hf"] Apr 20 08:01:46.165611 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:46.165590 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-9bc4dd454-jl5hf"] Apr 20 08:01:46.646299 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:01:46.646267 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fae7b42c-035a-4791-bbec-db48e6223c0f" path="/var/lib/kubelet/pods/fae7b42c-035a-4791-bbec-db48e6223c0f/volumes" Apr 20 08:02:05.495507 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.495412 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn"] Apr 20 08:02:05.496005 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.495984 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fae7b42c-035a-4791-bbec-db48e6223c0f" containerName="manager" Apr 20 08:02:05.496083 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.496009 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="fae7b42c-035a-4791-bbec-db48e6223c0f" containerName="manager" Apr 20 08:02:05.496193 ip-10-0-133-161 
kubenswrapper[2572]: I0420 08:02:05.496179 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="fae7b42c-035a-4791-bbec-db48e6223c0f" containerName="manager" Apr 20 08:02:05.499592 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.499571 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.503558 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.503529 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 08:02:05.503686 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.503573 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-5swfc\"" Apr 20 08:02:05.503686 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.503582 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 08:02:05.503686 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.503587 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 20 08:02:05.507059 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.506793 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn"] Apr 20 08:02:05.638272 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.638236 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b6b112fb-b08e-47c0-ba76-2746093563d4-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hglfn\" (UID: \"b6b112fb-b08e-47c0-ba76-2746093563d4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.638453 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.638329 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6b112fb-b08e-47c0-ba76-2746093563d4-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hglfn\" (UID: \"b6b112fb-b08e-47c0-ba76-2746093563d4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.638453 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.638372 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6b112fb-b08e-47c0-ba76-2746093563d4-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hglfn\" (UID: \"b6b112fb-b08e-47c0-ba76-2746093563d4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.638453 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.638393 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck8z8\" (UniqueName: \"kubernetes.io/projected/b6b112fb-b08e-47c0-ba76-2746093563d4-kube-api-access-ck8z8\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hglfn\" (UID: \"b6b112fb-b08e-47c0-ba76-2746093563d4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.638616 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.638551 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b6b112fb-b08e-47c0-ba76-2746093563d4-dshm\") pod 
\"e2e-trlp-test-simulated-kserve-84db68679b-hglfn\" (UID: \"b6b112fb-b08e-47c0-ba76-2746093563d4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.638616 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.638588 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b112fb-b08e-47c0-ba76-2746093563d4-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hglfn\" (UID: \"b6b112fb-b08e-47c0-ba76-2746093563d4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.739988 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.739934 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6b112fb-b08e-47c0-ba76-2746093563d4-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hglfn\" (UID: \"b6b112fb-b08e-47c0-ba76-2746093563d4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.740196 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.740018 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6b112fb-b08e-47c0-ba76-2746093563d4-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hglfn\" (UID: \"b6b112fb-b08e-47c0-ba76-2746093563d4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.740196 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.740049 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ck8z8\" (UniqueName: \"kubernetes.io/projected/b6b112fb-b08e-47c0-ba76-2746093563d4-kube-api-access-ck8z8\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hglfn\" (UID: \"b6b112fb-b08e-47c0-ba76-2746093563d4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.740196 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.740116 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b6b112fb-b08e-47c0-ba76-2746093563d4-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hglfn\" (UID: \"b6b112fb-b08e-47c0-ba76-2746093563d4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.740196 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.740158 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b112fb-b08e-47c0-ba76-2746093563d4-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hglfn\" (UID: \"b6b112fb-b08e-47c0-ba76-2746093563d4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.740504 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.740222 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b6b112fb-b08e-47c0-ba76-2746093563d4-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hglfn\" (UID: \"b6b112fb-b08e-47c0-ba76-2746093563d4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.740504 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.740411 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b6b112fb-b08e-47c0-ba76-2746093563d4-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hglfn\" (UID: 
\"b6b112fb-b08e-47c0-ba76-2746093563d4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.740504 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.740424 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b6b112fb-b08e-47c0-ba76-2746093563d4-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hglfn\" (UID: \"b6b112fb-b08e-47c0-ba76-2746093563d4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.740504 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.740486 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b6b112fb-b08e-47c0-ba76-2746093563d4-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hglfn\" (UID: \"b6b112fb-b08e-47c0-ba76-2746093563d4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.742721 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.742688 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b6b112fb-b08e-47c0-ba76-2746093563d4-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hglfn\" (UID: \"b6b112fb-b08e-47c0-ba76-2746093563d4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.742980 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.742963 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b6b112fb-b08e-47c0-ba76-2746093563d4-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hglfn\" (UID: \"b6b112fb-b08e-47c0-ba76-2746093563d4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.748030 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.747973 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck8z8\" (UniqueName: \"kubernetes.io/projected/b6b112fb-b08e-47c0-ba76-2746093563d4-kube-api-access-ck8z8\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-hglfn\" (UID: \"b6b112fb-b08e-47c0-ba76-2746093563d4\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.810794 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.810758 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:05.944909 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:05.944739 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn"] Apr 20 08:02:05.948389 ip-10-0-133-161 kubenswrapper[2572]: W0420 08:02:05.948307 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6b112fb_b08e_47c0_ba76_2746093563d4.slice/crio-ca89e99c6cfbb9aa8c214a6b5d4f9ca7ca4578e17ffd16ccea7b62f5c3528f40 WatchSource:0}: Error finding container ca89e99c6cfbb9aa8c214a6b5d4f9ca7ca4578e17ffd16ccea7b62f5c3528f40: Status 404 returned error can't find the container with id ca89e99c6cfbb9aa8c214a6b5d4f9ca7ca4578e17ffd16ccea7b62f5c3528f40 Apr 20 08:02:06.214855 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:06.214765 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" event={"ID":"b6b112fb-b08e-47c0-ba76-2746093563d4","Type":"ContainerStarted","Data":"ca89e99c6cfbb9aa8c214a6b5d4f9ca7ca4578e17ffd16ccea7b62f5c3528f40"} Apr 20 08:02:13.247285 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:13.247242 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" event={"ID":"b6b112fb-b08e-47c0-ba76-2746093563d4","Type":"ContainerStarted","Data":"6c35ca761924581eef3150d56ee0dfdf60630654aed1203eb4f64d140a997935"} Apr 20 08:02:15.917870 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:15.917834 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b"] Apr 20 08:02:15.921288 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:15.921267 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:15.924429 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:15.924410 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 20 08:02:15.932520 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:15.932494 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b"] Apr 20 08:02:16.041622 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.041579 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e0aabb5-4108-4518-841f-8d8c4a9ddaf1-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b\" (UID: \"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:16.041821 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.041693 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6e0aabb5-4108-4518-841f-8d8c4a9ddaf1-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b\" (UID: \"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:16.041821 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.041784 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6e0aabb5-4108-4518-841f-8d8c4a9ddaf1-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b\" (UID: \"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:16.041900 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.041869 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6e0aabb5-4108-4518-841f-8d8c4a9ddaf1-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b\" (UID: \"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:16.041939 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.041924 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml54w\" (UniqueName: \"kubernetes.io/projected/6e0aabb5-4108-4518-841f-8d8c4a9ddaf1-kube-api-access-ml54w\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b\" (UID: \"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:16.041977 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.041967 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6e0aabb5-4108-4518-841f-8d8c4a9ddaf1-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b\" (UID: \"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:16.143055 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.143019 
2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6e0aabb5-4108-4518-841f-8d8c4a9ddaf1-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b\" (UID: \"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:16.143304 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.143081 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6e0aabb5-4108-4518-841f-8d8c4a9ddaf1-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b\" (UID: \"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:16.143304 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.143121 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ml54w\" (UniqueName: \"kubernetes.io/projected/6e0aabb5-4108-4518-841f-8d8c4a9ddaf1-kube-api-access-ml54w\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b\" (UID: \"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:16.143304 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.143192 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6e0aabb5-4108-4518-841f-8d8c4a9ddaf1-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b\" (UID: \"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:16.143304 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.143257 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e0aabb5-4108-4518-841f-8d8c4a9ddaf1-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b\" (UID: \"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:16.143515 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.143321 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6e0aabb5-4108-4518-841f-8d8c4a9ddaf1-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b\" (UID: \"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:16.143665 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.143632 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/6e0aabb5-4108-4518-841f-8d8c4a9ddaf1-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b\" (UID: \"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:16.143801 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.143693 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/6e0aabb5-4108-4518-841f-8d8c4a9ddaf1-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b\" (UID: \"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:16.143801 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.143778 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e0aabb5-4108-4518-841f-8d8c4a9ddaf1-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b\" (UID: \"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:16.145531 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.145511 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/6e0aabb5-4108-4518-841f-8d8c4a9ddaf1-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b\" (UID: \"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:16.145784 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.145766 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6e0aabb5-4108-4518-841f-8d8c4a9ddaf1-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b\" (UID: \"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:16.150572 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.150545 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml54w\" (UniqueName: \"kubernetes.io/projected/6e0aabb5-4108-4518-841f-8d8c4a9ddaf1-kube-api-access-ml54w\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b\" (UID: \"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:16.233327 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.233295 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:16.368078 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.368046 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b"] Apr 20 08:02:16.368889 ip-10-0-133-161 kubenswrapper[2572]: W0420 08:02:16.368853 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e0aabb5_4108_4518_841f_8d8c4a9ddaf1.slice/crio-e14fa33b72e365de476f0cd457709ecdd14c8e5383461d978b7e6ebc99caa4dc WatchSource:0}: Error finding container e14fa33b72e365de476f0cd457709ecdd14c8e5383461d978b7e6ebc99caa4dc: Status 404 returned error can't find the container with id e14fa33b72e365de476f0cd457709ecdd14c8e5383461d978b7e6ebc99caa4dc Apr 20 08:02:16.765681 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.765647 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-5bdf69bf7c-jjqn5"] Apr 20 08:02:16.766036 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:16.765985 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-5bdf69bf7c-jjqn5" podUID="98803190-fd71-4b81-8fe2-dd174cec1668" containerName="maas-api" containerID="cri-o://8d1cc3938177270a5e612635db09b407cc7958416298fa6b8f7e07c838e2be0d" gracePeriod=30 Apr 20 08:02:17.037353 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:17.037321 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-5bdf69bf7c-jjqn5" Apr 20 08:02:17.155151 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:17.155105 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8g8h\" (UniqueName: \"kubernetes.io/projected/98803190-fd71-4b81-8fe2-dd174cec1668-kube-api-access-n8g8h\") pod \"98803190-fd71-4b81-8fe2-dd174cec1668\" (UID: \"98803190-fd71-4b81-8fe2-dd174cec1668\") " Apr 20 08:02:17.155269 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:17.155216 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/98803190-fd71-4b81-8fe2-dd174cec1668-maas-api-tls\") pod \"98803190-fd71-4b81-8fe2-dd174cec1668\" (UID: \"98803190-fd71-4b81-8fe2-dd174cec1668\") " Apr 20 08:02:17.157439 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:17.157408 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98803190-fd71-4b81-8fe2-dd174cec1668-kube-api-access-n8g8h" (OuterVolumeSpecName: "kube-api-access-n8g8h") pod "98803190-fd71-4b81-8fe2-dd174cec1668" (UID: "98803190-fd71-4b81-8fe2-dd174cec1668"). InnerVolumeSpecName "kube-api-access-n8g8h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 08:02:17.157551 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:17.157416 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98803190-fd71-4b81-8fe2-dd174cec1668-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "98803190-fd71-4b81-8fe2-dd174cec1668" (UID: "98803190-fd71-4b81-8fe2-dd174cec1668"). InnerVolumeSpecName "maas-api-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 08:02:17.256517 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:17.256476 2572 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/98803190-fd71-4b81-8fe2-dd174cec1668-maas-api-tls\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:02:17.256517 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:17.256510 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n8g8h\" (UniqueName: \"kubernetes.io/projected/98803190-fd71-4b81-8fe2-dd174cec1668-kube-api-access-n8g8h\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:02:17.264933 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:17.264899 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" event={"ID":"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1","Type":"ContainerStarted","Data":"5d6de4aa50484888aeb0927fabc3d91c25ded012c1a49bd003222c411aa20b35"} Apr 20 08:02:17.264933 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:17.264937 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" event={"ID":"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1","Type":"ContainerStarted","Data":"e14fa33b72e365de476f0cd457709ecdd14c8e5383461d978b7e6ebc99caa4dc"} Apr 20 08:02:17.266060 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:17.266033 2572 generic.go:358] "Generic (PLEG): container finished" podID="98803190-fd71-4b81-8fe2-dd174cec1668" containerID="8d1cc3938177270a5e612635db09b407cc7958416298fa6b8f7e07c838e2be0d" exitCode=0 Apr 20 08:02:17.266218 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:17.266088 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-5bdf69bf7c-jjqn5" Apr 20 08:02:17.266218 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:17.266090 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5bdf69bf7c-jjqn5" event={"ID":"98803190-fd71-4b81-8fe2-dd174cec1668","Type":"ContainerDied","Data":"8d1cc3938177270a5e612635db09b407cc7958416298fa6b8f7e07c838e2be0d"} Apr 20 08:02:17.266218 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:17.266181 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-5bdf69bf7c-jjqn5" event={"ID":"98803190-fd71-4b81-8fe2-dd174cec1668","Type":"ContainerDied","Data":"5c3657f41e37d45b60a3f587e691bdc0e86e70208a25eba42bcc660af9613b24"} Apr 20 08:02:17.266218 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:17.266197 2572 scope.go:117] "RemoveContainer" containerID="8d1cc3938177270a5e612635db09b407cc7958416298fa6b8f7e07c838e2be0d" Apr 20 08:02:17.276700 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:17.276677 2572 scope.go:117] "RemoveContainer" containerID="8d1cc3938177270a5e612635db09b407cc7958416298fa6b8f7e07c838e2be0d" Apr 20 08:02:17.277025 ip-10-0-133-161 kubenswrapper[2572]: E0420 08:02:17.276998 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d1cc3938177270a5e612635db09b407cc7958416298fa6b8f7e07c838e2be0d\": container with ID starting with 8d1cc3938177270a5e612635db09b407cc7958416298fa6b8f7e07c838e2be0d not found: ID does not exist" containerID="8d1cc3938177270a5e612635db09b407cc7958416298fa6b8f7e07c838e2be0d" Apr 20 08:02:17.277097 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:17.277039 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d1cc3938177270a5e612635db09b407cc7958416298fa6b8f7e07c838e2be0d"} err="failed to get container status \"8d1cc3938177270a5e612635db09b407cc7958416298fa6b8f7e07c838e2be0d\": rpc error: code = NotFound desc = could not find container \"8d1cc3938177270a5e612635db09b407cc7958416298fa6b8f7e07c838e2be0d\": container with ID starting with 8d1cc3938177270a5e612635db09b407cc7958416298fa6b8f7e07c838e2be0d not found: ID does not exist" Apr 20 08:02:17.313766 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:17.313734 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-5bdf69bf7c-jjqn5"] Apr 20 08:02:17.325171 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:17.325130 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-5bdf69bf7c-jjqn5"] Apr 20 08:02:18.272730 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:18.272695 2572 generic.go:358] "Generic (PLEG): container finished" podID="b6b112fb-b08e-47c0-ba76-2746093563d4" containerID="6c35ca761924581eef3150d56ee0dfdf60630654aed1203eb4f64d140a997935" exitCode=0 Apr 20 08:02:18.273226 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:18.272773 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" event={"ID":"b6b112fb-b08e-47c0-ba76-2746093563d4","Type":"ContainerDied","Data":"6c35ca761924581eef3150d56ee0dfdf60630654aed1203eb4f64d140a997935"} Apr 20 08:02:18.647791 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:18.647757 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98803190-fd71-4b81-8fe2-dd174cec1668" path="/var/lib/kubelet/pods/98803190-fd71-4b81-8fe2-dd174cec1668/volumes" Apr 20 08:02:20.285391 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:20.285353 
2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" event={"ID":"b6b112fb-b08e-47c0-ba76-2746093563d4","Type":"ContainerStarted","Data":"0af1efd0959ce41db6909cb1cc9e77870aaf8f5e0e527923bc5946fd25c85890"} Apr 20 08:02:20.285777 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:20.285570 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:20.303332 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:20.303291 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" podStartSLOduration=1.492070116 podStartE2EDuration="15.303276235s" podCreationTimestamp="2026-04-20 08:02:05 +0000 UTC" firstStartedPulling="2026-04-20 08:02:05.950298587 +0000 UTC m=+709.863928608" lastFinishedPulling="2026-04-20 08:02:19.761504706 +0000 UTC m=+723.675134727" observedRunningTime="2026-04-20 08:02:20.301208071 +0000 UTC m=+724.214838112" watchObservedRunningTime="2026-04-20 08:02:20.303276235 +0000 UTC m=+724.216906284" Apr 20 08:02:22.293820 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:22.293788 2572 generic.go:358] "Generic (PLEG): container finished" podID="6e0aabb5-4108-4518-841f-8d8c4a9ddaf1" containerID="5d6de4aa50484888aeb0927fabc3d91c25ded012c1a49bd003222c411aa20b35" exitCode=0 Apr 20 08:02:22.294132 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:22.293856 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" event={"ID":"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1","Type":"ContainerDied","Data":"5d6de4aa50484888aeb0927fabc3d91c25ded012c1a49bd003222c411aa20b35"} Apr 20 08:02:23.299004 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:23.298973 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" event={"ID":"6e0aabb5-4108-4518-841f-8d8c4a9ddaf1","Type":"ContainerStarted","Data":"af7897fbe714f55881ac0903d04be7304c64fa6ca1fb8609f940589fb1011211"} Apr 20 08:02:23.299407 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:23.299187 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:23.316823 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:23.316775 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" podStartSLOduration=8.064555176 podStartE2EDuration="8.316762155s" podCreationTimestamp="2026-04-20 08:02:15 +0000 UTC" firstStartedPulling="2026-04-20 08:02:22.294527309 +0000 UTC m=+726.208157330" lastFinishedPulling="2026-04-20 08:02:22.546734285 +0000 UTC m=+726.460364309" observedRunningTime="2026-04-20 08:02:23.315817175 +0000 UTC m=+727.229447218" watchObservedRunningTime="2026-04-20 08:02:23.316762155 +0000 UTC m=+727.230392197" Apr 20 08:02:31.305109 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:31.305077 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-hglfn" Apr 20 08:02:34.315268 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:34.315237 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b" Apr 20 08:02:35.693168 ip-10-0-133-161 kubenswrapper[2572]: I0420 
08:02:35.693120 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c"] Apr 20 08:02:35.693612 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.693562 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98803190-fd71-4b81-8fe2-dd174cec1668" containerName="maas-api" Apr 20 08:02:35.693612 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.693575 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="98803190-fd71-4b81-8fe2-dd174cec1668" containerName="maas-api" Apr 20 08:02:35.693695 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.693661 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="98803190-fd71-4b81-8fe2-dd174cec1668" containerName="maas-api" Apr 20 08:02:35.706420 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.706394 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c"] Apr 20 08:02:35.706571 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.706535 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:35.709122 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.709099 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 20 08:02:35.841125 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.841087 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/56d11ca3-9ffa-496f-8ff7-e6bd731e060c-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c\" (UID: \"56d11ca3-9ffa-496f-8ff7-e6bd731e060c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:35.841363 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.841190 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/56d11ca3-9ffa-496f-8ff7-e6bd731e060c-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c\" (UID: \"56d11ca3-9ffa-496f-8ff7-e6bd731e060c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:35.841363 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.841234 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56d11ca3-9ffa-496f-8ff7-e6bd731e060c-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c\" (UID: \"56d11ca3-9ffa-496f-8ff7-e6bd731e060c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:35.841363 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.841301 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk2sb\" (UniqueName: \"kubernetes.io/projected/56d11ca3-9ffa-496f-8ff7-e6bd731e060c-kube-api-access-bk2sb\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c\" (UID: \"56d11ca3-9ffa-496f-8ff7-e6bd731e060c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:35.841363 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.841331 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56d11ca3-9ffa-496f-8ff7-e6bd731e060c-tls-certs\") pod 
\"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c\" (UID: \"56d11ca3-9ffa-496f-8ff7-e6bd731e060c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:35.841363 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.841356 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/56d11ca3-9ffa-496f-8ff7-e6bd731e060c-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c\" (UID: \"56d11ca3-9ffa-496f-8ff7-e6bd731e060c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:35.942264 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.942227 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/56d11ca3-9ffa-496f-8ff7-e6bd731e060c-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c\" (UID: \"56d11ca3-9ffa-496f-8ff7-e6bd731e060c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:35.942457 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.942280 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56d11ca3-9ffa-496f-8ff7-e6bd731e060c-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c\" (UID: \"56d11ca3-9ffa-496f-8ff7-e6bd731e060c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:35.942457 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.942328 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bk2sb\" (UniqueName: \"kubernetes.io/projected/56d11ca3-9ffa-496f-8ff7-e6bd731e060c-kube-api-access-bk2sb\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c\" (UID: \"56d11ca3-9ffa-496f-8ff7-e6bd731e060c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:35.942457 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.942351 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56d11ca3-9ffa-496f-8ff7-e6bd731e060c-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c\" (UID: \"56d11ca3-9ffa-496f-8ff7-e6bd731e060c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:35.942457 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.942372 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/56d11ca3-9ffa-496f-8ff7-e6bd731e060c-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c\" (UID: \"56d11ca3-9ffa-496f-8ff7-e6bd731e060c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:35.942668 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.942516 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/56d11ca3-9ffa-496f-8ff7-e6bd731e060c-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c\" (UID: \"56d11ca3-9ffa-496f-8ff7-e6bd731e060c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:35.942742 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.942719 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/56d11ca3-9ffa-496f-8ff7-e6bd731e060c-kserve-provision-location\") pod 
\"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c\" (UID: \"56d11ca3-9ffa-496f-8ff7-e6bd731e060c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:35.942889 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.942861 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/56d11ca3-9ffa-496f-8ff7-e6bd731e060c-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c\" (UID: \"56d11ca3-9ffa-496f-8ff7-e6bd731e060c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:35.942953 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.942926 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/56d11ca3-9ffa-496f-8ff7-e6bd731e060c-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c\" (UID: \"56d11ca3-9ffa-496f-8ff7-e6bd731e060c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:35.944676 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.944623 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/56d11ca3-9ffa-496f-8ff7-e6bd731e060c-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c\" (UID: \"56d11ca3-9ffa-496f-8ff7-e6bd731e060c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:35.944775 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.944743 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/56d11ca3-9ffa-496f-8ff7-e6bd731e060c-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c\" (UID: \"56d11ca3-9ffa-496f-8ff7-e6bd731e060c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:35.952370 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:35.952344 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk2sb\" (UniqueName: \"kubernetes.io/projected/56d11ca3-9ffa-496f-8ff7-e6bd731e060c-kube-api-access-bk2sb\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c\" (UID: \"56d11ca3-9ffa-496f-8ff7-e6bd731e060c\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:36.019165 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:36.019105 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:36.148992 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:36.148903 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c"] Apr 20 08:02:36.151366 ip-10-0-133-161 kubenswrapper[2572]: W0420 08:02:36.151339 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56d11ca3_9ffa_496f_8ff7_e6bd731e060c.slice/crio-47caa6cd461b7f09f703ea64b678018c4cee69808ba62f4537dba6d192d4bb90 WatchSource:0}: Error finding container 47caa6cd461b7f09f703ea64b678018c4cee69808ba62f4537dba6d192d4bb90: Status 404 returned error can't find the container with id 47caa6cd461b7f09f703ea64b678018c4cee69808ba62f4537dba6d192d4bb90 Apr 20 08:02:36.348351 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:36.348312 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" event={"ID":"56d11ca3-9ffa-496f-8ff7-e6bd731e060c","Type":"ContainerStarted","Data":"699ce19eff7f86dd8a460a7bfb5b739b7ced1af77e1f7a2990f0759bc5da9266"} Apr 20 08:02:36.348351 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:36.348357 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" event={"ID":"56d11ca3-9ffa-496f-8ff7-e6bd731e060c","Type":"ContainerStarted","Data":"47caa6cd461b7f09f703ea64b678018c4cee69808ba62f4537dba6d192d4bb90"} Apr 20 08:02:42.371919 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:42.371878 2572 generic.go:358] "Generic (PLEG): container finished" podID="56d11ca3-9ffa-496f-8ff7-e6bd731e060c" containerID="699ce19eff7f86dd8a460a7bfb5b739b7ced1af77e1f7a2990f0759bc5da9266" exitCode=0 Apr 20 08:02:42.372360 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:42.371952 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" event={"ID":"56d11ca3-9ffa-496f-8ff7-e6bd731e060c","Type":"ContainerDied","Data":"699ce19eff7f86dd8a460a7bfb5b739b7ced1af77e1f7a2990f0759bc5da9266"} Apr 20 08:02:43.377118 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:43.377080 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" event={"ID":"56d11ca3-9ffa-496f-8ff7-e6bd731e060c","Type":"ContainerStarted","Data":"13d3aab590570b96673d473b13bf33c30dba7a48a8d0a62e9f72faad96171511"} Apr 20 08:02:43.377557 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:43.377355 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:02:43.394249 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:43.394194 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" podStartSLOduration=8.149022095 podStartE2EDuration="8.394175898s" podCreationTimestamp="2026-04-20 08:02:35 +0000 UTC" firstStartedPulling="2026-04-20 08:02:42.372574978 +0000 UTC m=+746.286204999" lastFinishedPulling="2026-04-20 08:02:42.617728781 +0000 UTC m=+746.531358802" observedRunningTime="2026-04-20 08:02:43.394034135 +0000 UTC m=+747.307664215" watchObservedRunningTime="2026-04-20 08:02:43.394175898 +0000 UTC m=+747.307805942" Apr 20 08:02:54.397063 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:02:54.397034 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c" Apr 20 08:03:32.822344 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:32.822263 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-866b885fb7-fj7qc"] Apr 20 08:03:32.827437 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:32.827418 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-866b885fb7-fj7qc" Apr 20 08:03:32.831760 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:32.831733 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-866b885fb7-fj7qc"] Apr 20 08:03:32.887744 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:32.887703 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e9249804-c51b-45fe-b121-b7e427543972-tls-cert\") pod \"authorino-866b885fb7-fj7qc\" (UID: \"e9249804-c51b-45fe-b121-b7e427543972\") " pod="kuadrant-system/authorino-866b885fb7-fj7qc" Apr 20 08:03:32.887944 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:32.887753 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krmmj\" (UniqueName: \"kubernetes.io/projected/e9249804-c51b-45fe-b121-b7e427543972-kube-api-access-krmmj\") pod \"authorino-866b885fb7-fj7qc\" (UID: \"e9249804-c51b-45fe-b121-b7e427543972\") " pod="kuadrant-system/authorino-866b885fb7-fj7qc" Apr 20 08:03:32.988936 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:32.988896 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e9249804-c51b-45fe-b121-b7e427543972-tls-cert\") pod \"authorino-866b885fb7-fj7qc\" (UID: \"e9249804-c51b-45fe-b121-b7e427543972\") " pod="kuadrant-system/authorino-866b885fb7-fj7qc" Apr 20 08:03:32.989105 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:32.988952 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krmmj\" (UniqueName: \"kubernetes.io/projected/e9249804-c51b-45fe-b121-b7e427543972-kube-api-access-krmmj\") pod \"authorino-866b885fb7-fj7qc\" (UID: \"e9249804-c51b-45fe-b121-b7e427543972\") " pod="kuadrant-system/authorino-866b885fb7-fj7qc" Apr 20 08:03:32.991467 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:32.991442 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/e9249804-c51b-45fe-b121-b7e427543972-tls-cert\") pod \"authorino-866b885fb7-fj7qc\" (UID: \"e9249804-c51b-45fe-b121-b7e427543972\") " pod="kuadrant-system/authorino-866b885fb7-fj7qc" Apr 20 08:03:32.996015 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:32.995990 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krmmj\" (UniqueName: \"kubernetes.io/projected/e9249804-c51b-45fe-b121-b7e427543972-kube-api-access-krmmj\") pod \"authorino-866b885fb7-fj7qc\" (UID: \"e9249804-c51b-45fe-b121-b7e427543972\") " pod="kuadrant-system/authorino-866b885fb7-fj7qc" Apr 20 08:03:33.139337 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:33.139245 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-866b885fb7-fj7qc" Apr 20 08:03:33.272812 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:33.272785 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-866b885fb7-fj7qc"] Apr 20 08:03:33.274811 ip-10-0-133-161 kubenswrapper[2572]: W0420 08:03:33.274770 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode9249804_c51b_45fe_b121_b7e427543972.slice/crio-c267acf026f55807e5e76c32167818ffb34ec1045d3f27dc5e426d3d573206f5 WatchSource:0}: Error finding container c267acf026f55807e5e76c32167818ffb34ec1045d3f27dc5e426d3d573206f5: Status 404 returned error can't find the container with id c267acf026f55807e5e76c32167818ffb34ec1045d3f27dc5e426d3d573206f5 Apr 20 08:03:33.566312 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:33.566274 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-866b885fb7-fj7qc" event={"ID":"e9249804-c51b-45fe-b121-b7e427543972","Type":"ContainerStarted","Data":"c267acf026f55807e5e76c32167818ffb34ec1045d3f27dc5e426d3d573206f5"} Apr 20 08:03:34.571594 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:34.571554 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-866b885fb7-fj7qc" event={"ID":"e9249804-c51b-45fe-b121-b7e427543972","Type":"ContainerStarted","Data":"ae3dcb12b828a2c4fafe1e8d453618ddf4fbdff21b8060fbeec2f27ed2fdcd49"} Apr 20 08:03:34.589851 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:34.589802 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-866b885fb7-fj7qc" podStartSLOduration=2.156781976 podStartE2EDuration="2.589789125s" podCreationTimestamp="2026-04-20 08:03:32 +0000 UTC" firstStartedPulling="2026-04-20 08:03:33.276045761 +0000 UTC m=+797.189675783" lastFinishedPulling="2026-04-20 08:03:33.709052908 +0000 UTC m=+797.622682932" observedRunningTime="2026-04-20 08:03:34.587845977 +0000 UTC m=+798.501476019" watchObservedRunningTime="2026-04-20 08:03:34.589789125 +0000 UTC m=+798.503419167" Apr 20 08:03:34.615374 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:34.615337 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6c74bc5c86-btw69"] Apr 20 08:03:34.615548 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:34.615526 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-6c74bc5c86-btw69" podUID="8f35128c-80a7-42fb-af31-c1c4dafe8044" containerName="authorino" containerID="cri-o://f3244b6c243e33385d7cac29adf942ffd607477431b5fa7299292a5fe0475617" gracePeriod=30 Apr 20 08:03:34.862996 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:34.862973 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6c74bc5c86-btw69" Apr 20 08:03:34.908813 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:34.908779 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5qnv\" (UniqueName: \"kubernetes.io/projected/8f35128c-80a7-42fb-af31-c1c4dafe8044-kube-api-access-b5qnv\") pod \"8f35128c-80a7-42fb-af31-c1c4dafe8044\" (UID: \"8f35128c-80a7-42fb-af31-c1c4dafe8044\") " Apr 20 08:03:34.908980 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:34.908882 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8f35128c-80a7-42fb-af31-c1c4dafe8044-tls-cert\") pod \"8f35128c-80a7-42fb-af31-c1c4dafe8044\" (UID: \"8f35128c-80a7-42fb-af31-c1c4dafe8044\") " Apr 20 08:03:34.911000 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:34.910967 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f35128c-80a7-42fb-af31-c1c4dafe8044-kube-api-access-b5qnv" (OuterVolumeSpecName: "kube-api-access-b5qnv") pod "8f35128c-80a7-42fb-af31-c1c4dafe8044" (UID: "8f35128c-80a7-42fb-af31-c1c4dafe8044"). InnerVolumeSpecName "kube-api-access-b5qnv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 08:03:34.919436 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:34.919411 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f35128c-80a7-42fb-af31-c1c4dafe8044-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "8f35128c-80a7-42fb-af31-c1c4dafe8044" (UID: "8f35128c-80a7-42fb-af31-c1c4dafe8044"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 08:03:35.010249 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:35.010219 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b5qnv\" (UniqueName: \"kubernetes.io/projected/8f35128c-80a7-42fb-af31-c1c4dafe8044-kube-api-access-b5qnv\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:03:35.010249 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:35.010247 2572 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8f35128c-80a7-42fb-af31-c1c4dafe8044-tls-cert\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:03:35.576524 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:35.576487 2572 generic.go:358] "Generic (PLEG): container finished" podID="8f35128c-80a7-42fb-af31-c1c4dafe8044" containerID="f3244b6c243e33385d7cac29adf942ffd607477431b5fa7299292a5fe0475617" exitCode=0 Apr 20 08:03:35.576968 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:35.576545 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-6c74bc5c86-btw69" Apr 20 08:03:35.576968 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:35.576572 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6c74bc5c86-btw69" event={"ID":"8f35128c-80a7-42fb-af31-c1c4dafe8044","Type":"ContainerDied","Data":"f3244b6c243e33385d7cac29adf942ffd607477431b5fa7299292a5fe0475617"} Apr 20 08:03:35.576968 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:35.576613 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-6c74bc5c86-btw69" event={"ID":"8f35128c-80a7-42fb-af31-c1c4dafe8044","Type":"ContainerDied","Data":"2f013dd2c020471618f4fea383aba726fc15fedee601909dd419a1e32837ce74"} Apr 20 08:03:35.576968 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:35.576638 2572 scope.go:117] "RemoveContainer" containerID="f3244b6c243e33385d7cac29adf942ffd607477431b5fa7299292a5fe0475617" Apr 20 08:03:35.585006 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:35.584986 2572 scope.go:117] "RemoveContainer" containerID="f3244b6c243e33385d7cac29adf942ffd607477431b5fa7299292a5fe0475617" Apr 20 08:03:35.585296 ip-10-0-133-161 kubenswrapper[2572]: E0420 08:03:35.585277 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3244b6c243e33385d7cac29adf942ffd607477431b5fa7299292a5fe0475617\": container with ID starting with f3244b6c243e33385d7cac29adf942ffd607477431b5fa7299292a5fe0475617 not found: ID does not exist" containerID="f3244b6c243e33385d7cac29adf942ffd607477431b5fa7299292a5fe0475617" Apr 20 08:03:35.585370 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:35.585310 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3244b6c243e33385d7cac29adf942ffd607477431b5fa7299292a5fe0475617"} err="failed to get container status \"f3244b6c243e33385d7cac29adf942ffd607477431b5fa7299292a5fe0475617\": rpc error: code = NotFound desc = could not find container \"f3244b6c243e33385d7cac29adf942ffd607477431b5fa7299292a5fe0475617\": container with ID starting with f3244b6c243e33385d7cac29adf942ffd607477431b5fa7299292a5fe0475617 not found: ID does not exist" Apr 20 08:03:35.599515 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:35.599488 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-6c74bc5c86-btw69"] Apr 20 08:03:35.602750 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:35.602724 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-6c74bc5c86-btw69"] Apr 20 08:03:36.646615 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:03:36.646581 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f35128c-80a7-42fb-af31-c1c4dafe8044" path="/var/lib/kubelet/pods/8f35128c-80a7-42fb-af31-c1c4dafe8044/volumes" Apr 20 08:04:57.798959 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:57.798923 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-76b95bd89-9vk2b"] Apr 20 08:04:57.799488 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:57.799206 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-76b95bd89-9vk2b" podUID="ed2b8d52-e1df-4543-8789-a216f6335f45" containerName="manager" containerID="cri-o://bc645dcdc955353a2108a2964bc1b4c0a18978cbca36ad13224e0b7b90723e5e" gracePeriod=10 Apr 20 08:04:58.059372 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:58.059303 2572 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-76b95bd89-9vk2b" Apr 20 08:04:58.118385 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:58.118343 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5f6z\" (UniqueName: \"kubernetes.io/projected/ed2b8d52-e1df-4543-8789-a216f6335f45-kube-api-access-s5f6z\") pod \"ed2b8d52-e1df-4543-8789-a216f6335f45\" (UID: \"ed2b8d52-e1df-4543-8789-a216f6335f45\") " Apr 20 08:04:58.120449 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:58.120428 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed2b8d52-e1df-4543-8789-a216f6335f45-kube-api-access-s5f6z" (OuterVolumeSpecName: "kube-api-access-s5f6z") pod "ed2b8d52-e1df-4543-8789-a216f6335f45" (UID: "ed2b8d52-e1df-4543-8789-a216f6335f45"). InnerVolumeSpecName "kube-api-access-s5f6z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 08:04:58.219895 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:58.219860 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s5f6z\" (UniqueName: \"kubernetes.io/projected/ed2b8d52-e1df-4543-8789-a216f6335f45-kube-api-access-s5f6z\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:04:58.882831 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:58.882792 2572 generic.go:358] "Generic (PLEG): container finished" podID="ed2b8d52-e1df-4543-8789-a216f6335f45" containerID="bc645dcdc955353a2108a2964bc1b4c0a18978cbca36ad13224e0b7b90723e5e" exitCode=0 Apr 20 08:04:58.883313 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:58.882862 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-76b95bd89-9vk2b" Apr 20 08:04:58.883313 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:58.882880 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-76b95bd89-9vk2b" event={"ID":"ed2b8d52-e1df-4543-8789-a216f6335f45","Type":"ContainerDied","Data":"bc645dcdc955353a2108a2964bc1b4c0a18978cbca36ad13224e0b7b90723e5e"} Apr 20 08:04:58.883313 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:58.882928 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-76b95bd89-9vk2b" event={"ID":"ed2b8d52-e1df-4543-8789-a216f6335f45","Type":"ContainerDied","Data":"0e6d6a5306c1de09a063bd53a337626905afb75d6cc0e17a6ac9b920da68df1f"} Apr 20 08:04:58.883313 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:58.882951 2572 scope.go:117] "RemoveContainer" containerID="bc645dcdc955353a2108a2964bc1b4c0a18978cbca36ad13224e0b7b90723e5e" Apr 20 08:04:58.891320 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:58.891275 2572 scope.go:117] "RemoveContainer" containerID="bc645dcdc955353a2108a2964bc1b4c0a18978cbca36ad13224e0b7b90723e5e" Apr 20 08:04:58.891545 ip-10-0-133-161 kubenswrapper[2572]: E0420 08:04:58.891527 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc645dcdc955353a2108a2964bc1b4c0a18978cbca36ad13224e0b7b90723e5e\": container with ID starting with bc645dcdc955353a2108a2964bc1b4c0a18978cbca36ad13224e0b7b90723e5e not found: ID does not exist" containerID="bc645dcdc955353a2108a2964bc1b4c0a18978cbca36ad13224e0b7b90723e5e" Apr 20 08:04:58.891595 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:58.891553 2572 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bc645dcdc955353a2108a2964bc1b4c0a18978cbca36ad13224e0b7b90723e5e"} err="failed to get container status \"bc645dcdc955353a2108a2964bc1b4c0a18978cbca36ad13224e0b7b90723e5e\": rpc error: code = NotFound desc = could not find container \"bc645dcdc955353a2108a2964bc1b4c0a18978cbca36ad13224e0b7b90723e5e\": container with ID starting with bc645dcdc955353a2108a2964bc1b4c0a18978cbca36ad13224e0b7b90723e5e not found: ID does not exist" Apr 20 08:04:58.900193 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:58.900169 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-76b95bd89-9vk2b"] Apr 20 08:04:58.903150 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:58.903116 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-76b95bd89-9vk2b"] Apr 20 08:04:59.647962 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:59.647928 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-76b95bd89-t7mnh"] Apr 20 08:04:59.648348 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:59.648333 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f35128c-80a7-42fb-af31-c1c4dafe8044" containerName="authorino" Apr 20 08:04:59.648401 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:59.648350 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f35128c-80a7-42fb-af31-c1c4dafe8044" containerName="authorino" Apr 20 08:04:59.648401 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:59.648367 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed2b8d52-e1df-4543-8789-a216f6335f45" containerName="manager" Apr 20 08:04:59.648401 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:59.648373 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed2b8d52-e1df-4543-8789-a216f6335f45" containerName="manager" Apr 20 08:04:59.648499 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:59.648439 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f35128c-80a7-42fb-af31-c1c4dafe8044" containerName="authorino" Apr 20 08:04:59.648499 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:59.648449 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed2b8d52-e1df-4543-8789-a216f6335f45" containerName="manager" Apr 20 08:04:59.653000 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:59.652984 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-76b95bd89-t7mnh" Apr 20 08:04:59.655462 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:59.655437 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-nrd72\"" Apr 20 08:04:59.658835 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:59.658809 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-76b95bd89-t7mnh"] Apr 20 08:04:59.733286 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:59.733247 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xsnp\" (UniqueName: \"kubernetes.io/projected/028f7148-25ae-4040-9f41-1f0686ccc5e5-kube-api-access-5xsnp\") pod \"maas-controller-76b95bd89-t7mnh\" (UID: \"028f7148-25ae-4040-9f41-1f0686ccc5e5\") " pod="opendatahub/maas-controller-76b95bd89-t7mnh" Apr 20 08:04:59.834423 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:59.834384 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5xsnp\" (UniqueName: \"kubernetes.io/projected/028f7148-25ae-4040-9f41-1f0686ccc5e5-kube-api-access-5xsnp\") pod \"maas-controller-76b95bd89-t7mnh\" (UID: \"028f7148-25ae-4040-9f41-1f0686ccc5e5\") " pod="opendatahub/maas-controller-76b95bd89-t7mnh" Apr 20 08:04:59.841898 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:59.841864 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xsnp\" (UniqueName: \"kubernetes.io/projected/028f7148-25ae-4040-9f41-1f0686ccc5e5-kube-api-access-5xsnp\") pod \"maas-controller-76b95bd89-t7mnh\" (UID: \"028f7148-25ae-4040-9f41-1f0686ccc5e5\") " pod="opendatahub/maas-controller-76b95bd89-t7mnh" Apr 20 08:04:59.964960 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:04:59.964871 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-76b95bd89-t7mnh" Apr 20 08:05:00.089435 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:05:00.089408 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-76b95bd89-t7mnh"] Apr 20 08:05:00.091771 ip-10-0-133-161 kubenswrapper[2572]: W0420 08:05:00.091738 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod028f7148_25ae_4040_9f41_1f0686ccc5e5.slice/crio-d0358faac22988123e1987ce065afa9392fba775f7628234c2f36041f1399621 WatchSource:0}: Error finding container d0358faac22988123e1987ce065afa9392fba775f7628234c2f36041f1399621: Status 404 returned error can't find the container with id d0358faac22988123e1987ce065afa9392fba775f7628234c2f36041f1399621 Apr 20 08:05:00.649276 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:05:00.649242 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed2b8d52-e1df-4543-8789-a216f6335f45" path="/var/lib/kubelet/pods/ed2b8d52-e1df-4543-8789-a216f6335f45/volumes" Apr 20 08:05:00.892353 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:05:00.892260 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-76b95bd89-t7mnh" event={"ID":"028f7148-25ae-4040-9f41-1f0686ccc5e5","Type":"ContainerStarted","Data":"b9cd8ed8939f66e51f1f356dac2b357f04c8a627177bf8cb95163ea293e18878"} Apr 20 08:05:00.892353 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:05:00.892302 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-76b95bd89-t7mnh" event={"ID":"028f7148-25ae-4040-9f41-1f0686ccc5e5","Type":"ContainerStarted","Data":"d0358faac22988123e1987ce065afa9392fba775f7628234c2f36041f1399621"} Apr 20 08:05:00.892541 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:05:00.892374 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-76b95bd89-t7mnh" Apr 20 08:05:00.908701 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:05:00.908649 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-76b95bd89-t7mnh" podStartSLOduration=1.476737449 podStartE2EDuration="1.908631835s" podCreationTimestamp="2026-04-20 08:04:59 +0000 UTC" firstStartedPulling="2026-04-20 08:05:00.092977511 +0000 UTC m=+884.006607533" lastFinishedPulling="2026-04-20 08:05:00.524871888 +0000 UTC m=+884.438501919" observedRunningTime="2026-04-20 08:05:00.90637355 +0000 UTC m=+884.820003699" watchObservedRunningTime="2026-04-20 08:05:00.908631835 +0000 UTC m=+884.822261868" Apr 20 08:05:11.901994 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:05:11.901958 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-76b95bd89-t7mnh" Apr 20 08:05:16.606766 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:05:16.606738 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/1.log" Apr 20 08:05:16.607264 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:05:16.607202 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/1.log" Apr 20 08:05:16.611408 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:05:16.611383 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 08:05:16.612168 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:05:16.612127 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 08:10:16.634877 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:10:16.634847 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/1.log" Apr 20 08:10:16.636616 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:10:16.636596 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/1.log" Apr 20 08:10:16.639697 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:10:16.639676 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 08:10:16.641555 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:10:16.641532 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 08:15:00.141496 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:00.141460 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29611215-65s7d"] Apr 20 08:15:00.144884 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:00.144867 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611215-65s7d" Apr 20 08:15:00.147674 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:00.147654 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-wlc48\"" Apr 20 08:15:00.161290 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:00.161260 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611215-65s7d"] Apr 20 08:15:00.187056 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:00.187023 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zmb2\" (UniqueName: \"kubernetes.io/projected/4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5-kube-api-access-4zmb2\") pod \"maas-api-key-cleanup-29611215-65s7d\" (UID: \"4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5\") " pod="opendatahub/maas-api-key-cleanup-29611215-65s7d" Apr 20 08:15:00.288323 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:00.288286 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zmb2\" (UniqueName: \"kubernetes.io/projected/4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5-kube-api-access-4zmb2\") pod \"maas-api-key-cleanup-29611215-65s7d\" (UID: \"4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5\") " pod="opendatahub/maas-api-key-cleanup-29611215-65s7d" Apr 20 08:15:00.296209 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:00.296173 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zmb2\" (UniqueName: \"kubernetes.io/projected/4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5-kube-api-access-4zmb2\") pod \"maas-api-key-cleanup-29611215-65s7d\" (UID: \"4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5\") " pod="opendatahub/maas-api-key-cleanup-29611215-65s7d" Apr 20 08:15:00.455455 
ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:00.455349 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611215-65s7d" Apr 20 08:15:00.577614 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:00.577586 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611215-65s7d"] Apr 20 08:15:00.579701 ip-10-0-133-161 kubenswrapper[2572]: W0420 08:15:00.579668 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e2b2f72_29a0_4c2d_a4cf_442c5cf1f5c5.slice/crio-2637456cd261391f3a7aed785f261dc3fb374b69fe43f1ffa902c803ec12bb30 WatchSource:0}: Error finding container 2637456cd261391f3a7aed785f261dc3fb374b69fe43f1ffa902c803ec12bb30: Status 404 returned error can't find the container with id 2637456cd261391f3a7aed785f261dc3fb374b69fe43f1ffa902c803ec12bb30 Apr 20 08:15:00.581659 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:00.581642 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 08:15:01.098796 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:01.098760 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611215-65s7d" event={"ID":"4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5","Type":"ContainerStarted","Data":"2637456cd261391f3a7aed785f261dc3fb374b69fe43f1ffa902c803ec12bb30"} Apr 20 08:15:03.108111 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:03.108068 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611215-65s7d" event={"ID":"4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5","Type":"ContainerStarted","Data":"470c1a6e99b6aeddf26f407a846903d77abea5f8ddfc3e83b5c56375c2298970"} Apr 20 08:15:03.123420 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:03.123365 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29611215-65s7d" podStartSLOduration=1.383908632 podStartE2EDuration="3.123350311s" podCreationTimestamp="2026-04-20 08:15:00 +0000 UTC" firstStartedPulling="2026-04-20 08:15:00.581767908 +0000 UTC m=+1484.495397929" lastFinishedPulling="2026-04-20 08:15:02.321209587 +0000 UTC m=+1486.234839608" observedRunningTime="2026-04-20 08:15:03.121662678 +0000 UTC m=+1487.035292721" watchObservedRunningTime="2026-04-20 08:15:03.123350311 +0000 UTC m=+1487.036980353" Apr 20 08:15:16.664101 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:16.664070 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/1.log" Apr 20 08:15:16.665523 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:16.665497 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/1.log" Apr 20 08:15:16.668585 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:16.668569 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 08:15:16.669769 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:16.669752 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 08:15:23.187408 ip-10-0-133-161 kubenswrapper[2572]: 
I0420 08:15:23.187370 2572 generic.go:358] "Generic (PLEG): container finished" podID="4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5" containerID="470c1a6e99b6aeddf26f407a846903d77abea5f8ddfc3e83b5c56375c2298970" exitCode=6 Apr 20 08:15:23.187852 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:23.187445 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611215-65s7d" event={"ID":"4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5","Type":"ContainerDied","Data":"470c1a6e99b6aeddf26f407a846903d77abea5f8ddfc3e83b5c56375c2298970"} Apr 20 08:15:23.187852 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:23.187786 2572 scope.go:117] "RemoveContainer" containerID="470c1a6e99b6aeddf26f407a846903d77abea5f8ddfc3e83b5c56375c2298970" Apr 20 08:15:24.192191 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:24.192131 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611215-65s7d" event={"ID":"4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5","Type":"ContainerStarted","Data":"323f77617dc0380c3e063de0fa270c9fc54b595aa1ea77268985c70b1ce1107d"} Apr 20 08:15:44.274218 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:44.274182 2572 generic.go:358] "Generic (PLEG): container finished" podID="4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5" containerID="323f77617dc0380c3e063de0fa270c9fc54b595aa1ea77268985c70b1ce1107d" exitCode=6 Apr 20 08:15:44.274671 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:44.274251 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611215-65s7d" event={"ID":"4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5","Type":"ContainerDied","Data":"323f77617dc0380c3e063de0fa270c9fc54b595aa1ea77268985c70b1ce1107d"} Apr 20 08:15:44.274671 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:44.274296 2572 scope.go:117] "RemoveContainer" containerID="470c1a6e99b6aeddf26f407a846903d77abea5f8ddfc3e83b5c56375c2298970" Apr 20 08:15:44.274671 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:44.274647 2572 scope.go:117] "RemoveContainer" containerID="323f77617dc0380c3e063de0fa270c9fc54b595aa1ea77268985c70b1ce1107d" Apr 20 08:15:44.274956 ip-10-0-133-161 kubenswrapper[2572]: E0420 08:15:44.274933 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29611215-65s7d_opendatahub(4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5)\"" pod="opendatahub/maas-api-key-cleanup-29611215-65s7d" podUID="4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5" Apr 20 08:15:59.641956 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:15:59.641920 2572 scope.go:117] "RemoveContainer" containerID="323f77617dc0380c3e063de0fa270c9fc54b595aa1ea77268985c70b1ce1107d" Apr 20 08:16:00.010240 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:16:00.010202 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611215-65s7d"] Apr 20 08:16:00.339278 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:16:00.339235 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611215-65s7d" event={"ID":"4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5","Type":"ContainerStarted","Data":"6b89107f978026fc13e5f132f9565241359280fd21f5e4854f16c203b08a52f9"} Apr 20 08:16:00.339458 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:16:00.339332 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29611215-65s7d" podUID="4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5" 
containerName="cleanup" containerID="cri-o://6b89107f978026fc13e5f132f9565241359280fd21f5e4854f16c203b08a52f9" gracePeriod=30 Apr 20 08:16:20.424655 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:16:20.424618 2572 generic.go:358] "Generic (PLEG): container finished" podID="4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5" containerID="6b89107f978026fc13e5f132f9565241359280fd21f5e4854f16c203b08a52f9" exitCode=6 Apr 20 08:16:20.425117 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:16:20.424689 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611215-65s7d" event={"ID":"4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5","Type":"ContainerDied","Data":"6b89107f978026fc13e5f132f9565241359280fd21f5e4854f16c203b08a52f9"} Apr 20 08:16:20.425117 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:16:20.424739 2572 scope.go:117] "RemoveContainer" containerID="323f77617dc0380c3e063de0fa270c9fc54b595aa1ea77268985c70b1ce1107d" Apr 20 08:16:20.482415 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:16:20.482391 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611215-65s7d" Apr 20 08:16:20.569206 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:16:20.569084 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zmb2\" (UniqueName: \"kubernetes.io/projected/4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5-kube-api-access-4zmb2\") pod \"4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5\" (UID: \"4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5\") " Apr 20 08:16:20.571150 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:16:20.571102 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5-kube-api-access-4zmb2" (OuterVolumeSpecName: "kube-api-access-4zmb2") pod "4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5" (UID: "4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5"). InnerVolumeSpecName "kube-api-access-4zmb2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 08:16:20.670476 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:16:20.670437 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4zmb2\" (UniqueName: \"kubernetes.io/projected/4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5-kube-api-access-4zmb2\") on node \"ip-10-0-133-161.ec2.internal\" DevicePath \"\"" Apr 20 08:16:21.429580 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:16:21.429550 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611215-65s7d" event={"ID":"4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5","Type":"ContainerDied","Data":"2637456cd261391f3a7aed785f261dc3fb374b69fe43f1ffa902c803ec12bb30"} Apr 20 08:16:21.430032 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:16:21.429585 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611215-65s7d" Apr 20 08:16:21.430032 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:16:21.429590 2572 scope.go:117] "RemoveContainer" containerID="6b89107f978026fc13e5f132f9565241359280fd21f5e4854f16c203b08a52f9" Apr 20 08:16:21.448323 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:16:21.448298 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611215-65s7d"] Apr 20 08:16:21.450197 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:16:21.450177 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611215-65s7d"] Apr 20 08:16:22.646572 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:16:22.646538 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5" path="/var/lib/kubelet/pods/4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5/volumes" Apr 20 08:20:16.692456 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:20:16.692413 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/1.log" Apr 20 08:20:16.693656 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:20:16.693632 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/1.log" Apr 20 08:20:16.697043 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:20:16.697019 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 08:20:16.697873 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:20:16.697856 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 08:25:16.719850 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:25:16.719815 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/1.log" Apr 20 08:25:16.722281 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:25:16.720761 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/1.log" Apr 20 08:25:16.724604 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:25:16.724578 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 08:25:16.725543 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:25:16.725521 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 08:27:00.282895 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:00.282852 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-866b885fb7-fj7qc_e9249804-c51b-45fe-b121-b7e427543972/authorino/0.log" Apr 20 08:27:04.519737 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:04.519695 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-76b95bd89-t7mnh_028f7148-25ae-4040-9f41-1f0686ccc5e5/manager/0.log" Apr 20 08:27:04.921749 ip-10-0-133-161 kubenswrapper[2572]: I0420 
08:27:04.921672 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-687c889b9-kgmn5_0972f1b7-083d-4e02-bbba-f354c5c4e05f/manager/0.log" Apr 20 08:27:05.161916 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:05.161889 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-h7nm2_0a17bf98-d8da-42ea-a485-2468e701bd28/postgres/0.log" Apr 20 08:27:06.530609 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:06.530573 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-866b885fb7-fj7qc_e9249804-c51b-45fe-b121-b7e427543972/authorino/0.log" Apr 20 08:27:06.787522 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:06.787426 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-g6z8n_5950397d-fe46-4d7f-9ca1-584e4f8ee9f7/manager/0.log" Apr 20 08:27:06.915096 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:06.915069 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-q22cn_d0750385-e266-4f01-b87d-6b1800b78342/kuadrant-console-plugin/0.log" Apr 20 08:27:07.056183 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:07.056068 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-t2984_6695d048-f5af-437e-955c-d4c90e9d091d/registry-server/0.log" Apr 20 08:27:07.791926 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:07.791895 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557fg825f_806e8845-f708-4670-8e6f-195d03dd6803/istio-proxy/0.log" Apr 20 08:27:08.301296 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:08.301244 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-cvmwg_4d018cfd-d796-4b7a-8057-0fab9fde2781/istio-proxy/0.log" Apr 20 08:27:08.424999 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:08.424969 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-656885888d-85974_520d7307-df9b-4a84-8836-d8b02ebe3ddb/router/0.log" Apr 20 08:27:08.936372 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:08.936338 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c_56d11ca3-9ffa-496f-8ff7-e6bd731e060c/storage-initializer/0.log" Apr 20 08:27:08.944616 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:08.944587 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-tcr4c_56d11ca3-9ffa-496f-8ff7-e6bd731e060c/main/0.log" Apr 20 08:27:09.072941 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:09.072916 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-hglfn_b6b112fb-b08e-47c0-ba76-2746093563d4/storage-initializer/0.log" Apr 20 08:27:09.087604 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:09.087581 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-hglfn_b6b112fb-b08e-47c0-ba76-2746093563d4/main/0.log" Apr 20 08:27:09.213911 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:09.213832 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b_6e0aabb5-4108-4518-841f-8d8c4a9ddaf1/storage-initializer/0.log" Apr 20 08:27:09.225327 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:09.225299 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccpzg9b_6e0aabb5-4108-4518-841f-8d8c4a9ddaf1/main/0.log" Apr 20 08:27:16.838049 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:16.838013 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-fxwgx_723bdbde-5646-4818-b4d4-06690f364a5a/global-pull-secret-syncer/0.log" Apr 20 08:27:16.978072 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:16.978040 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-l8wjp_a7425036-5ff0-42c9-9f51-3d27f67f9232/konnectivity-agent/0.log" Apr 20 08:27:17.046247 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:17.046208 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-161.ec2.internal_71c8bd7a095056144ad8091ca3c68103/haproxy/0.log" Apr 20 08:27:20.792257 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:20.792225 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-866b885fb7-fj7qc_e9249804-c51b-45fe-b121-b7e427543972/authorino/0.log" Apr 20 08:27:20.858595 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:20.858561 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-g6z8n_5950397d-fe46-4d7f-9ca1-584e4f8ee9f7/manager/0.log" Apr 20 08:27:20.891996 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:20.891966 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-q22cn_d0750385-e266-4f01-b87d-6b1800b78342/kuadrant-console-plugin/0.log" Apr 20 08:27:20.945576 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:20.945546 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-t2984_6695d048-f5af-437e-955c-d4c90e9d091d/registry-server/0.log" Apr 20 08:27:22.394997 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:22.394971 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_66717b20-de30-4421-80d5-ccd76ced1dc5/alertmanager/0.log" Apr 20 08:27:22.419855 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:22.419829 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_66717b20-de30-4421-80d5-ccd76ced1dc5/config-reloader/0.log" Apr 20 08:27:22.441235 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:22.441208 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_66717b20-de30-4421-80d5-ccd76ced1dc5/kube-rbac-proxy-web/0.log" Apr 20 08:27:22.465749 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:22.465722 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_66717b20-de30-4421-80d5-ccd76ced1dc5/kube-rbac-proxy/0.log" Apr 20 08:27:22.487586 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:22.487563 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_66717b20-de30-4421-80d5-ccd76ced1dc5/kube-rbac-proxy-metric/0.log" Apr 20 08:27:22.507880 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:22.507858 2572 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_66717b20-de30-4421-80d5-ccd76ced1dc5/prom-label-proxy/0.log" Apr 20 08:27:22.526868 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:22.526838 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_66717b20-de30-4421-80d5-ccd76ced1dc5/init-config-reloader/0.log" Apr 20 08:27:22.564222 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:22.564193 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-kb8qs_b28cd900-1a9e-4c36-a8d1-7409e30f8de9/cluster-monitoring-operator/0.log" Apr 20 08:27:22.671499 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:22.671390 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-4s2g8_1b008e26-4827-4055-94d5-c9036809bd51/monitoring-plugin/0.log" Apr 20 08:27:22.785923 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:22.785897 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-p5czn_c687b47d-f84d-4ae3-a6ce-0f5596edfa15/node-exporter/0.log" Apr 20 08:27:22.811922 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:22.811890 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-p5czn_c687b47d-f84d-4ae3-a6ce-0f5596edfa15/kube-rbac-proxy/0.log" Apr 20 08:27:22.831184 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:22.831156 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-p5czn_c687b47d-f84d-4ae3-a6ce-0f5596edfa15/init-textfile/0.log" Apr 20 08:27:23.206407 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:23.206377 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-pft7m_59fb366d-19d3-4935-bf25-956a86cebf06/prometheus-operator/0.log" Apr 20 08:27:23.225386 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:23.225355 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-pft7m_59fb366d-19d3-4935-bf25-956a86cebf06/kube-rbac-proxy/0.log" Apr 20 08:27:23.278647 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:23.278577 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-566cbb6c9c-bplzf_11ff8e8f-b0d7-41a4-8371-85195f48d57f/telemeter-client/0.log" Apr 20 08:27:23.299023 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:23.298999 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-566cbb6c9c-bplzf_11ff8e8f-b0d7-41a4-8371-85195f48d57f/reload/0.log" Apr 20 08:27:23.319676 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:23.319652 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-566cbb6c9c-bplzf_11ff8e8f-b0d7-41a4-8371-85195f48d57f/kube-rbac-proxy/0.log" Apr 20 08:27:25.229975 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:25.229937 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/1.log" Apr 20 08:27:25.234233 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:25.234213 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2chjv_29883fa5-e5e1-425a-85c2-3b3bd3ada0aa/console-operator/2.log" Apr 20 08:27:25.639883 
ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:25.639857 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c5fff674-2tc24_c0f25198-a444-44da-8838-ebd671c31732/console/0.log" Apr 20 08:27:25.667542 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:25.667510 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-556kx_f76edbc7-6592-484d-8941-01f12cf229e7/download-server/0.log" Apr 20 08:27:25.988687 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:25.988657 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m"] Apr 20 08:27:25.989080 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:25.989067 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5" containerName="cleanup" Apr 20 08:27:25.989136 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:25.989082 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5" containerName="cleanup" Apr 20 08:27:25.989136 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:25.989096 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5" containerName="cleanup" Apr 20 08:27:25.989136 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:25.989101 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5" containerName="cleanup" Apr 20 08:27:25.989136 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:25.989113 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5" containerName="cleanup" Apr 20 08:27:25.989136 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:25.989120 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5" containerName="cleanup" Apr 20 08:27:25.989341 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:25.989199 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5" containerName="cleanup" Apr 20 08:27:25.989341 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:25.989207 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5" containerName="cleanup" Apr 20 08:27:25.989341 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:25.989214 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="4e2b2f72-29a0-4c2d-a4cf-442c5cf1f5c5" containerName="cleanup" Apr 20 08:27:25.992506 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:25.992489 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" Apr 20 08:27:25.995443 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:25.995417 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mvtfk\"/\"kube-root-ca.crt\"" Apr 20 08:27:25.995443 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:25.995441 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-mvtfk\"/\"openshift-service-ca.crt\"" Apr 20 08:27:25.996992 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:25.996971 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-mvtfk\"/\"default-dockercfg-qxwb7\"" Apr 20 08:27:26.000905 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.000885 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m"] Apr 20 08:27:26.099390 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.099356 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c7438d5b-31ed-4cf1-9f47-7a787179fb6a-podres\") pod \"perf-node-gather-daemonset-cx57m\" (UID: \"c7438d5b-31ed-4cf1-9f47-7a787179fb6a\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" Apr 20 08:27:26.099572 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.099395 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7438d5b-31ed-4cf1-9f47-7a787179fb6a-sys\") pod \"perf-node-gather-daemonset-cx57m\" (UID: \"c7438d5b-31ed-4cf1-9f47-7a787179fb6a\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" Apr 20 08:27:26.099572 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.099455 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c7438d5b-31ed-4cf1-9f47-7a787179fb6a-proc\") pod \"perf-node-gather-daemonset-cx57m\" (UID: \"c7438d5b-31ed-4cf1-9f47-7a787179fb6a\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" Apr 20 08:27:26.099572 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.099472 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd5mx\" (UniqueName: \"kubernetes.io/projected/c7438d5b-31ed-4cf1-9f47-7a787179fb6a-kube-api-access-rd5mx\") pod \"perf-node-gather-daemonset-cx57m\" (UID: \"c7438d5b-31ed-4cf1-9f47-7a787179fb6a\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" Apr 20 08:27:26.099572 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.099533 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7438d5b-31ed-4cf1-9f47-7a787179fb6a-lib-modules\") pod \"perf-node-gather-daemonset-cx57m\" (UID: \"c7438d5b-31ed-4cf1-9f47-7a787179fb6a\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" Apr 20 08:27:26.189373 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.189341 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-5fnb5_2d9a4659-6387-4f04-8cbd-81120ca54057/volume-data-source-validator/0.log" Apr 20 08:27:26.201067 ip-10-0-133-161 kubenswrapper[2572]: I0420 
08:27:26.201044 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c7438d5b-31ed-4cf1-9f47-7a787179fb6a-podres\") pod \"perf-node-gather-daemonset-cx57m\" (UID: \"c7438d5b-31ed-4cf1-9f47-7a787179fb6a\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" Apr 20 08:27:26.201189 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.201075 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7438d5b-31ed-4cf1-9f47-7a787179fb6a-sys\") pod \"perf-node-gather-daemonset-cx57m\" (UID: \"c7438d5b-31ed-4cf1-9f47-7a787179fb6a\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" Apr 20 08:27:26.201189 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.201103 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c7438d5b-31ed-4cf1-9f47-7a787179fb6a-proc\") pod \"perf-node-gather-daemonset-cx57m\" (UID: \"c7438d5b-31ed-4cf1-9f47-7a787179fb6a\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" Apr 20 08:27:26.201189 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.201170 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c7438d5b-31ed-4cf1-9f47-7a787179fb6a-proc\") pod \"perf-node-gather-daemonset-cx57m\" (UID: \"c7438d5b-31ed-4cf1-9f47-7a787179fb6a\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" Apr 20 08:27:26.201334 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.201194 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c7438d5b-31ed-4cf1-9f47-7a787179fb6a-sys\") pod \"perf-node-gather-daemonset-cx57m\" (UID: \"c7438d5b-31ed-4cf1-9f47-7a787179fb6a\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" Apr 20 08:27:26.201334 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.201208 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c7438d5b-31ed-4cf1-9f47-7a787179fb6a-podres\") pod \"perf-node-gather-daemonset-cx57m\" (UID: \"c7438d5b-31ed-4cf1-9f47-7a787179fb6a\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" Apr 20 08:27:26.201334 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.201216 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rd5mx\" (UniqueName: \"kubernetes.io/projected/c7438d5b-31ed-4cf1-9f47-7a787179fb6a-kube-api-access-rd5mx\") pod \"perf-node-gather-daemonset-cx57m\" (UID: \"c7438d5b-31ed-4cf1-9f47-7a787179fb6a\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" Apr 20 08:27:26.201334 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.201254 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7438d5b-31ed-4cf1-9f47-7a787179fb6a-lib-modules\") pod \"perf-node-gather-daemonset-cx57m\" (UID: \"c7438d5b-31ed-4cf1-9f47-7a787179fb6a\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" Apr 20 08:27:26.201484 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.201404 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c7438d5b-31ed-4cf1-9f47-7a787179fb6a-lib-modules\") pod 
\"perf-node-gather-daemonset-cx57m\" (UID: \"c7438d5b-31ed-4cf1-9f47-7a787179fb6a\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" Apr 20 08:27:26.209312 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.209290 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd5mx\" (UniqueName: \"kubernetes.io/projected/c7438d5b-31ed-4cf1-9f47-7a787179fb6a-kube-api-access-rd5mx\") pod \"perf-node-gather-daemonset-cx57m\" (UID: \"c7438d5b-31ed-4cf1-9f47-7a787179fb6a\") " pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" Apr 20 08:27:26.303660 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.303570 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" Apr 20 08:27:26.427719 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.427691 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m"] Apr 20 08:27:26.430416 ip-10-0-133-161 kubenswrapper[2572]: W0420 08:27:26.430388 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc7438d5b_31ed_4cf1_9f47_7a787179fb6a.slice/crio-bcd39b085055ee011de3aa7ad4e8039976b456aa171a362a09d415049aaf9ee3 WatchSource:0}: Error finding container bcd39b085055ee011de3aa7ad4e8039976b456aa171a362a09d415049aaf9ee3: Status 404 returned error can't find the container with id bcd39b085055ee011de3aa7ad4e8039976b456aa171a362a09d415049aaf9ee3 Apr 20 08:27:26.432375 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.432358 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 08:27:26.901237 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.901202 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" event={"ID":"c7438d5b-31ed-4cf1-9f47-7a787179fb6a","Type":"ContainerStarted","Data":"e857e6399fbdc08e949b8d9a7764676898e40509b1367f5ba4a9746a97885589"} Apr 20 08:27:26.901237 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.901240 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" event={"ID":"c7438d5b-31ed-4cf1-9f47-7a787179fb6a","Type":"ContainerStarted","Data":"bcd39b085055ee011de3aa7ad4e8039976b456aa171a362a09d415049aaf9ee3"} Apr 20 08:27:26.901448 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.901375 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" Apr 20 08:27:26.918373 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:26.918325 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" podStartSLOduration=1.918310697 podStartE2EDuration="1.918310697s" podCreationTimestamp="2026-04-20 08:27:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 08:27:26.917790589 +0000 UTC m=+2230.831420634" watchObservedRunningTime="2026-04-20 08:27:26.918310697 +0000 UTC m=+2230.831940961" Apr 20 08:27:27.048735 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:27.048704 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7fps4_0993c493-f978-431e-9000-290ab9fb0bbe/dns/0.log" Apr 20 08:27:27.081443 ip-10-0-133-161 
kubenswrapper[2572]: I0420 08:27:27.081417 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7fps4_0993c493-f978-431e-9000-290ab9fb0bbe/kube-rbac-proxy/0.log" Apr 20 08:27:27.142050 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:27.142024 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-45gfj_38e938ac-334a-46a9-bd54-099927b87530/dns-node-resolver/0.log" Apr 20 08:27:27.645454 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:27.645422 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-64bbf84747-kdrcj_c4f5392a-e25d-4f52-b745-c32d100b1565/registry/0.log" Apr 20 08:27:27.677614 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:27.677586 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-fcfwc_4462eeb2-94f5-4c1c-bc8e-62fe9f10c78f/node-ca/0.log" Apr 20 08:27:28.454249 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:28.454215 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557fg825f_806e8845-f708-4670-8e6f-195d03dd6803/istio-proxy/0.log" Apr 20 08:27:28.642926 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:28.642892 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-cvmwg_4d018cfd-d796-4b7a-8057-0fab9fde2781/istio-proxy/0.log" Apr 20 08:27:28.665307 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:28.665271 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-656885888d-85974_520d7307-df9b-4a84-8836-d8b02ebe3ddb/router/0.log" Apr 20 08:27:29.120666 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:29.120639 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9bxrl_b659a68e-b039-4864-b691-ff12b7393ed7/serve-healthcheck-canary/0.log" Apr 20 08:27:29.568618 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:29.568588 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-rjwff_8c112e80-851a-4290-86a7-51a64594d25e/insights-operator/0.log" Apr 20 08:27:29.568789 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:29.568739 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-rjwff_8c112e80-851a-4290-86a7-51a64594d25e/insights-operator/1.log" Apr 20 08:27:29.587509 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:29.587478 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-92gbs_5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2/kube-rbac-proxy/0.log" Apr 20 08:27:29.605957 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:29.605931 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-92gbs_5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2/exporter/0.log" Apr 20 08:27:29.626007 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:29.625976 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-92gbs_5c7f4336-12d9-4746-8e7e-0aa8c48a9eb2/extractor/0.log" Apr 20 08:27:31.755925 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:31.755837 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-76b95bd89-t7mnh_028f7148-25ae-4040-9f41-1f0686ccc5e5/manager/0.log" Apr 20 08:27:31.848682 
ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:31.848655 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-687c889b9-kgmn5_0972f1b7-083d-4e02-bbba-f354c5c4e05f/manager/0.log" Apr 20 08:27:31.895735 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:31.895701 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-h7nm2_0a17bf98-d8da-42ea-a485-2468e701bd28/postgres/0.log" Apr 20 08:27:32.915930 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:32.915902 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-mvtfk/perf-node-gather-daemonset-cx57m" Apr 20 08:27:32.949072 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:32.949046 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-56d8f7c9b7-2vgfb_5a75ba04-f150-4f98-9d3e-b2f056ad7cad/manager/0.log" Apr 20 08:27:37.708011 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:37.707937 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-f6qxs_da90ef7d-a0bf-47dd-ab2b-79b951b2d24c/kube-storage-version-migrator-operator/1.log" Apr 20 08:27:37.708764 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:37.708748 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-f6qxs_da90ef7d-a0bf-47dd-ab2b-79b951b2d24c/kube-storage-version-migrator-operator/0.log" Apr 20 08:27:38.989046 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:38.989013 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lfk6z_893e108a-cd48-4c06-80c6-167a8ad53ac2/kube-multus-additional-cni-plugins/0.log" Apr 20 08:27:39.008038 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:39.008011 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lfk6z_893e108a-cd48-4c06-80c6-167a8ad53ac2/egress-router-binary-copy/0.log" Apr 20 08:27:39.027807 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:39.027782 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lfk6z_893e108a-cd48-4c06-80c6-167a8ad53ac2/cni-plugins/0.log" Apr 20 08:27:39.048161 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:39.048114 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lfk6z_893e108a-cd48-4c06-80c6-167a8ad53ac2/bond-cni-plugin/0.log" Apr 20 08:27:39.067587 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:39.067563 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lfk6z_893e108a-cd48-4c06-80c6-167a8ad53ac2/routeoverride-cni/0.log" Apr 20 08:27:39.086040 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:39.086016 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lfk6z_893e108a-cd48-4c06-80c6-167a8ad53ac2/whereabouts-cni-bincopy/0.log" Apr 20 08:27:39.107670 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:39.107645 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lfk6z_893e108a-cd48-4c06-80c6-167a8ad53ac2/whereabouts-cni/0.log" Apr 20 08:27:39.163727 ip-10-0-133-161 kubenswrapper[2572]: I0420 
08:27:39.163691 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xsjgg_edf19122-ee32-4e12-a720-45239728231d/kube-multus/0.log" Apr 20 08:27:39.215467 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:39.215435 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-brq5h_07219834-44d6-42ab-9058-aed46274d1a8/network-metrics-daemon/0.log" Apr 20 08:27:39.232894 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:39.232865 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-brq5h_07219834-44d6-42ab-9058-aed46274d1a8/kube-rbac-proxy/0.log" Apr 20 08:27:40.083745 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:40.083720 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-controller/0.log" Apr 20 08:27:40.098701 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:40.098671 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/0.log" Apr 20 08:27:40.108096 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:40.108071 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovn-acl-logging/1.log" Apr 20 08:27:40.124519 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:40.124488 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/kube-rbac-proxy-node/0.log" Apr 20 08:27:40.147937 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:40.147908 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 08:27:40.165496 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:40.165471 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/northd/0.log" Apr 20 08:27:40.183518 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:40.183496 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/nbdb/0.log" Apr 20 08:27:40.206320 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:40.206291 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/sbdb/0.log" Apr 20 08:27:40.301620 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:40.301585 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-55zsn_f989907c-6b39-4f73-8cb2-9fb3915c446d/ovnkube-controller/0.log" Apr 20 08:27:41.851133 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:41.851098 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qr7pr_3e84a5ff-c8e6-4c91-95b6-66697b65f3e6/network-check-target-container/0.log" Apr 20 08:27:42.911465 ip-10-0-133-161 kubenswrapper[2572]: I0420 08:27:42.911434 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-ng5xl_2fdddb28-e042-4361-94c8-ed537e5237f2/iptables-alerter/0.log"